FFmpeg HLS & DASH Streaming: Pick the Right Packager
Compare FFmpeg, Bento4, and Shaka Packager for HLS/DASH adaptive bitrate streaming. A decision framework with code examples to ship the right pipeline fast.
HLS and DASH streaming packaging is the process of splitting encoded video into segmented files with manifests (.m3u8 for HLS, .mpd for DASH) that adaptive bitrate players use to switch quality based on viewer bandwidth. FFmpeg handles both encoding and basic packaging, while dedicated tools like Bento4 and Shaka Packager offer production-grade manifest generation, CMAF dual-output, and DRM encryption that FFmpeg cannot match natively.
Key Takeaways
- FFmpeg = transcoder + basic packager. Bento4/Shaka = dedicated packagers (no transcoding)
- Use FFmpeg alone for simple HLS, internal tools, or live streaming
- Use Shaka Packager for DRM (Widevine/PlayReady) or CMAF (HLS+DASH from one segment set)
- Use Bento4 for fastest packaging speed with auto-metadata injection
- Skip infrastructure entirely with an FFmpeg API like ffpipe
You need adaptive bitrate streaming. You’ve got FFmpeg installed. Now every tutorial tells you something different — some use FFmpeg alone, some chain it with Bento4, others bring in Shaka Packager. The Reddit threads are full of conflicting advice, and you’re stuck comparing tools instead of shipping video.
Here’s the problem: FFmpeg is a transcoder that also does basic packaging. Bento4 and Shaka Packager are dedicated packagers that don’t transcode. Mixing up these roles is where most developers lose time. This guide gives you a concrete decision framework for building your DASH streaming and adaptive bitrate streaming pipeline — with working commands you can run today.
The Encode vs. Package Distinction
Every video transcoding pipeline has two stages:
- Encode — Convert source video into multiple renditions (1080p, 720p, 480p, 360p) with aligned keyframes.
- Package — Take those renditions and produce manifests (.m3u8 for HLS, .mpd for DASH) plus segmented files that players consume.
FFmpeg can do both. Dedicated packagers (Bento4, Shaka) only do step 2. The question isn’t “which tool?” — it’s “do I need the separation?”
When one tool is enough: Simple HLS output, no DRM, no CMAF, limited renditions.
When you need two: Multi-DRM, simultaneous HLS+DASH from one segment set, production-grade manifests with auto-detected codec metadata.
FFmpeg for HLS and DASH — Strengths and Limits
FFmpeg handles HLS packaging natively. Here’s a multi-bitrate HLS encode in a single command:
ffmpeg -i input.mp4 \
-filter_complex "[0:v]split=3[v1][v2][v3]; \
[v1]scale=1920:1080[v1out]; \
[v2]scale=1280:720[v2out]; \
[v3]scale=854:480[v3out]" \
-map "[v1out]" -c:v libx264 -b:v 5000k -maxrate 5350k -bufsize 7500k \
-g 48 -keyint_min 48 -sc_threshold 0 \
-map "[v2out]" -c:v libx264 -b:v 2800k -maxrate 2996k -bufsize 4200k \
-g 48 -keyint_min 48 -sc_threshold 0 \
-map "[v3out]" -c:v libx264 -b:v 1400k -maxrate 1498k -bufsize 2100k \
-g 48 -keyint_min 48 -sc_threshold 0 \
-map 0:a -map 0:a -map 0:a -c:a aac -b:a 128k -ac 2 \
-f hls \
-hls_time 4 \
-hls_playlist_type vod \
-hls_flags independent_segments \
-master_pl_name master.m3u8 \
-var_stream_map "v:0,a:0 v:1,a:1 v:2,a:2" \
stream_%v/playlist.m3u8
Critical detail: -g 48 -keyint_min 48 -sc_threshold 0 forces a keyframe every 48 frames and disables scene-cut keyframes. At 24 fps that's a 2-second GOP, which divides evenly into the 4-second segments set by -hls_time 4, so every segment boundary lands on a keyframe. Note that the audio stream is mapped once per variant so each entry in -var_stream_map pairs a video rendition with its own audio output stream. Without aligned keyframes across renditions, players can't switch quality at segment boundaries, and adaptive streaming breaks silently.
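The alignment rule is simple arithmetic: a segment boundary can only land on a keyframe when the segment length in frames is an exact multiple of the GOP length. A quick sanity check (a sketch assuming integer frame rates; fractional rates like 29.97 need finer math):

```shell
# Succeeds (exit 0) when SEGMENT_SECONDS * FPS is a multiple of GOP_FRAMES,
# i.e. every segment boundary coincides with a keyframe.
gop_aligned() {  # usage: gop_aligned GOP_FRAMES FPS SEGMENT_SECONDS
  [ $(( $3 * $2 % $1 )) -eq 0 ]
}

gop_aligned 48 24 4 && echo "48-frame GOP at 24fps fits 4s segments"
gop_aligned 48 30 4 || echo "48-frame GOP at 30fps (1.6s) misses 4s boundaries"
```

Run this before committing to a -g value: at 30 fps you would want -g 60 for the same 2-second GOP.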
FFmpeg also outputs DASH directly:
ffmpeg -i input.mp4 \
-map 0:v -c:v libx264 -b:v 2800k -g 48 -keyint_min 48 -sc_threshold 0 \
-map 0:a -c:a aac -b:a 128k \
-f dash \
-seg_duration 4 \
-use_timeline 1 \
-use_template 1 \
output.mpd
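With -use_template 1 -use_timeline 1, FFmpeg describes segments with a SegmentTemplate plus SegmentTimeline instead of listing every segment URL. A heavily trimmed sketch of the kind of MPD this produces (element names come from the DASH spec; exact attributes, codec strings, and segment naming depend on your encode and FFmpeg version):

```xml
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static" ...>
  <Period>
    <AdaptationSet contentType="video" ...>
      <Representation id="0" bandwidth="2800000" codecs="avc1.640028" ...>
        <SegmentTemplate initialization="init-stream$RepresentationID$.m4s"
                         media="chunk-stream$RepresentationID$-$Number%05d$.m4s" ...>
          <SegmentTimeline>...</SegmentTimeline>
        </SegmentTemplate>
      </Representation>
    </AdaptationSet>
    <AdaptationSet contentType="audio" ...>...</AdaptationSet>
  </Period>
</MPD>
```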
Where FFmpeg falls short:
- No automatic bandwidth/codec metadata injection into manifests — you hardcode or script it yourself.
- No native multi-DRM support (Widevine, PlayReady).
- No simultaneous HLS + DASH output from the same fMP4 segments (CMAF).
- Manifest generation is basic compared to dedicated packagers.
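The first limitation is scriptable: since you chose the renditions' bitrates and resolutions at encode time, you can emit the master playlist yourself. A minimal sketch — the BANDWIDTH values are illustrative peak estimates (video maxrate plus audio plus container overhead), and the CODECS strings assume H.264 High profile with AAC-LC:

```shell
# Hand-written master playlist for a three-rendition ladder.
# BANDWIDTH must be the peak rate of the whole variant (video + audio),
# so these values are estimates, not measurements.
cat > master.m3u8 <<'EOF'
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=5535000,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2"
stream_0/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3180000,RESOLUTION=1280x720,CODECS="avc1.64001f,mp4a.40.2"
stream_1/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1680000,RESOLUTION=854x480,CODECS="avc1.64001e,mp4a.40.2"
stream_2/playlist.m3u8
EOF
```

This works, but it's exactly the manual bookkeeping that dedicated packagers automate.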
For a side project or internal tool, FFmpeg alone works fine. For production VOD with device diversity, you’ll hit these walls.
Bento4 vs Shaka Packager — Dedicated Packaging Tools
Both consume pre-encoded MP4 renditions from FFmpeg. The workflow looks like this:
Source → FFmpeg (encode renditions) → MP4 files → Packager (manifests + segments)
Bento4
Bento4’s mp4dash and mp4hls are C++ tools optimized for speed. Key advantages:
- Faster than FFmpeg or Shaka at producing HLS/DASH assets from pre-encoded files.
- Auto-gathers metadata (codecs, bandwidth, resolution) during segmentation and injects it into manifests — no manual BANDWIDTH tagging.
- Full DASH option set including multi-period, subtitles, and trick play.
# Encode renditions with FFmpeg first
ffmpeg -i input.mp4 -c:v libx264 -b:v 5000k -g 48 -keyint_min 48 \
-sc_threshold 0 -an output_1080p.mp4
ffmpeg -i input.mp4 -c:v libx264 -b:v 2800k -g 48 -keyint_min 48 \
-sc_threshold 0 -vf scale=1280:720 -an output_720p.mp4
# mp4dash requires fragmented MP4 input, so fragment each rendition
mp4fragment output_1080p.mp4 output_1080p_frag.mp4
mp4fragment output_720p.mp4 output_720p_frag.mp4
# Package with Bento4 (DASH + HLS output)
mp4dash --hls output_1080p_frag.mp4 output_720p_frag.mp4 -o output/
Shaka Packager
Shaka Packager (by Google) is the go-to when you need DRM or CMAF:
- Simultaneous HLS + DASH from one set of CMAF fMP4 segments.
- Built-in Widevine and PlayReady encryption — no additional tooling.
- Active development and Google’s production backing.
packager \
in=output_1080p.mp4,stream=video,output=h264_1080p.mp4,playlist_name=h264_1080p.m3u8 \
in=output_720p.mp4,stream=video,output=h264_720p.mp4,playlist_name=h264_720p.m3u8 \
in=output_1080p.mp4,stream=audio,output=audio.mp4,playlist_name=audio.m3u8,hls_group_id=audio \
--hls_master_playlist_output master.m3u8 \
--mpd_output manifest.mpd
That single command produces both an HLS master playlist and a DASH MPD referencing the same segments.
Decision Matrix
| Requirement | FFmpeg Only | Bento4 | Shaka Packager |
|---|---|---|---|
| Basic HLS output | ✅ | ✅ | ✅ |
| Basic DASH output | ✅ | ✅ | ✅ |
| CMAF (shared segments) | ❌ | Partial | ✅ |
| Widevine/PlayReady DRM | ❌ | ❌ | ✅ |
| Auto manifest metadata | ❌ | ✅ | ✅ |
| Packaging speed | Slower | Fastest | Fast |
| Live ingest | ✅ | ❌ | Limited |
CMAF Packaging: Serve HLS and DASH from One Asset
CMAF (Common Media Application Format) uses fMP4 segments that both HLS and DASH can reference. Without CMAF, you maintain two segment trees: .ts files for HLS and .m4s files for DASH. That doubles your storage and CDN costs.
With Shaka Packager’s CMAF output, one set of .mp4 segments gets referenced by both master.m3u8 and manifest.mpd. For a platform serving both Apple devices (HLS) and Android/web (DASH), this cuts segment storage roughly in half.
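The storage claim is easy to estimate. For the three-rendition ladder used earlier (5000k + 2800k + 1400k video plus 128k audio, roughly 9.3 Mbps combined, overhead ignored), one hour of content costs approximately:

```shell
# Back-of-envelope segment storage for one hour of content.
total_kbps=9328   # 5000 + 2800 + 1400 video + 128 audio, assumed ladder
seconds=3600
one_tree_mb=$(( total_kbps * seconds / 8 / 1000 ))
echo "CMAF, one segment tree:      ${one_tree_mb} MB"
echo "HLS .ts + DASH .m4s trees:   $(( one_tree_mb * 2 )) MB"
```

Multiply by the size of your catalog and the dual-tree penalty shows up directly on your storage and CDN bill.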
FFmpeg can’t produce CMAF dual-manifest output natively. Bento4 has partial support. Shaka Packager handles it cleanly.
Building Your Pipeline — Decision Framework by Use Case
VOD platform, multi-CDN delivery: FFmpeg encodes → Shaka Packager for CMAF HLS+DASH → CDN. Minimizes storage, maximizes device reach.
DRM-protected premium content: Shaka Packager is the only option here without bolting on additional encryption tools. Widevine and PlayReady integration is built in.
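As a taste of the encryption workflow, here is a sketch using Shaka Packager's raw-key CENC mode — the simplest path before wiring up a Widevine or PlayReady key server. The key_id and key values are 16-byte hex placeholders; in production they come from your DRM provider:

```shell
# Sketch: raw-key CENC encryption with Shaka Packager (placeholder keys).
packager \
  in=output_1080p.mp4,stream=video,output=enc_1080p.mp4 \
  in=output_1080p.mp4,stream=audio,output=enc_audio.mp4 \
  --enable_raw_key_encryption \
  --keys label=:key_id=11223344556677889900aabbccddeeff:key=ffeeddccbbaa00998877665544332211 \
  --protection_scheme cenc \
  --mpd_output manifest.mpd
```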
Live streaming / LL-HLS:
FFmpeg handles ingest and real-time encoding with -hls_time 2 -hls_flags delete_segments+append_list. Dedicated packagers aren’t designed for live ingest — FFmpeg owns this use case.
Startup shipping a video feature fast: Skip the infrastructure entirely. An FFmpeg API cloud service like ffpipe lets you trigger HLS packaging with an HTTP POST:
curl -X POST https://api.ffpipe.net/v1/convert \
-H "Authorization: Bearer <YOUR_API_KEY>" \
-H "Content-Type: application/json" \
-d '{
"input": "https://storage.example.com/input.mp4",
"preset": "convert-to-hls",
"output": "s3://your-bucket/output/"
}'
No FFmpeg binary management. No server provisioning. No pipeline orchestration code. You get HLS output in your storage bucket, ready for CDN delivery.
Batch UGC processing at scale: Wire ffpipe’s API into a queue (n8n, Make, or a custom worker). Each user upload triggers an API call. Pay-per-use pricing replaces fixed transcoding server costs — you’re not paying for idle GPU capacity at 3 AM.
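A worker around that API is only a few lines: build one request body per upload and POST it to the convert endpoint shown earlier. A sketch — the upload URLs, bucket paths, and FFPIPE_API_KEY variable are placeholders:

```shell
# Hypothetical batch wrapper around the ffpipe call shown above.
# build_job emits the JSON body; the commented curl sends it in production.
build_job() {  # usage: build_job INPUT_URL OUTPUT_PREFIX
  printf '{"input": "%s", "preset": "convert-to-hls", "output": "%s"}' "$1" "$2"
}

for src in https://storage.example.com/uploads/a.mp4 \
           https://storage.example.com/uploads/b.mp4; do
  name=$(basename "$src" .mp4)
  body=$(build_job "$src" "s3://your-bucket/$name/")
  echo "$body"
  # curl -X POST https://api.ffpipe.net/v1/convert \
  #   -H "Authorization: Bearer $FFPIPE_API_KEY" \
  #   -H "Content-Type: application/json" -d "$body"
done
```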
Next Steps
- If you need control: Set up the two-stage pipeline. Encode with FFmpeg, package with Shaka (DRM/CMAF) or Bento4 (speed/simplicity).
- If you need speed to production: Hit the ffpipe API with a convert-to-hls preset and ship adaptive streaming today.
- If you’re doing live: Stay with FFmpeg end-to-end — it’s the right tool for real-time ingest and segmentation.
Stop comparing tools in browser tabs. Pick the architecture that matches your constraints — DRM needs, storage budget, team size, time to ship — and build.
Frequently Asked Questions
What’s the difference between HLS and DASH?
HLS (HTTP Live Streaming) is Apple’s adaptive bitrate protocol using .m3u8 playlists and .ts or fMP4 segments. DASH (Dynamic Adaptive Streaming over HTTP) is an open ISO standard using .mpd manifests and .m4s segments. Both achieve the same goal — adaptive quality switching — but HLS is required for Apple devices while DASH is the open standard for Android and web.
Can FFmpeg generate both HLS and DASH from one command?
No. FFmpeg can output HLS or DASH in a single command, but not both simultaneously from shared segments. For CMAF (one segment set serving both protocols), you need Shaka Packager or partial Bento4 support.
When should I use a dedicated packager instead of FFmpeg alone?
Use a dedicated packager when you need DRM encryption (Widevine/PlayReady), CMAF dual-output (HLS+DASH from shared segments), or production-grade manifests with automatic bandwidth and codec metadata injection. For simple HLS output without DRM, FFmpeg alone is sufficient.
What is CMAF and why does it matter?
CMAF (Common Media Application Format) uses fragmented MP4 segments that both HLS and DASH manifests can reference. Without CMAF, you maintain separate .ts files (HLS) and .m4s files (DASH), roughly doubling your storage and CDN costs.
Can I skip all of this with an API?
Yes. Cloud FFmpeg APIs like ffpipe accept a convert-to-hls preset via HTTP POST and return segmented HLS output in your storage bucket. No FFmpeg binary, no packager installation, no pipeline orchestration code.
Glossary
- Adaptive bitrate streaming (ABR): Delivering video in multiple quality renditions so the player can switch between them based on the viewer’s available bandwidth.
- HLS (HTTP Live Streaming): Apple’s streaming protocol using .m3u8 playlist files and segmented media files.
- DASH (Dynamic Adaptive Streaming over HTTP): ISO standard streaming protocol using .mpd manifests and .m4s segmented media.
- CMAF (Common Media Application Format): A standard that uses fMP4 segments compatible with both HLS and DASH, eliminating the need for separate segment trees.
- GOP (Group of Pictures): A sequence of video frames between keyframes. Aligned GOPs across renditions are required for smooth ABR quality switching.
- Manifest: A playlist file (.m3u8 or .mpd) that tells the video player where to find each quality level’s segments.
- DRM (Digital Rights Management): Encryption systems (Widevine, PlayReady, FairPlay) that protect video content from unauthorized copying.