What is the difference between fmp4 and mp4 in streaming? Which is more beneficial compared to .ts?
Are both supported by DASH?
Happy to help explain, but to understand why fmp4 is a big deal, you’ll need to understand CMAF first. What if competing technology providers could agree on a standard streaming format across all playback platforms? Enter the Common Media Application Format, or CMAF.
The idea was that CMAF would simplify streaming with one common container that works across all platforms and devices, rather than having to stream DASH to some of your audience and HLS to the other half of your viewers. Now you can just use one container for your streams and that's it.
Now the next thing to consider: CMAF represents a coordinated industry-wide effort to lower latency with chunked encoding. This process involves breaking the video into smaller chunks or fragments of a set duration, which can then be immediately published upon encoding.
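To make the latency benefit concrete, here is a minimal sketch of the arithmetic. The segment and chunk durations are illustrative assumptions, not values mandated by CMAF; the point is only that each chunk becomes publishable as soon as it is encoded, instead of waiting for the whole segment.

```python
def publish_times(segment_duration: float, chunk_duration: float) -> list[float]:
    """Wall-clock times (in seconds) at which each chunk of one segment
    can be published, assuming real-time encoding and chunked transfer."""
    n_chunks = int(segment_duration / chunk_duration)
    return [round((i + 1) * chunk_duration, 3) for i in range(n_chunks)]

# Assumed example values: a 6-second segment split into 1-second chunks.
chunks = publish_times(segment_duration=6.0, chunk_duration=1.0)

print(chunks)      # first chunk is publishable after 1 second
print(chunks[-1])  # whole-segment delivery would wait the full 6 seconds
```

With whole-segment delivery the first playable bytes are available only at the 6-second mark; with chunked encoding they are available after the first chunk, which is where the latency reduction comes from.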
That is where fmp4 comes in. It stands for fragmented MP4, and while in the past it was only supported by DASH, Apple announced in July 2016 that it would support fmp4 in HLS as well.
Microsoft and Apple have now agreed to reach audiences across the HLS and DASH protocols using a standardized transport container: fragmented MP4, more commonly known as fmp4. In theory, this makes it easier to reach a larger audience with a single container.
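The structural difference is visible in the file's top-level boxes: a regular MP4 keeps all its sample data under one `moov`/`mdat` pair, while an fmp4 carries media in repeating `moof`+`mdat` fragments. Here is a minimal sketch that scans box types to tell them apart; the `box` helper and the synthetic byte strings are illustrative assumptions (not real playable files), and the parser skips the 64-bit and to-end-of-file size cases for brevity.

```python
import struct

def top_level_boxes(data: bytes) -> list[str]:
    """Scan the top-level MP4 box types in a byte buffer.
    Each box starts with a 4-byte big-endian size and a 4-byte type."""
    boxes, offset = [], 0
    while offset + 8 <= len(data):
        size, = struct.unpack_from(">I", data, offset)
        boxes.append(data[offset + 4:offset + 8].decode("ascii"))
        if size < 8:  # size 0 / 1 (extends-to-EOF / 64-bit) not handled here
            break
        offset += size
    return boxes

def is_fragmented(data: bytes) -> bool:
    """Fragmented MP4s carry samples in 'moof'+'mdat' pairs;
    a regular MP4 has one 'moov' and no 'moof' boxes."""
    return "moof" in top_level_boxes(data)

def box(box_type: bytes, payload: bytes = b"") -> bytes:
    """Build a minimal size+type box for demonstration purposes."""
    return struct.pack(">I", 8 + len(payload)) + box_type + payload

# Synthetic layouts (assumed for illustration only):
regular = box(b"ftyp") + box(b"moov") + box(b"mdat", b"\x00" * 16)
fragmented = box(b"ftyp") + box(b"moov") + box(b"moof") + box(b"mdat", b"\x00" * 4)

print(is_fragmented(regular))     # False
print(is_fragmented(fragmented))  # True
```

The same scan works on the first few kilobytes of a real file, which is enough to see the `moof` boxes that mark a fragmented stream.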
Keep in mind also that fmp4 is required for HTML5 players that use Media Source Extensions (MSE): MSE needs fragmented MP4 for playback in the browser.
You can read some of our blogs on the subject, but basically fmp4 delivers media in smaller fragments and is the container used in CMAF streaming. It's what enables lower-latency streaming.
This article explains the difference between fmp4 and normal mp4: