The need to reach multiple platforms and consumer electronics devices has been a constant challenge for content creators, one that brings significant cost and complication. Enter MPEG-DASH.
The Moving Picture Experts Group (MPEG) has developed several widely used multimedia standards, including MPEG-2, MPEG-4, MPEG-7, and MPEG-21. Its latest standard, Dynamic Adaptive Streaming over HTTP (MPEG-DASH), is an attempt to solve the complexities of media delivery to multiple devices with a single common standard.
The first major trial of MPEG-DASH occurred during the 2012 London Olympics, when the Belgian broadcaster Vlaamse Radio- en Televisieomroeporganisatie (VRT) offered its audience the chance to watch the Olympic Games broadcast on their personal devices via MPEG-DASH, demonstrating the benefits of the standard for adaptive streaming. In addition, many of the major players in the video space now offer support for the new standard.
Encoding.com wanted to provide an overview of this new standard: the history of MPEG-DASH, how it works, its features, and its benefits.
As we know, adaptive bitrate streaming has become the standard for delivering video content online to multiple devices. This type of delivery is a combination of server and client software that detects a client’s bandwidth capacity and adjusts the quality of the video stream between multiple bitrates and/or resolutions. The adaptive bitrate experience is superior to delivering a static video file at a single bitrate, because the stream can be switched midstream to match the client’s available network speed, rather than buffering or interrupting playback when the network can’t support the current quality. Because it uses the standard HTTP port, it passes through firewalls without special proxies or caches and is cost-efficient, which has increased its popularity and use. There are three main protocols for this type of delivery: Apple’s HTTP Live Streaming, Microsoft Smooth Streaming, and Adobe’s HTTP Dynamic Streaming. Each protocol uses different methods and formats, so to receive content from each server, a device must support each protocol. A standard for HTTP streaming of multimedia content would allow a standards-based client to stream content from any standards-based server, enabling consistent playback and unifying servers and clients from different vendors.
In response to the scattered landscape, MPEG issued a Call for Proposals for an HTTP streaming standard in April 2009. In the two years that followed, MPEG developed the specification with participation from many experts and in collaboration with other standards groups, such as the Third Generation Partnership Project (3GPP). More than 50 companies were involved, Microsoft, Netflix, and Adobe included, and the effort was coordinated with other industry organizations such as studio-backed digital locker initiator Digital Entertainment Content Ecosystem, LLC (DECE), the Open IPTV Forum (OIPF), and the World Wide Web Consortium (W3C). This effort resulted in the MPEG-DASH standard, published as ISO/IEC 23009-1 in April 2012.
The image below illustrates a simple streaming scenario between an HTTP server and a DASH client. In this figure, the multimedia content is captured and stored on an HTTP server and is delivered using HTTP. The content exists on the server in two parts: Media Presentation Description (MPD), which describes a manifest of the available content, its various alternatives, their URL addresses, and other characteristics; and segments, which contain the actual multimedia bitstreams in the form of chunks, in single or multiple files.
To play the content, the DASH client first obtains the MPD. The MPD can be delivered using HTTP, email, thumb drive, broadcast, or other transports. By parsing the MPD, the DASH client learns about the program timing, media-content availability, media types, resolutions, minimum and maximum bandwidths, and the existence of various encoded alternatives of multimedia components, accessibility features and required digital rights management (DRM), media-component locations on the network, and other content characteristics. Using this information, the DASH client selects the appropriate encoded alternative and starts streaming the content by fetching the segments using HTTP GET requests.
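The first client step described above, obtaining and parsing the MPD to discover the available encoded alternatives, can be sketched in Python. The MPD below is a minimal hand-written example: the element names follow the DASH schema, but the representation IDs, bitrates, and resolutions are illustrative assumptions, not part of any real manifest.

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written MPD: one Period, one AdaptationSet,
# three encoded alternatives at different bitrates/resolutions.
MPD = """<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static">
  <Period>
    <AdaptationSet mimeType="video/mp4">
      <Representation id="low"  bandwidth="500000"  width="640"  height="360"/>
      <Representation id="mid"  bandwidth="1500000" width="1280" height="720"/>
      <Representation id="high" bandwidth="4000000" width="1920" height="1080"/>
    </AdaptationSet>
  </Period>
</MPD>"""

NS = {"dash": "urn:mpeg:dash:schema:mpd:2011"}

def list_representations(mpd_xml):
    """Parse an MPD and return the encoded alternatives, lowest bitrate first."""
    root = ET.fromstring(mpd_xml)
    reps = [{"id": rep.get("id"),
             "bandwidth": int(rep.get("bandwidth")),
             "resolution": (int(rep.get("width")), int(rep.get("height")))}
            for rep in root.findall(".//dash:Representation", NS)]
    return sorted(reps, key=lambda r: r["bandwidth"])

reps = list_representations(MPD)
print([r["id"] for r in reps])  # ['low', 'mid', 'high']
```

A real client would fetch the MPD over HTTP, resolve segment URLs from `SegmentTemplate` or `SegmentList` elements, and issue HTTP GET requests for the chunks; those details are omitted here.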
After appropriate buffering to allow for network throughput variations, the client continues fetching the subsequent segments and also monitors the network bandwidth fluctuations. Depending on its measurements, the client decides how to adapt to the available bandwidth by fetching segments of different alternatives (with lower or higher bitrates) to maintain an adequate buffer.
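The adaptation decision described above is deliberately left to the client by the spec. A toy heuristic of this kind might pick the highest-bitrate alternative the measured throughput can sustain, falling back to the lowest when the buffer runs low; the safety margin and buffer threshold below are illustrative assumptions, not values from the standard.

```python
def choose_bitrate(available_bitrates, measured_throughput, buffer_seconds,
                   safety=0.8, min_buffer=5.0):
    """Return the bitrate (bps) to request for the next segment."""
    rates = sorted(available_bitrates)
    if buffer_seconds < min_buffer:        # buffer nearly empty: play it safe
        return rates[0]
    budget = measured_throughput * safety  # leave headroom for fluctuations
    affordable = [r for r in rates if r <= budget]
    return affordable[-1] if affordable else rates[0]

rates = [500_000, 1_500_000, 4_000_000]
# Healthy buffer, 2 Mbps measured: step up to the 1.5 Mbps alternative.
print(choose_bitrate(rates, measured_throughput=2_000_000, buffer_seconds=20))
# Same throughput but the buffer is draining: drop to the lowest bitrate.
print(choose_bitrate(rates, measured_throughput=2_000_000, buffer_seconds=2))
```

Production players use considerably more sophisticated logic (throughput smoothing, buffer-occupancy models), but the contract is the same: the client alone decides which alternative to fetch next.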
The MPEG-DASH specification only defines the MPD and the segment formats. The delivery of the MPD and the media encoding formats containing the segments, as well as the client behavior for fetching, adaptation heuristics, and playing content, are outside of MPEG-DASH’s scope.
There are several key benefits to the adoption of this new standard. Because several major media companies took part in its development, the new protocol should eliminate technical issues in delivery and compression. In essence, it aims to combine existing technologies and standards into one, making streaming support seamless on all devices, and in turn to reduce technical headaches and transcoding costs. Content publishers can generate a single set of files for encoding and streaming that should be compatible with as many devices as possible, from mobile to OTT to the desktop via plug-ins or HTML5. Consumers will not have to worry about whether their devices can play the content they want to watch.
As mentioned previously, the standard has several major backers, including Adobe, Samsung, Microsoft, Dolby, Netflix, and Cisco. Google also supports it for higher-resolution video on YouTube and in Chrome. The OTT space has been the major driver of adoption.
For DRM, EME (Encrypted Media Extensions) uses a Key System instead of what is traditionally referred to as a DRM scheme. The Key System is the component of the Content Decryption Module (CDM) that handles encryption and decryption of content. In the case of the web browser, each browser ships its own Key System behind the same EME interface, which preserves cross-browser compatibility.
What makes EME unique is that it is not tied to any specific DRM scheme; the CDM can accommodate whatever scheme the content distributor prefers. This encourages both cross-platform compatibility and an improved playback experience. Whether it is decrypting and serving, encrypting and decrypting, or bypassing the CPU by using the GPU, the more the CDM handles, the more secure the content remains in the end-to-end workflow. Essentially, the more the CDM manages the content-protection process, the less the user can interfere and potentially violate content copyrights. In some cases, copyright holders will place stricter controls on HD than on SD content. The CDM is compatible with the following container formats: MP4/H.264 and WebM/VP8. Encryption can happen both inside and outside the container, but generally happens inside it. Because the CDM is abstracted, these container/codec combinations can work with any number of encryption schemes.
Below is an explanation of the EME workflow laid out in the diagram above.
With the obvious benefits of a unified standard for delivering video to multiple devices using adaptive bitrate protocols, Encoding.com now offers support for MPEG-DASH in both the UI and the API. Refer to the information below on how they are used with each integration method. For our sample XML, see below:
<!-- Format fields -->
<!-- Destination fields -->
<destination>[DestFile]</destination> <!-- REQUIRED -->