Until fairly recently, Adobe’s Flash video technology had been the main method of delivering video via the internet. Today, however, there’s a major shift taking place in the world of online video. Over the past decade, Adobe’s Flash protocol has been replaced increasingly by video delivered using protocols like HLS streaming and played in HTML5 video players.
For broadcasters and viewers alike, this is a largely positive change. HTML5 and HLS are open specifications, which means that anyone can access them free of cost and adapt them to their own needs. These newer HTML5 and HLS streaming technologies are also safer, more reliable, and faster than earlier technologies.
For content producers, there are also some major advantages to using these new live streaming technologies. However, there are downsides as well. In particular, replacing legacy systems and technologies involves real work, and the new standards may not behave identically across all platforms. As with all technological innovations, growing pains are inevitable.
To get you up to speed on these changes, we’ve geared this article at both longtime broadcasters and newcomers to streaming media, all with a focus on HLS streaming. Our goal here is to make this content relevant for all kinds of streamers. Whether you live stream sports events or want to stream live video on your website, we hope you’ll find it useful! We’ll cover basic streaming protocol definitions, discuss other streaming protocols, and, of course, answer the question posed in the title of this article: what is HLS streaming, and when should you use it?
What is HLS?
HLS stands for HTTP Live Streaming. Put succinctly, HLS is a media streaming protocol for delivering visual and audio media to viewers over the internet.
The HLS streaming protocol chops up MP4 video content into short, roughly 10-second chunks, which are then delivered to viewers over standard HTTP. This approach makes HLS compatible with a wide range of devices and firewalls. Latency (or lag time) for spec-compliant HLS live streams tends to fall in the 15-30 second range, which is an important factor to keep in mind.
When it comes to quality, HLS streaming stands out from the pack. On the server side, content creators often have the option to encode the same live stream at multiple quality settings. In turn, players can dynamically request the best option available, given their specific bandwidth at any given moment. From chunk to chunk, the data quality can differ.
For example, in one moment you might be sending full high-definition video. Moments later, a mobile user may encounter a “dead zone” in which their quality of service declines. The player can detect this drop in bandwidth and begin requesting lower-quality video chunks instead. The point of all this? HLS streaming reduces buffering, stuttering, and other problems.
HLS streaming format history
Apple originally launched the HLS streaming protocol in the summer of 2009, timing the release to coincide with the debut of the iPhone 3GS. Previous iPhone models had experienced many problems with streaming media online, partially because these devices often switched between Wi-Fi and mobile networks mid-stream.
Prior to the release of HLS, Apple used the QuickTime Streaming Server as its media streaming standard. Though it was a robust service, QuickTime used non-standard ports for data transfer, and firewalls often blocked its RTSP protocol. Combined with slow average internet speeds, these limitations doomed QuickTime Streaming Server. As a result, this early experiment in live streaming technology never reached a wide audience. That said, HTTP Live Streaming ultimately drew on the lessons learned from creating and rolling out the QuickTime service.
HLS streams are generated on the fly, and an HTTP server stores those streams. As we’ve mentioned above, the protocol splits video files into short segments with the .ts file extension (standing for MPEG-2 Transport Stream).
The HTTP server also creates a .m3u8 playlist file (i.e., a manifest file) that serves as an index for the video chunks. This master playlist points to additional index files for each of the available quality options. Even when you choose to broadcast using only a single quality option, this file will still exist.
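To make this concrete, here is a minimal sketch of what a master playlist with three quality options might look like. The bandwidth figures and file names below are hypothetical, but the tags follow the HLS specification:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
720p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p/index.m3u8
```

Each #EXT-X-STREAM-INF entry advertises one quality option, and the line after it points to that option’s own index file.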
A given user’s video player software can detect deteriorating or improving network conditions. If either occurs, the player reads the master playlist, determines which quality level is ideal, and then reads that quality’s index file to find the chunk of video corresponding to the viewer’s current position. Best of all, the entire process is seamless for the user.
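As a rough sketch of the selection logic just described, the following JavaScript parses a master playlist and picks the highest-bandwidth variant that a measured connection can sustain. The playlist tags follow the HLS spec, but the selection heuristic and file names are purely illustrative; real players use more sophisticated bandwidth estimation.

```javascript
// Sketch: choosing a rendition from an HLS master playlist given a
// bandwidth estimate. Illustrative only, not any player's actual logic.
function pickVariant(masterPlaylist, measuredBitsPerSecond) {
  const lines = masterPlaylist.trim().split("\n");
  const variants = [];
  for (let i = 0; i < lines.length; i++) {
    const match = lines[i].match(/^#EXT-X-STREAM-INF:.*BANDWIDTH=(\d+)/);
    if (match) {
      // The URI of the variant's index file is on the following line.
      variants.push({ bandwidth: Number(match[1]), uri: lines[i + 1] });
    }
  }
  // Prefer the highest-bandwidth variant the connection can sustain;
  // if even the smallest variant is too heavy, fall back to it anyway.
  const affordable = variants.filter(v => v.bandwidth <= measuredBitsPerSecond);
  if (affordable.length === 0) {
    return variants.reduce((a, b) => (a.bandwidth < b.bandwidth ? a : b));
  }
  return affordable.reduce((a, b) => (a.bandwidth > b.bandwidth ? a : b));
}

// Hypothetical three-rendition master playlist.
const master = [
  "#EXTM3U",
  "#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360",
  "low/index.m3u8",
  "#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720",
  "mid/index.m3u8",
  "#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080",
  "high/index.m3u8",
].join("\n");

console.log(pickVariant(master, 3000000).uri); // picks mid/index.m3u8
```

Run the same function with a different bandwidth estimate between chunks and the chosen index file changes, which is the essence of adaptive bitrate delivery.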
HLS also supports closed captions embedded in the video stream. To learn more about HLS, we recommend the extensive documentation and best practices provided by Apple.
Review of video streaming protocols
Several companies have developed a variety of streaming solutions based on different media streaming protocols. Generally, each of these solutions has represented a new innovation in the field of video streaming. Much like the HD-DVD vs. Blu-ray format wars, or the older Betamax vs. VHS showdown, conflicts between competing formats inevitably arise. HLS is currently the best option among streaming media protocols, but it wasn’t always that way, nor will it remain so forever. Let’s review several past and current streaming protocols to better understand the innovations that the HLS streaming protocol offers today.
RTMP (Real-Time Messaging Protocol)

Real-Time Messaging Protocol (RTMP) is a standard originally developed by Macromedia in the mid-2000s for streaming audio and video, and it is frequently referred to simply as Flash. Adobe later acquired Macromedia and now develops RTMP as a semi-open standard.
For much of the past decade, RTMP was the default video streaming method on the internet. Only with the recent rise of HLS have we seen a decline in the usage of RTMP. Even today, most streaming video hosting services work with RTMP ingestion. In other words, you deliver your stream to your online video platform in RTMP stream format. From there, your OVP usually delivers your stream to your viewers via HLS.
In recent years, however, even this legacy use of RTMP streams has begun to fade. More and more CDNs (Content Delivery Networks) are beginning to deprecate RTMP support.
HDS (HTTP Dynamic Streaming)

Known as Adobe’s next-gen streaming protocol, HDS stands for HTTP Dynamic Streaming. Because it was designed for compatibility with Adobe’s Flash video browser plug-in, overall adoption of HDS has remained relatively small compared to HLS.
Here at DaCast, we use HDS to deliver some of our VOD (Video On Demand) content. For devices and browsers that do support Flash video, HDS can be a robust choice with lower latency. Like HLS, the HDS protocol splits media files into small chunks. HDS provides advanced encryption and DRM features. It also uses an advanced key frame method to ensure that chunks align with one another.
Microsoft Smooth Streaming
Microsoft Smooth Streaming (MSS) is Microsoft’s version of a live streaming protocol. Smooth Streaming also uses the adaptive bitrate approach, delivering the best quality available at any given time.
First introduced in 2008, MSS was one of the first adaptive bitrate methods to reach the public. The MSS protocol helped broadcast the 2008 Summer Olympics that year, and the most widely used MSS platform today is actually the Xbox One. Even so, MSS is one of the less popular streaming protocols around, and HLS should be considered the default method over this lesser-used approach.
MPEG-DASH

Last up, the newest entry in the streaming protocol format wars is MPEG-DASH. DASH stands for Dynamic Adaptive Streaming over HTTP.
MPEG-DASH comes with several advantages. First of all, it is the first international standard streaming protocol based on HTTP, which should help speed its widespread adoption. For the moment, MPEG-DASH is still new and isn’t widely used across the streaming industry. However, like the rest of the industry, we expect MPEG-DASH to become the de facto standard for streaming within a couple of years.
One major advantage of MPEG-DASH is that this protocol is “codec agnostic.” Simply put, this means that the video or media files sent via MPEG-DASH can utilize a variety of encoding formats. These encoding formats include widely supported standards like H.264, as well as next-gen video formats like HEVC/H.265 and VP10.
HLS streaming advantages
As this article highlights, HLS has a major advantage in terms of streaming video quality. Broadcasters can deliver streams using the adaptive bitrate process supported by HLS. That way, each viewer can receive the best quality stream for their internet connection at any given moment.
The HLS streaming protocol is also widely supported. Originally limited to iOS devices like iPhones, iPads, and the iPod Touch, HLS is now supported in Google Chrome, Safari, and Microsoft Edge, and on iOS, Android, Linux, Windows, and macOS platforms.
Takeaway: for now and at least the short-term future, HLS is the definitive default standard for live streaming content.
When to use HLS streaming?
We recommend adopting the HLS streaming protocol all of the time. It is the most up-to-date and widely used protocol for media streaming. It does have the one disadvantage we mentioned above: HLS has relatively higher latency than some other protocols, which means HLS streams are not quite as “live.” In fact, with HLS, viewers can experience delays of up to 30 seconds (or more, in some cases). However, for most broadcasters this isn’t a problem. The vast majority of live streams can handle a delay like that without causing any user dissatisfaction.
Streaming to mobile devices
HLS is mandatory for streaming to mobile devices and tablets. Given that mobile devices now make up the majority of internet traffic (around 75% of traffic in 2017), HLS is essential for reaching these users.
Streaming with an HTML5 video player
Native HTML5 video doesn’t support RTMP or HDS. Therefore, if you want to use a purely HTML5 video player, HLS is the only choice. Along with reaching mobile devices, these considerations point towards HLS as the default standard. If you’re stuck using Flash technology for the moment, RTMP will be a better delivery method—but only if you have no other option.
Building an RTMP-to-HLS workflow
If you’re using a service like DaCast for your online video platform, you’ll need to build a workflow that begins as RTMP. This is much simpler than it sounds. Essentially, you simply need to configure your hardware or software encoder to deliver an RTMP stream to the DaCast servers. Most encoders default to RTMP, and quite a few only support that standard.
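As one possible sketch of this step, a software encoder like FFmpeg can push an RTMP contribution stream to an ingest server along the following lines. The ingest URL and stream key are placeholders; use the values from your own DaCast encoder setup, and adjust the bitrates to match your content:

```shell
# Hypothetical example: push a live RTMP stream with FFmpeg.
# Replace the ingest URL and STREAM_KEY with your own values.
ffmpeg -re -i input.mp4 \
  -c:v libx264 -preset veryfast -b:v 2500k \
  -c:a aac -b:a 128k \
  -f flv rtmp://ingest.example.com/live/STREAM_KEY
```

Hardware encoders expose the same two essentials in their configuration screens: the RTMP ingest URL and the stream key.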
Our CDN partner, Akamai, ingests the RTMP stream and automatically rebroadcasts it via both HLS and RTMP. From there, users default to the best supported method on their own devices.
Using HLS is relatively straightforward. On DaCast, all live streams default to HLS delivery. On computers that support Flash, we do fall back on RTMP/Flash in order to reduce latency. However, HLS is supported automatically on every DaCast live stream, and used on almost all devices.
HLS streaming is delivered by means of an M3U8 file. This file is essentially a playlist that contains references to the locations of media files. On a local machine, these references would be file paths. For live streaming over the internet, an M3U8 file instead contains the URL on which your stream is being delivered.
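For illustration, a media playlist for a single quality level might look like the following sketch. The URLs and timings are hypothetical, but the tags follow the HLS specification:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
https://stream.example.com/live/segment0.ts
#EXTINF:10.0,
https://stream.example.com/live/segment1.ts
#EXTINF:10.0,
https://stream.example.com/live/segment2.ts
```

Each #EXTINF entry gives the duration of the chunk named on the next line; for a live stream, the server keeps appending new segments as they are encoded.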
Using an HTML5 video player
We’ve written extensively about the transition from Flash-based video (usually delivered via RTMP) to HTML5 video (usually delivered using HLS). Check out this blog for more on that subject, including why it’s important to use an HTML5 video player.
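As a sketch of what an HTML5 player setup can look like outside of a hosted platform, the snippet below uses the open-source hls.js library to play an HLS stream in a plain HTML5 video element, falling back to Safari’s native HLS support where available. The stream URL is a placeholder:

```
<video id="player" controls></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
  var video = document.getElementById("player");
  var src = "https://example.com/live/stream.m3u8"; // placeholder URL
  if (Hls.isSupported()) {
    // Most desktop browsers: hls.js fetches and parses the playlist.
    var hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
  } else if (video.canPlayType("application/vnd.apple.mpegurl")) {
    // Safari and iOS play HLS natively.
    video.src = src;
  }
</script>
```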
If you’re streaming over the DaCast platform, not to worry! You’re already using a fully compatible HTML5 video player. Content delivered via DaCast defaults to HTML5 delivery. However, it will use Flash as a backup method if HTML5 is not supported on a given device or browser. This means that even older devices will have no problem playing your content over your DaCast account.
Of course, some users may wish to use a custom video player. Luckily, it’s quite simple to embed your HLS stream in any video player. For example, if you’re using JW Player, just pass the M3U8 reference URL to the player’s setup code (the URL below is a placeholder):

var playerInstance = jwplayer("myElement");
playerInstance.setup({ file: "https://example.com/live/stream.m3u8" });
The future of live streaming
While HLS is the current gold standard for live streaming, it won’t stay that way indefinitely. We expect MPEG-DASH to become increasingly popular in the coming years.
As MPEG-DASH becomes more and more commonly used, we’ll see other changes as well, like the transition away from H.264 encoding to H.265/HEVC. This new compression standard provides much smaller file sizes, making 4K live streaming a real possibility.
However, that time hasn’t come yet. For now, it’s more important to stick with the established standards in order to reach as many users as possible.
Our goal in this article has been to introduce you to the HLS protocol for streaming media. We’ve discussed what HLS is, how it works, and when to use it. We’ve also reviewed some alternative options in terms of streaming protocols. After reading, we hope you now have a solid foundation in HLS streaming technology and its future.
To recap, HLS is widely supported, high-quality, and robust. All streamers should be familiar with the protocol, even if they don’t understand all the technical details. This is true for all kinds of streaming, including if you want to stream live video on your website via the DaCast online video platform.
You can do your first HLS live stream today with our video streaming solution. Take advantage of our free 30-day trial (no credit card required).