In the past, Adobe’s Flash video technology was the main method of delivering video via the internet. In recent years, however, there’s been a major shift in the world of online video. Specifically, online video delivered by protocols like HLS streaming and played by HTML5 video players has increasingly replaced Adobe’s Flash protocol.
For both broadcasters and viewers, this is a largely positive change. Why? First, HTML5 and HLS are open specifications, so developers can adapt them to their needs and anyone can use them free of charge. Additionally, these newer HTML5 and HLS technologies are safer, more reliable, and faster than the ones they replaced.
For content producers, HTML5 and HLS live streaming technologies also bring some major advantages. However, there are also some notable disadvantages within this realm of content production. In particular, it can take considerable time and effort to replace legacy systems and technologies with new standards, which may not work the same across all streaming platforms. As with all technological innovations, growing pains are inevitable.
To get you up to speed on these changes, we’ve framed this article for both longtime professional broadcasters and newcomers to streaming media. Whether you do live event streaming, or you want to stream live from your website, we’ve got you covered. Overall, our focus here is on HLS video streaming.
We’ll also discuss the role of HTML5 video streaming as it relates to HLS. In particular, we’ll cover basic streaming protocol definitions, survey other streaming protocols, and provide a detailed overview of the main topic of this post: what is HLS streaming and when should you use it?
Please note that we have updated this post to reflect the most current information about HLS streaming in April 2021. We have added information about new protocols, M3U8, HTML5 video players and more.
HLS stands for HTTP Live Streaming. In short, HLS is a media streaming protocol for delivering visual and audio media to viewers over the internet.
Apple first launched the HTTP Live Streaming (HLS) protocol in the summer of 2009. Apple timed this release to coincide with the debut of the iPhone 3GS. As you may recall, previous iPhone models had experienced many problems with streaming media online. These issues arose, at least in part, because those older devices often switched between Wi-Fi and mobile networks mid-stream.
Prior to the release of HLS, Apple used the QuickTime Streaming Server as its media streaming standard. Though a robust service, QuickTime used non-standard ports for data transfer, so firewalls often blocked its RTSP streams.
Combined with slow average internet speeds, these limitations doomed QuickTime Streaming Server. As a result, this early experiment in live streaming technology never reached a wide audience.
That said, HTTP Live Streaming ultimately drew from the lessons learned from creating and rolling out the Quicktime service.
We’ve covered the matter-of-fact definition of HLS, but before we move on to an equally technical overview of how this protocol works, we’re going to take a moment to go back to basics.
As we’ve mentioned, HLS is an important protocol for live streaming. The live streaming process that is compatible with the greatest number of devices and browsers looks a little something like this:
This process requires two main software solutions: a live video HLS encoder and a powerful video hosting platform. If you choose to stream with HLS, you’ll want to make sure that both pieces of software offer the protocols and features we’ve mentioned.
HTML5 video players powered by HLS are great for reaching the largest audience since this duo is practically universal. Dacast is a feature-rich live video streaming solution that includes HLS streaming and a customizable, white-label HTML5 video player.
Want to see the HLS streaming protocol in action? Sign up for your free trial of Dacast to try out HLS ingest and other powerful streaming features.
With that background in mind, how does HLS streaming technology work?
First, the HLS protocol chops up MP4 video content into short (10-second) chunks with the .ts file extension (MPEG2 Transport Stream). Next, an HTTP server stores those streams, and HTTP delivers these short clips to viewers on their devices.
HLS will play video encoded with the H.264 or HEVC/H.265 codecs.
The HTTP server also creates an M3U8 playlist file (i.e., a manifest file) that serves as an index for the video chunks. The server generates this manifest even if you choose to broadcast at only a single quality level.
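To make this concrete, here is a sketch of what a master M3U8 playlist with three quality renditions can look like. The bandwidth values, resolutions, and file paths are illustrative placeholders, not output from any particular encoder:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1920x1080
high/index.m3u8
```

Each #EXT-X-STREAM-INF entry points to a quality-specific playlist, which in turn lists the individual .ts video chunks for that rendition.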
Now, let’s consider how playback quality works with HLS video streaming. With this protocol, a given user’s video player software (like an HTML5 video player) detects deteriorating or improving network conditions.
If either occurs, the player software first reads the main index playlist and determines which quality video is ideal. Then the software reads the quality-specific index file to determine which chunk of video corresponds to the point at which the viewer is watching. If you’re streaming with Dacast, you can use your M3U8 online player to test your HLS stream. Though this may sound technically complex, the entire process is seamless for the user.
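The variant-selection step above can be sketched in a few lines of JavaScript. This is a simplified illustration, not the actual logic of any particular player (real players such as hls.js use more sophisticated heuristics like buffer health); the variant list mirrors the entries of a hypothetical master playlist:

```javascript
// Pick the highest-bandwidth rendition the viewer's connection can sustain.
// Falls back to the lowest rendition if none fit the measured bandwidth.
function pickVariant(variants, measuredBps) {
  // Sort renditions from lowest to highest bandwidth.
  const sorted = [...variants].sort((a, b) => a.bandwidth - b.bandwidth);
  let best = sorted[0]; // default to the lowest quality
  for (const v of sorted) {
    if (v.bandwidth <= measuredBps) best = v;
  }
  return best;
}

// Hypothetical renditions, as they might appear in a master M3U8 playlist.
const variants = [
  { bandwidth: 800000, uri: "low/index.m3u8" },
  { bandwidth: 1400000, uri: "mid/index.m3u8" },
  { bandwidth: 2800000, uri: "high/index.m3u8" },
];

console.log(pickVariant(variants, 2000000).uri); // "mid/index.m3u8"
```

The player re-runs this kind of check as it downloads each chunk, which is what makes the quality switch mid-stream without interrupting playback.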
One key benefit of this protocol relates to its compatibility features. Unlike other streaming formats, HLS is compatible with a wide range of devices and firewalls. However, latency (or lag time) tends to be in the 15-30 second range with HLS live streams.
This is certainly an important factor to keep in mind. Note that Dacast now offers an HLS direct low latency streaming feature, which works with any HLS-compatible encoder.
When it comes to quality, versatility makes HLS video streaming stand out from the pack. On the server-side, content creators often have the option to encode the same live stream at multiple quality settings.
In turn, viewers can dynamically request the best option available, given their specific bandwidth at any given moment. In other words, from chunk to chunk the data quality can differ to fit different streaming device capabilities.
This is best explained with an example. Let’s say, in one moment, you’re sending a full high-definition video. Moments later, a mobile user encounters a “dead zone” in which their quality of service declines. With HLS streaming, this is not an issue. The player will detect this decline in bandwidth and instead deliver lower-quality movie chunks at this time.
Several companies have developed a variety of streaming solutions through the use of media streaming protocols. Generally, each of these solutions has represented a new innovation in the field of video streaming. However, similar to the HD-DVD vs. Blu-Ray format wars, or the even older Betamax vs. VHS showdown, industry conflicts can arise.
HLS is currently the best option for streaming media protocols, but it wasn’t always that way—nor will it remain so forever. Let’s review several past and current streaming protocols to better understand the innovations that the HLS streaming protocol offers today.
Known as Adobe’s next-gen streaming protocol, HDS stands for HTTP Dynamic Streaming. This protocol was designed specifically for compatibility with Adobe’s Flash video browser plug-in. Because it depends on that declining plug-in, overall adoption of HDS is relatively small compared to HLS.
Here at Dacast, we use HDS to deliver some of our VOD (Video On Demand) content. For devices and browsers that do support Flash video, HDS can be a robust choice with lower latency. Like HLS, the HDS protocol splits media files into small chunks. HDS also provides advanced encryption and DRM features. Finally, it uses an advanced keyframe method to ensure that chunks align with one another.
Macromedia developed RTMP (Real-Time Messaging Protocol) in the mid-2000s. Designed for streaming both audio and video, many know this protocol simply as Flash. Macromedia later merged with Adobe, which now develops RTMP as a semi-open standard.
For much of the past decade, RTMP was the default video streaming method on the internet. But with the recent rise of HLS, we’ve seen a decline in the usage of RTMP. Even today, most streaming video hosting services ingest live streams from RTMP encoders and then deliver them via HLS. In other words, broadcasters deliver their streams to their chosen online video platform in RTMP format. Then, the OVP usually delivers those streams to viewers via HLS, and that includes in-China video hosting, which Dacast now offers.
In recent years, even this legacy use of RTMP streams is beginning to fade. More and more CDNs (Content Delivery Networks) are beginning to deprecate RTMP support.
Next up is streaming protocol MSS (Microsoft Smooth Streaming). As the name implies, it’s Microsoft’s version of a live streaming protocol. Smooth Streaming also uses the adaptive bitrate approach, delivering the best quality available at any given time.
First introduced in 2008, MSS was one of the first adaptive bitrate methods to hit the public realm. In fact, MSS protocol helped to broadcast the 2008 Summer Olympics that year. The most widely-used MSS platform today is the Xbox One. However, MSS is one of the less popular streaming protocols available today. In almost all cases, HLS should be the default method over this lesser-used approach.
MPEG-DASH comes with several advantages. First of all, it is the first international standard streaming protocol based on HTTP, a status that should help speed widespread adoption. For now, MPEG-DASH is still a relatively new protocol and isn’t widely used across the streaming industry. However, like the rest of the industry, we expect MPEG-DASH to become the de facto standard for streaming within a couple of years.
One major advantage of MPEG-DASH is that this protocol is “codec agnostic.” Simply put, this means that the video or media files sent via MPEG-DASH can utilize a variety of encoding formats. These encoding formats include widely supported standards like H.264 (as with the HLS streaming protocol), as well as next-gen video formats like HEVC/H.265 and VP9. And like HLS, MPEG-DASH is an adaptive bitrate streaming video method.
For a more detailed comparison, you can also review this blog post on MPEG-DASH versus HLS streaming protocols.
Real-Time Streaming Protocol, or RTSP for short, is a protocol that manages and controls live stream sessions rather than actually transmitting the content itself. It is considered an application-level control protocol.
It is a fairly old protocol, originally developed in the late 1990s through a collaboration between Columbia University, RealNetworks, and Netscape.
RTSP is known for having extremely low latency, which is certainly a plus.
Unfortunately, this protocol comes with a slew of limitations. For starters, when comparing RTSP vs RTMP, RTSP is neither as compatible with nor as adaptable to modern video players and devices. Unlike RTMP, it cannot stream over HTTP in a web browser, nor is it easy to scale.
In the first half of this article, we covered a major advantage of HLS over other protocols in terms of streaming video quality. In particular, broadcasters can deliver streams using the adaptive bitrate process supported by HLS. That way, each viewer can receive the best quality stream for their internet connection at any given moment.
This protocol includes a number of other key benefits, including embedded closed captions, synchronized playback of multiple streams, good advertising standards support, DRM support, and more.
The takeaway here for broadcasters? For now and at least the shorter-term future, HLS is the definitive default standard for live streaming content.
The HLS streaming protocol is also widely supported across multiple devices and browsers.
Originally limited to iOS devices like iPhones, iPads, and the iPod Touch, HLS is now supported by virtually all major browsers, desktop operating systems, mobile devices, and smart TVs.
At this point, HLS is a nearly universal protocol.
Currently, we recommend that broadcasters adopt the HLS streaming protocol all of the time. It is the most up-to-date and widely used protocol for media streaming. In this Video Streaming Latency Report, for example, 45% of broadcasters reported using HLS streaming.
RTMP came in second with 33% of broadcasters using that alternative. And MPEG-DASH trailed behind even further, used by only 7% of broadcasters.
Remember, native HTML5 video doesn’t support RTMP or HDS. Therefore, if you want to use a purely HTML5 video player, HLS is the only choice.
Along with reaching mobile devices, these considerations point towards HLS as the default standard. If you’re stuck using Flash technology for the moment, RTMP remains a workable delivery method, but only if you have no other option.
HLS streaming does have one disadvantage, which we mentioned above. Namely, it has a relatively higher latency than some other protocols. This means that HLS streams are not quite as “live” as the term live streaming suggests.
Generally, with HLS viewers can experience delays of up to 30 seconds (or more, in some cases). That said, this isn’t a problem for most broadcasters. The vast majority of live streams can handle that kind of delay without causing any sort of user dissatisfaction.
One protocol that works well to reduce latency alongside HLS video streaming is Low-Latency CMAF for DASH. It works with the content delivery network and the HTML5 video player to pick up the slack where HLS falls short on latency.
So we’ve covered what HLS is, how it works, and when to use it. We’ve also looked at alternative streaming protocols from the past and present. Now, let’s talk through how to build an RTMP Ingest to HLS workflow.
If you’re using a streaming service like Dacast, you need to build a workflow that begins as RTMP. This is much simpler than it sounds. Essentially, you simply need to configure your hardware or software encoder to deliver an RTMP stream to the Dacast servers. Most encoders default to RTMP, and quite a few only support that standard.
For Dacast users, our CDN partners then ingest the RTMP stream and automatically rebroadcast it via both HLS and RTMP. From there, viewers default to the best-supported method on their own devices.
Using HLS is relatively straightforward with a professional, full-service OVP. On Dacast, all live streams default to HLS delivery. On computers that support Flash, we do fall back on RTMP/Flash in order to reduce latency. However, HLS is supported automatically on every Dacast live stream and used on almost all devices.
As we discussed above, HLS streaming is delivered through an M3U8 file. This file is a kind of playlist that contains references to the location of media files. On a local machine, these would consist of file paths. For live streaming on the internet, that M3U8 file would contain a URL (the one on which your stream is being delivered).
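As a sketch, a live media playlist for one rendition might look like the following. The segment names, URLs, and sequence numbers are illustrative placeholders:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:142
#EXTINF:10.0,
https://stream.example.com/live/segment142.ts
#EXTINF:10.0,
https://stream.example.com/live/segment143.ts
#EXTINF:10.0,
https://stream.example.com/live/segment144.ts
```

During a live broadcast, the server continually appends new segment entries to this playlist, and the player polls it to discover the latest chunks.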
Another relevant process to quickly note is transmuxing. Transmuxing is the process that repackages content files without distorting the content itself. This allows the content to flow more easily between software via the RTMP and HLS protocols.
As mentioned above, the HLS protocol has become the go-to approach for streaming content with HTML5 video players. If you’re not familiar with HTML5 video streaming, it’s one of the three main approaches to video streaming today.
With HTML5, the content-hosting website uses native HTTP to stream the media directly to viewers. Content tags (e.g., <video> tag) are included as part of the HTML code.
As a result, the <video> tag creates a native HTML5 video player within your browser. These tags tell the browser what to do with each type of content: video plays in the native video player, text displays directly on the page, and audio plays through a native audio player.
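A minimal page using the native <video> tag might look like this. The stream URL is a placeholder; note that Safari and iOS play HLS sources natively, while most other browsers rely on an HLS-capable player library layered on top of the tag:

```html
<!-- Minimal native HTML5 video player; the src is a placeholder URL. -->
<video controls width="640">
  <!-- application/vnd.apple.mpegurl is the standard MIME type for HLS. -->
  <source src="https://stream.example.com/live/stream.m3u8"
          type="application/vnd.apple.mpegurl">
  Your browser does not support the HTML5 video tag.
</video>
```

The fallback text inside the tag only appears on browsers with no HTML5 video support at all.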
Like HLS, HTML5 is customizable for broadcasters and free for viewers. You can check out our related post on optimizing HTML5 video players with HLS to learn more.
We’ve also written extensively about the transition from Flash-based video (usually delivered via RTMP) to HTML5 video (usually delivered using HLS). Check out our “Flash is Dead” blog post for more on that subject, including why it’s important to use an HTML5 video player.
If you’re streaming over Dacast, you’re already using a fully compatible HTML5 video player. Content delivered via Dacast defaults to HTML5 delivery. However, it will use Flash as a backup method if HTML5 is not supported on a given device or browser. This means that even older devices will have no problem playing your content over your Dacast account.
Of course, some broadcasters may prefer to use a custom video player. Luckily, it’s quite simple to embed your HLS stream within any video player. For example, if you’re using JW Player, simply insert the M3U8 reference URL into the setup code for your video player. Here’s an example (the stream URL is a placeholder):
var playerInstance = jwplayer("myElement");
playerInstance.setup({ file: "https://stream.example.com/live/stream.m3u8" });
Another note about using HLS and an HTML5 video player with Dacast is that Dacast uses the THEOplayer. THEOplayer is a universal video player that can be embedded in websites, mobile apps, and pretty much any platform that you can think of.
As we’ve mentioned before, compatibility is key when choosing video players and protocols since you want to be able to reach the greatest number of people.
Before wrapping things up, let’s recap our discussion of some of the advantages of the HLS streaming protocol. First, there is no special infrastructure required to deliver HLS content. In fact, any standard web server or CDN will function well. Additionally, firewalls are much less likely to block content using HLS.
In terms of technical functionality, HLS will play video encoded with the H.264 or HEVC/H.265 codecs. It then chops the video into 10-second segments. Remember, latency for delivery tends to be in the 30-second range. However, Dacast now has a solution for low-latency HLS live streaming that reduces latency to 10 seconds or less.
The HLS protocol also includes several other built-in features. For example, HLS is an adaptive bitrate streaming protocol. This means that the client device and server dynamically detect the internet speed of the user, and then adjust video quality in response.
Other beneficial HLS features include support for embedded closed captions, synchronized playback of multiple streams, advertising standards (i.e., VPAID and VAST), DRM, and more.
While HLS is the current gold standard for live streaming, it won’t stay that way indefinitely. We expect MPEG-DASH to become increasingly popular in the coming years.
As that shift takes place, we’ll see other changes as well, such as the transition away from H.264 encoding to H.265/HEVC. This new compression standard provides much smaller file sizes, making 4K live streaming a real possibility.
However, that time isn’t here yet. For now, it’s important to stick with the established standards to reach as many users as possible on their devices. In other words, HLS is the streaming protocol of the present.
Today, HLS is widely supported, high-quality, and robust. All streamers should be familiar with the protocol, even if they don’t understand all of the technical details. This is true for all kinds of streaming, including live broadcasting over the Dacast live streaming platform.
Our goal in this article was to introduce you to the HLS protocol for streaming media. We’ve discussed what HLS is, how it works, and when to use it, as well as how it compares to other streaming protocols out there. After reading, we hope you have a solid foundation in HLS streaming technology and its future.
You can do your first HLS live stream today with our video streaming solution. If you’re ready to try it, you can take advantage of our free 30-day trial. No credit card required.
And for exclusive offers and regular live streaming tips, you’re also invited to join our LinkedIn group.
Finally, do you have further questions, thoughts, or feedback about this article? We’d love to hear from you in the comments below, and we will get back to you. Thanks for reading!