What is HLS Streaming and when should you use it?

Until fairly recently, Adobe’s Flash video technology had been the main method of delivering video via the internet. Today, however, there’s a major shift taking place in the world of online video. Over the past decade, Adobe’s Flash protocol has been replaced increasingly by video delivered using protocols like HLS streaming and played in HTML5 video players.

For broadcasters and viewers alike, this is a largely positive change. HTML5 and HLS are open specifications, which means anyone can access them free of cost and adapt them to their own needs. These newer HTML5 and HLS streaming technologies are also safer, more reliable, and faster than earlier approaches.

For content producers, there are also some major advantages to using these new live streaming technologies. There are downsides, too, particularly the work involved in replacing legacy systems and technologies with new standards that may not behave the same across all platforms. As with all technological innovations, growing pains are inevitable.

To get you up to speed on these changes, we’ve geared this article at both longtime broadcasters and newcomers to streaming media, all with a focus on HLS streaming. Our goal is to make this content relevant for all kinds of streamers. Whether you do live streaming of sports events or you want to stream live video on your website, we hope you’ll find it useful! We’ll cover basic streaming protocol definitions, discuss other streaming protocols, and, of course, answer the question posed in the title of this article: what is HLS streaming, and when should you use it?

What is HLS?

HLS stands for HTTP Live Streaming. Put succinctly, HLS is a media streaming protocol for delivering visual and audio media to viewers over the internet.

The HLS streaming protocol chops up MP4 video content into short, 10-second chunks. HTTP then delivers these short clips to viewers. This approach makes HLS compatible with a wide range of devices and firewalls. Latency (or lag time) for HLS live streams compliant with the specification tends to be in the 15-30 second range. This is certainly an important factor to keep in mind.

When it comes to quality, HLS streaming stands out from the pack. On the server side, content creators often have the option to encode the same live stream at multiple quality settings. In turn, players can dynamically request the best option available, given their specific bandwidth at any given moment. From chunk to chunk, the data quality can differ.

For example, in one moment you might be sending full high-definition video. Moments later, a mobile user may encounter a “dead zone” in which their quality of service declines. The player can detect this decline in bandwidth and begin delivering lower-quality movie chunks at this time. The point of all this? HLS streaming reduces buffering, stuttering, and other problems.

HLS streaming format history

Apple originally launched the HLS streaming protocol in summer 2009, timing the release to coincide with the debut of the iPhone 3GS. Previous iPhone models had experienced many problems with streaming media online, partially because these devices often switched between Wi-Fi and mobile networks mid-stream.

Prior to the release of HLS, Apple used the QuickTime Streaming Server as its media streaming standard. Though it was a robust service, QuickTime used non-standard ports for data transfer, so firewalls often blocked its RTSP protocol. Combined with slow average internet speeds, these limitations doomed QuickTime Streaming Server. As a result, this early experiment in live streaming technology never reached a wide audience. That said, HTTP Live Streaming ultimately drew on the lessons learned from creating and rolling out the QuickTime service.

Technical overview

HLS streams are generated on the fly, and an HTTP server stores those streams. As we’ve mentioned above, the protocol splits video files into short segments with the .ts file extension (standing for MPEG-2 Transport Stream).

The HTTP server also creates an .M3U8 playlist file (i.e., a manifest file) that serves as an index for the video chunks. This master playlist points towards additional index files for each of the available quality options. Even when you choose to broadcast using only a single quality option, this file will still exist.
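
For illustration, a master playlist with three quality options might look roughly like the sketch below (the bitrates, resolutions, and file names are placeholders, not values from any particular platform):

#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
medium/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/index.m3u8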

A given user’s video player software can detect deteriorating or improving network conditions. If either occurs, the player software reads the main index playlist, determines which quality level is ideal, and then reads the quality-specific index file to determine which chunk of video corresponds to where the viewer is watching. Best of all, the entire process is seamless for the user.
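
Each quality-specific index file, in turn, lists the individual .ts chunks in playback order. A minimal sketch of a live media playlist might look like this (segment names and durations are illustrative only):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:142
#EXTINF:10.0,
segment142.ts
#EXTINF:10.0,
segment143.ts
#EXTINF:10.0,
segment144.ts

For a live stream, the server keeps appending new segment entries to this playlist, and the player periodically re-fetches it to discover them.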

HLS also supports closed captions embedded in the video stream. To learn more about HLS, we recommend the extensive documentation and best practices provided by Apple.

Review of video streaming protocols

Several companies have developed a variety of streaming solutions built on media streaming protocols. Generally, each of these solutions has represented a new innovation in the field of video streaming. As with the HD-DVD vs. Blu-ray format wars, or the older Betamax vs. VHS showdown, conflicts between competing standards nonetheless arise. HLS is currently the best option among streaming media protocols, but it wasn’t always that way, nor will it remain so forever. Let’s review several past and current streaming protocols to better understand the innovations that the HLS streaming protocol offers today.

RTMP

Real-Time Messaging Protocol (RTMP) is a standard originally developed by Macromedia in the mid-2000s for streaming audio and video. This protocol is frequently referred to simply as Flash. Macromedia later merged with Adobe, which now develops RTMP as a semi-open standard.

For much of the past decade, RTMP was the default video streaming method on the internet. Only with the recent rise of HLS have we seen a decline in the usage of RTMP. Even today, most streaming video hosting services work with RTMP ingestion. In other words, you deliver your stream to your online video platform in RTMP stream format. From there, your OVP usually delivers your stream to your viewers via HLS.

In recent years, however, even this legacy use of RTMP streams is beginning to fade. More and more CDNs (Content Delivery Networks) are beginning to deprecate RTMP support.

HDS

Known as Adobe’s next-gen streaming protocol, HDS stands for HTTP Dynamic Streaming. It was designed for compatibility with Adobe’s Flash video browser plug-in, and its overall adoption is relatively small compared to HLS.

Here at DaCast, we use HDS to deliver some of our VOD (Video On Demand) content. For devices and browsers that do support Flash video, HDS can be a robust choice with lower latency. Like HLS, the HDS protocol splits media files into small chunks. HDS provides advanced encryption and DRM features. It also uses an advanced key frame method to ensure that chunks align with one another.

Microsoft Smooth Streaming

Microsoft Smooth Streaming (MSS) is Microsoft’s take on a live streaming protocol. Smooth Streaming also uses the adaptive bitrate approach, delivering the best quality available at any given time.

First introduced in 2008, MSS was one of the first adaptive bitrate methods to hit the public realm. The MSS protocol helped broadcast the 2008 Summer Olympics that year. The most widely used MSS platform today is actually the Xbox One. Overall, however, MSS is one of the less popular streaming protocols around, and HLS should be considered the default method over this lesser-used approach.

MPEG-DASH

Last up, the newest entry in the streaming protocol format wars is MPEG-DASH. DASH stands for Dynamic Adaptive Streaming over HTTP.

MPEG-DASH comes with several advantages. First of all, it is the first international standard streaming protocol based on HTTP, a feature that should help speed up widespread adoption. For the moment, however, MPEG-DASH is still a new protocol and isn’t widely used across the streaming industry. Like the rest of the industry, though, we expect MPEG-DASH to become the de facto standard for streaming within a couple of years.

One major advantage of MPEG-DASH is that this protocol is “codec agnostic.” Simply put, this means that the video or media files sent via MPEG-DASH can utilize a variety of encoding formats. These encoding formats include widely supported standards like H.264, as well as next-gen video formats like HEVC/H.265 and VP10.

HLS streaming advantages

As this article highlights, HLS has a major advantage in terms of streaming video quality. Broadcasters can deliver streams using the adaptive bitrate process supported by HLS. That way, each viewer receives the best quality stream for their internet connection at any given moment.

The HLS streaming protocol is also widely supported. Originally limited to iOS devices like iPhones, iPads, and the iPod Touch, HLS playback is now supported in Google Chrome, Safari, and Microsoft Edge, and on iOS, Android, Linux, Windows, and macOS platforms.

Takeaway: For now, and at least for the short-term future, HLS is the definitive default standard for live streaming content.

When should you use HLS streaming?

We recommend adopting the HLS streaming protocol in all cases. It is the most up-to-date and widely used protocol for media streaming. It does have one disadvantage, which we mention above: HLS has relatively higher latency than some other protocols, which means HLS streams are not quite as “live.” In fact, with HLS, viewers can experience delays of up to 30 seconds (or more, in some cases). However, for most broadcasters this isn’t a problem. The vast majority of live streams can handle a delay like that without causing any sort of user dissatisfaction.

Streaming to mobile devices

HLS is mandatory for streaming to mobile devices and tablets. Given that mobile devices now make up the majority of internet traffic (around 75% of traffic in 2017), HLS is essential for reaching these users.

Streaming with an HTML5 video player

Native HTML5 video doesn’t support RTMP or HDS. Therefore, if you want to use a purely HTML5 video player, HLS is the only choice. Along with reaching mobile devices, these considerations point towards HLS as the default standard. If you’re stuck using Flash technology for the moment, RTMP will be a better delivery method—but only if you have no other option.

Building an RTMP -> HLS workflow

If you’re using a service like DaCast for your online video platform, you’ll need to build a workflow that begins with RTMP. This is much simpler than it sounds. Essentially, you just need to configure your hardware or software encoder to deliver an RTMP stream to the DaCast servers. Most encoders default to RTMP, and quite a few only support that standard.
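
As a rough illustration, a software encoder such as FFmpeg (just one option among many) can push an RTMP stream with a command along these lines; the ingest URL and stream key here are placeholders that your streaming platform would supply:

ffmpeg -re -i input.mp4 \
  -c:v libx264 -b:v 2500k -preset veryfast -g 60 \
  -c:a aac -b:a 128k \
  -f flv rtmp://ingest.example.com/live/your-stream-key

The -re flag pushes a local file in real time, which is handy for testing; in production, a camera or capture source would replace the file input.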

Our CDN partner, Akamai, ingests the RTMP stream and automatically rebroadcasts it via both HLS and RTMP. From there, users default to the best supported method on their own devices.

Using HLS is relatively straightforward. On DaCast, all live streams default to HLS delivery. On computers that support Flash, we do fall back on RTMP/Flash in order to reduce latency. However, HLS is supported automatically on every DaCast live stream, and used on almost all devices.

HLS streaming is delivered by means of an M3U8 file. This file is essentially a playlist that contains references to the locations of media files. On a local machine, these would be file paths. For live streaming on the internet, an M3U8 file will contain a URL: the URL on which your stream is being delivered.
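
For example, if you were wiring up playback yourself rather than using a hosted player, a minimal sketch using the open-source hls.js library might look like the following (the element ID and stream URL are placeholders):

var video = document.getElementById("myVideo");
var streamUrl = "https://example.com/live/myStream.m3u8"; // placeholder M3U8 URL
if (window.Hls && Hls.isSupported()) {
    // Browsers without built-in HLS support play the stream via Media Source Extensions
    var hls = new Hls();
    hls.loadSource(streamUrl);
    hls.attachMedia(video);
} else if (video.canPlayType("application/vnd.apple.mpegurl")) {
    // Safari and iOS can play the M3U8 URL natively
    video.src = streamUrl;
}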

Using an HTML5 video player

We’ve written extensively about the transition from Flash-based video (usually delivered via RTMP) to HTML5 video (usually delivered using HLS). Check out this blog for more on that subject, including why it’s important to use an HTML5 video player.

If you’re streaming over the DaCast platform, not to worry! You’re already using a fully compatible HTML5 video player. Content delivered via DaCast defaults to HTML5 delivery. However, it will use Flash as a backup method if HTML5 is not supported on a given device or browser. This means that even older devices will have no problem playing your content over your DaCast account.

Of course, some users may wish to use a custom video player. Luckily, it’s quite simple to embed your HLS stream in any video player. If you’re using JW Player, just insert the M3U8 reference URL into the code for your video player, like so:

var playerInstance = jwplayer("myElement");
playerInstance.setup({
    // URL (or local path) of the HLS playlist to play
    file: "/assets/myVideoStream.m3u8",
    // Poster image shown before playback starts
    image: "/assets/myPoster.jpg"
});

The future of live streaming

While HLS is the current gold standard for live streaming, it won’t stay that way indefinitely. We expect MPEG-DASH to become increasingly popular in the coming years.

As MPEG-DASH becomes more and more commonly used, we’ll see other changes as well, like the transition away from H.264 encoding to H.265/HEVC. This newer compression standard provides much smaller file sizes, making 4K live streaming a real possibility.

However, that time hasn’t come yet. For now, it’s more important to stick with the established standards in order to reach as many users as possible.

Conclusion

Our goal in this article has been to introduce you to the HLS protocol for streaming media. We’ve discussed what HLS is, how it works, and when to use it. We’ve also reviewed some alternative options in terms of streaming protocols. After reading, we hope you now have a solid foundation in HLS streaming technology and its future.

To recap, HLS is widely supported, high-quality, and robust. All streamers should be familiar with the protocol, even if they don’t understand all the technical details. This is true for all kinds of streaming, including if you want to stream live video on your website via the DaCast online video platform.

You can do your first HLS live stream today with our video streaming solution. Take advantage of our free 30-day trial (no credit card required).


For exclusive offers and regular live streaming tips, you’re also welcome to join our LinkedIn group.

Have further questions, thoughts, or feedback about this article? We’d love to hear from you in the comments below. Thanks for reading, and good luck with your events.

By Max Wilbert.

13 thoughts on “What is HLS Streaming and when should you use it?”

  1. Robert Ballantyne says:

    Thanks for this very useful article!

    Our group is interested in creating live broadcasts (actually narrowcasts… we won’t have a large audience, nor could we afford too many streams). The concept of any kind of ‘casting conjures up the vision of old-fashioned radio and television broadcasting. In 2017 we expect to be directing our programming to computers and mobile devices — not television sets. We’d like our content to be interactive. In this regard we’d like to include a chat field along with the broadcast.

    So, my concern is your reference to ‘latency.’ You wrote, “Latency for HLS live streams compliant with the specification tends to be in the 15-30 second range.” That means the viewer commenting in the chat field is writing about something that happened many seconds ago. And that explains why the comments in the chat are far out of sync, and therefore out of context, with the broadcast.

    I don’t understand why this has to be the case. Our international group now meets online in video conferences several times each week. We are using Fuze.com (which I think is based on the Vidyo router technology) and Zoom.us (I don’t know how it works). Yesterday we had participants in the USA, Canada, and New Zealand. The video and audio were excellent (near HD). The latency was about the equivalent of 3 or 4 film frames (almost, but not quite, lip-sync).

    While it lasted, we enjoyed using Blab until the developers decided to stop providing that service. Now we are exploring Firetalk.

    Somehow these platforms can provide streaming live programming with just a tiny amount of latency. I would much prefer to use a service such as Dacast to assemble the kind of broadcast we envision (international video interview format + display content from a computer + the possibility of interacting with a chat field) and stream the program from our own website. I am not understanding why this is difficult (or impossible) and expensive.

    • DaCast says:

      Thank you for reading us, Robert!

      Broadcasting live video on the internet requires owning a huge network of servers worldwide. That alone explains the cost of this technology. Data centers cost a lot of money, way more than anyone would imagine.

      As for the 15-30 second delay, I do not understand your point here. The delay is between what the camera is filming and what the viewers are watching. Since the viewers are commenting on what they see, there is no delay on that side. A viewer will see an image and will comment on it instantly. If there is a 20-second delay between the ingest and the viewer, there will be the same delay with the chat.

      Anyway, you can absolutely use a professional video platform such as DaCast for the live video part of your project. You can even pair it with chat clients such as Chatwing. This solution works very well and is used daily by many of our customers.

      • Dacast Customer says:

        Robert is talking about people interacting with the broadcast via chat.
        Have you ever seen those old live TV programs where people watching call in by phone and are live with the presenter? Typically quiz shows.
        Or live contests broadcast on TV, where the presenter “opens” and “closes” the voting windows on a website or a phone center.

        In our case, it’s a broadcast of a conference with a Q&A session: people in the auditorium ask questions by raising their hands and receiving a mic, while people watching the live stream send questions via chat.
        A 30-second delay is disruptive: the speaker receives questions about items discussed 30 seconds ago, and the answer arrives another 30 seconds later…
        Moreover, when you ask, “Any other questions?”, you have to have the entire auditorium waiting silently for 30 seconds to be sure even the people watching remotely have no more questions… imagine 200 people silent in an auditorium…

        This is a real issue. We gave up on Ustream, even though it was free, because its delay increased over time, reaching nearly two minutes after a 5-hour conference… DaCast HTML5 currently stays under 30 seconds, but I wouldn’t recommend it for critical conferences. The old DaCast Flash RTMP was below 10 seconds…

        • DaCast says:

          You answered your own question in the way you formulated it. RTMP streaming simply isn’t a solution for multi-way streaming, as the conversion from RTMP to HLS already takes 15-20 seconds. Maybe using conferencing software or a conferencing platform would be smarter, as they are developed for this purpose on a totally different type of technology.
          On24.com is a very robust choice, being used by major companies. It’s just that the $7K/month isn’t affordable for everyone 🙂

          • Dacast Customer says:

            I beg to disagree.
            We are not looking for an interactive webinar, but for a broadcast with a small return channel. Even in the auditorium, it takes 4-5 seconds between the time someone raises his or her hand and the time an assistant with the mic reaches them. Since the return channel (the chat) has no delay, a 4-5 second delay is “live”.
            In non-critical conferences (such as free-of-charge ones or those among internal members), even 10-15 seconds is acceptable. RTMP-only streaming (such as the old DaCast Flash one) was “live enough”.
            The introduction of HLS for the broadcasting leg, as you said, adds those 15-20 disruptive seconds. We are still using DaCast for our internal free-of-charge conferences with members, because it’s a value-for-money solution.
            I had a look at On24.com, but it’s well more than we need; it’s closer to Adobe Connect or Cisco WebEx than to a conference with a chat. We tested livestream.com and it stayed within a 5-second delay. Unfortunately, to have control over the embedding location, so as to restrict access to members only, you need a $799/month contract: 1/10 of On24.com, but 15 times the 2TB DaCast event pricing.

          • DaCast says:

            You got the idea! On24 quoted me a $7K plan the last time I called their sales number. Livestream is based on the exact same technology as we are, but they charge a lot more because of their branding and expenses.

            One solution you should try is to go with Flash instead of HTML5. The delays are much shorter (a few seconds only), so it would be a good fit if you are not trying to reach mobile users (as Flash doesn’t work on all smartphones and tablets).

            The way DaCast has been developed, it is unfortunately not possible to get a return channel.

          • DaCast says:

            Or, as I was saying on another thread, it is always possible to handle the inputs and outputs before going through the video encoder. You could, for instance, conduct a Skype interview and then use that window as an input source in your encoder. That way, your viewers would be able to see video coming from multiple sources.

  2. Graham Lowe says:

    Thanks for the information. We are using DaCast for our live stream with Adobe FMLE, and after reading this article I decided I had better plan an upgrade to get away from Flash. I spent a considerable amount of time and found a device that would fit into our environment nicely and streams both RTMP and HLS. Great, so I added this device to our budget and plans. However, I just went to see how easy it would be to configure HLS for use with a DaCast live stream and was a bit shocked to see it’s not supported by DaCast, just RTMP. Now I am a bit confused about this article, unless DaCast is rolling out HLS support for live streaming soon?

    • DaCast says:

      When live streaming as “HTML5” (as opposed to Flash), we already ingest an RTMP signal and convert it to HLS for delivery! That alone explains the 20-second delay between the broadcast and what is shown in the player. This is mandatory for anyone wanting to broadcast to smartphones and tablets.
      In short: HLS has been here at DaCast for years, but it happens via an RTMP ingest (HLS encoders being way too overpriced for the moment; we are still discussing an alternative here, to ingest HLS and therefore greatly reduce the delay).

  3. Gordon Barker says:

    Many thanks, very informative, and I really appreciate your article. Can we get more technical around HLS streaming, such as what streaming servers and what client software can be used?

  4. John Parulis says:

    Dacast excels in the field of live stream services with great tech articles like this. Good to have an understanding of HLS….though I wish the latency could be improved a bit. Thanks guys! My clients are happy with our streaming work, thanks in part to you.
