Using HTTP Live Streaming (HLS) for live and on-demand audio and video
This article introduces the most relevant aspects of HTTP Live Streaming (HLS) and the toolchain Apple provides for live and on-demand media.
On-demand content is on the rise, and whether it is audio or video, streaming technologies are at the core of any modern multimedia application or service. Among the many streaming protocols to choose from, HTTP Live Streaming (HLS) is a popular format for HTTP-based adaptive bitrate streaming. It was developed by Apple and released in 2009. As of 2017, the seventh version of the protocol is documented in RFC 8216.
Like similar standards, HLS works by breaking a media stream into a sequence of small media segments, each delivered as an ordinary HTTP file download. An extended M3U playlist file directs the client to the media segments. Video can be encoded with H.264, and audio is supported in AAC, MP3, AC-3, or EC-3. The stream is divided into segments of roughly equal duration, and live events are accommodated by continuously appending new media segments to the playlist file.
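To make this concrete, here is what a minimal on-demand media playlist might look like. The segment filenames and durations are illustrative, not taken from a real service; the tags (`#EXTM3U`, `#EXT-X-TARGETDURATION`, `#EXTINF`, `#EXT-X-ENDLIST`) are defined by RFC 8216:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.0,
segment0.ts
#EXTINF:6.0,
segment1.ts
#EXTINF:6.0,
segment2.ts
#EXT-X-ENDLIST
```

Each `#EXTINF` tag announces the duration of the segment named on the following line. For a live stream, the server would omit `#EXT-X-ENDLIST` and keep appending new segments, and the client would periodically re-fetch the playlist.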
One of the advantages of this approach is that the media can be served over standard HTTP connections, including widely adopted HTTP-based content delivery networks, and can traverse firewalls and proxy servers. The client downloads the playlist file, fetches the media segment files in the order the playlist manifest dictates, and reassembles them into a continuous stream as it plays back to the user.
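The client-side half of that workflow can be sketched in a few lines. The following is a minimal, illustrative parser for a media playlist, not a complete RFC 8216 implementation (it ignores most tags and assumes segment URIs appear directly after their `#EXTINF` line); a real client would then fetch each URI over HTTP in order:

```python
def parse_media_playlist(text):
    """Return (segments, is_live): segments is a list of
    (duration_seconds, uri) pairs from an M3U8 media playlist;
    is_live is True while no #EXT-X-ENDLIST tag has been seen."""
    segments = []
    duration = None
    ended = False
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF:"):
            # "#EXTINF:<duration>,[<title>]" announces the next segment
            duration = float(line[len("#EXTINF:"):].split(",")[0])
        elif line == "#EXT-X-ENDLIST":
            ended = True  # on-demand playlist: no more segments coming
        elif line and not line.startswith("#"):
            segments.append((duration, line))  # a segment URI line
            duration = None
    return segments, not ended

playlist = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXTINF:6.0,
seg0.ts
#EXTINF:6.0,
seg1.ts
#EXT-X-ENDLIST"""

segments, is_live = parse_media_playlist(playlist)
print(segments)   # [(6.0, 'seg0.ts'), (6.0, 'seg1.ts')]
print(is_live)    # False
```

For a live playlist (one without `#EXT-X-ENDLIST`), the client would poll the playlist URL again after roughly the target duration and fetch only the segments it has not yet seen.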
Now let's have a look at Apple's HLS resources and how the protocol works in detail.