The Ultra-Low Latency video streaming roadmap: from WebRTC to CMAF — part 2

This article is the second part of a two-part series that compares different types of streaming solutions and discusses the ultra-low latency CMAF (ULL-CMAF) solution created by Livery. In this article, we cover the basics of the traditional HLS and DASH video protocols and explain how ULL-CMAF improves latency compared with them.

First, let’s talk about where video latency comes from, to set the scene for how we improved it.

What causes latency?

Latency happens for several reasons. Some of the main factors include:

  • The time it takes for the encoder to encode the video and output small files called segments
  • The time it takes for those segments to be uploaded to a CDN and propagated across its servers
  • The time it takes for the video player to download the segments

When using the HLS or DASH protocols, it takes approximately 12 seconds from the moment a frame is handed to the encoder until it is available on an end user’s device. On top of that, the player will not start playback until it has buffered several segments, to avoid a disrupted feed. The HLS specification recommends buffering three segments before starting playback, which brings the latency to around 24 seconds.
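
To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch, not Livery code. The segment duration and overhead figures are illustrative assumptions chosen to reproduce the approximate 12-second and 24-second numbers above.

```python
# Rough glass-to-glass latency estimate for segmented HLS/DASH delivery.
# All numbers are illustrative assumptions, not measurements.

def estimate_latency(segment_duration_s: float,
                     overhead_s: float,
                     buffered_segments: int) -> float:
    """Approximate glass-to-glass latency in seconds.

    A frame can wait up to one full segment duration before its segment is
    complete, then incurs encode/CDN-upload/download overhead. The player
    additionally waits for (buffered_segments - 1) more segments before it
    starts playback.
    """
    first_segment_available = segment_duration_s + overhead_s
    startup_buffer = (buffered_segments - 1) * segment_duration_s
    return first_segment_available + startup_buffer


# Assumed figures: 6-second segments and ~6 seconds of combined
# encode, CDN-propagation, and download overhead.
print(estimate_latency(6.0, 6.0, 1))  # ~12 s: first segment is playable
print(estimate_latency(6.0, 6.0, 3))  # ~24 s: playback after a 3-segment buffer
```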

The good news is that the segment duration can be shortened to reduce the overall latency. The bad news is that shorter segments increase the bitrate (and have other implications). To learn more about the technical challenges we overcame to introduce CMAF, read the full article here.
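
One reason shorter segments cost bitrate is that every segment has to start with a keyframe, and keyframes are much larger than the predicted frames between them. The sketch below is a purely illustrative estimate of that effect; the frame sizes and frame rate are assumptions, not Livery measurements.

```python
# Illustrative sketch of the segment-duration / bitrate trade-off.
# Each segment begins with a keyframe, so shorter segments mean more
# keyframes per second and a higher average bitrate at the same quality.

FPS = 30
KEYFRAME_BITS = 400_000   # assumed size of one keyframe
DELTA_BITS = 40_000       # assumed size of one predicted (delta) frame


def estimated_bitrate_bps(segment_duration_s: float) -> float:
    """Average bitrate when every segment starts with a keyframe."""
    frames_per_segment = int(segment_duration_s * FPS)
    bits_per_segment = KEYFRAME_BITS + (frames_per_segment - 1) * DELTA_BITS
    return bits_per_segment / segment_duration_s


for duration in (6.0, 2.0, 1.0):
    print(f"{duration:>4}s segments -> "
          f"{estimated_bitrate_bps(duration) / 1e6:.2f} Mbit/s")
```

With these assumed numbers, going from 6-second to 1-second segments raises the average bitrate by roughly a quarter, before counting the extra per-request and container overhead that also grows as segments get shorter.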
