Low latency is a universal aspiration in media. When a company like Wowza produces the perfect chart to explain low-latency streaming technologies, you have to take your hat off to them, and use the chart, with attribution, and some minor modifications. I present said chart as Figure 1; let’s discuss in the order designated by the highlighted numbers which I’ve added.
1. Applications for Low Latency
Just to make sure we’re all on the same page, latency in the context of live streaming means the glass-to-glass delay. The first glass is the camera lens at the actual live event; the second is the screen you’re watching on. Latency is the delay between when an image appears in front of the camera and when it shows up on your phone. Contributing to latency are factors like encoding time (at the event), transport time to the cloud, transcoding time in the cloud (to create the encoding ladder), delivery time, and critically, how many seconds of video your player buffers before starting to play.
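If it helps to picture this, here is a minimal sketch of how those stages add up to the glass-to-glass number; every value in it is an illustrative assumption, not a measurement.

```python
# Minimal sketch: glass-to-glass latency is roughly the sum of its stages.
# All numbers below are illustrative assumptions, not measured values.
latency_stages = {
    "encoding_at_event": 0.5,    # seconds to encode the contribution stream
    "transport_to_cloud": 0.5,   # first-mile delivery to the cloud
    "cloud_transcoding": 1.0,    # creating the encoding ladder
    "cdn_delivery": 1.0,         # edge delivery to the viewer
    "player_buffer": 18.0,       # e.g., three 6-second segments buffered before playback
}

glass_to_glass = sum(latency_stages.values())
print(f"Estimated glass-to-glass latency: {glass_to_glass:.1f} seconds")
```

As the player-buffer line suggests, the buffer typically dwarfs every other stage, which is why the technology discussion below focuses so heavily on segment sizes and buffering behavior.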
The top layer shows typical applications and their latency requirements. Popular applications missing from the low-latency and near-real-time categories are gambling and auction sites.
Before diving into our technology discussion, understand that the lower the latency of your live stream, the less resilient the stream is to bandwidth interruptions. For example, using default settings, an HLS stream will play through 15+ seconds of interrupted bandwidth, and if it’s restored at that point, the viewer may never know there was a problem. In contrast, a low-latency stream will stop playing almost immediately after an interruption. So, the benefit of low-latency startup time always needs to be balanced against the negative of stoppages in playback. If you don’t absolutely need low latency it may not be worth sacrificing resiliency to get it.
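The resiliency point follows from the same buffer math: a player can ride out a bandwidth interruption roughly as long as the video it already has in its buffer. Here is a simplified sketch of that trade-off; the segment durations and buffer counts are assumptions for illustration, not protocol defaults you should rely on.

```python
# Simplified model of the latency/resiliency trade-off: a player survives an
# interruption roughly as long as the video it has already buffered.
def buffered_seconds(segment_duration, segments_buffered):
    """Seconds of video sitting in the player buffer."""
    return segment_duration * segments_buffered

default_hls = buffered_seconds(segment_duration=6, segments_buffered=3)   # ~18 s
low_latency = buffered_seconds(segment_duration=2, segments_buffered=1)   # ~2 s

print(f"Default HLS can play through roughly {default_hls} s of interruption")
print(f"A low-latency tune stalls after roughly {low_latency} s")
```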
That said, it’s useful to divide latency into three categories, as follows.
Nice to have - Faster is always better, but if you’re live-streaming a Board of Education meeting or a high-school football game, you may decide that stream robustness is more important than low latency, particularly if many viewers are watching on low-bitrate connections.
Competitive advantage - In some instances, low latency provides a competitive advantage, or high latency means a competitive disadvantage. You’ll note in the chart that the typical latency for cable TV is around five seconds. If your streaming service is competing against cable, you need to be in that range, with lower latency providing a modest competitive advantage.
Real-time communications - If you’re a gambling or auction site, or your application otherwise requires real-time communications, you absolutely need to deliver low latency.
Now that we know the categories, let’s look at the most efficient way to deliver the needed level of low latency.
2/3 Nice to Have Low Latency
The number 2 shows that Apple HLS and MPEG-DASH, deployed using their default settings, result in about 30 seconds of latency. The numbers are simple: if you use ten-second segments and require three segments in the player buffer before playback starts, you’re at thirty seconds. In truth, Apple changed its recommended segment duration from ten seconds to six seconds a few years ago, and most DASH producers use 4-6 second segments, so the default number is really closer to 20 seconds.
Still, number 3, HLS Tuned and DASH Tuned, shows latency of around 6-8 seconds. In essence, tuning means changing from 10-second segments to 2-second segments, which, applying the same math, delivers the 6-8 seconds of latency. So, when latency is nice to have, you can cut it dramatically with no development time or cost, and without significantly increasing deliverability risk.
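For those who want to see the arithmetic spelled out, here is a back-of-the-envelope sketch of the segment-count math; the three-segment buffer and the segment durations are the examples discussed above, not protocol requirements.

```python
# Back-of-the-envelope startup latency: segments buffered before playback
# multiplied by segment duration. Values are the article's examples, not specs.
def startup_latency(segment_duration, segments_required=3):
    return segment_duration * segments_required

for label, duration in [("Legacy default (10 s segments)", 10),
                        ("Current Apple guidance (6 s segments)", 6),
                        ("Tuned HLS/DASH (2 s segments)", 2)]:
    print(f"{label}: ~{startup_latency(duration)} seconds before playback starts")
```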
4. Competitive Advantage