Latency is one of the toughest challenges in live delivery online. Typical delays can run upwards of a minute, which can be very annoying for viewers. Can this latency be reduced? Akamai has a solution which it says removes it entirely.
Chapter 1: Challenges of eliminating latency (0:40)
Akamai’s Shawn Michels talks about some of the challenges in reducing latency while maintaining broadcast quality.
Chapter 2: Reducing the latency from 60 seconds (1:50)
Mr. Michels says two key approaches are necessary to reduce latency. The first is to make sure the streaming system receives the live feed directly from the master control system, not a downstream feed (such as a local broadcast). The second is to make sure all subsequent systems are built to handle very high-velocity video traffic processing.
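To see where a 60-second delay comes from, it helps to remember that in segmented streaming (HLS/DASH) end-to-end latency is roughly the sum of encode time, network propagation, and the segments a player buffers before playing. The sketch below is purely illustrative; the numbers are hypothetical and are not Akamai's actual configuration.

```python
# Illustrative glass-to-glass latency estimate for segmented live streaming.
# All values are hypothetical examples, not Akamai's real parameters.

def glass_to_glass_latency(segment_seconds, player_buffer_segments,
                           encode_seconds, network_seconds):
    """Rough end-to-end latency: encode + network + buffered segments."""
    return encode_seconds + network_seconds + segment_seconds * player_buffer_segments

# A conventional setup: 10 s segments, player holds 3 segments before starting.
conventional = glass_to_glass_latency(10, 3, encode_seconds=5, network_seconds=2)

# A low-latency setup: 2 s segments, player holds only 2 segments.
low_latency = glass_to_glass_latency(2, 2, encode_seconds=2, network_seconds=1)

print(conventional)  # 37 (seconds)
print(low_latency)   # 7 (seconds)
```

The point of the sketch: shrinking segment duration and buffer depth attacks the dominant term, which is why every downstream system has to keep up with the resulting high-velocity segment flow.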
Chapter 3: Avoiding excessive buffering (3:30)
A lot of systems aren’t built from the ground up with live in mind. Mr. Michels talks about the many things Akamai is doing in its mid-tier network to ensure high-velocity processing of live streams without excessive video buffering.
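The tension here is that a shallow buffer lowers latency but runs dry more easily, causing a rebuffer (stall). The toy simulation below illustrates that trade-off; it is a minimal sketch of generic player behavior, not how Akamai's mid-tier network actually works.

```python
# Toy player-buffer simulation: each tick, the player fetches some media
# from the network and plays one second if the buffer allows it.
# Purely illustrative; not a model of Akamai's delivery network.

def count_rebuffers(seconds_fetched_per_tick, start_buffer=2.0):
    """Count stall events given per-tick download amounts (in media seconds)."""
    buffer_s = start_buffer
    rebuffers = 0
    stalled = False
    for fetched in seconds_fetched_per_tick:
        buffer_s += fetched
        if buffer_s >= 1.0:
            buffer_s -= 1.0      # play one second of media
            stalled = False
        elif not stalled:
            rebuffers += 1       # buffer ran dry: one rebuffer event
            stalled = True
    return rebuffers

# Steady network keeping pace with playback: no stalls.
print(count_rebuffers([1.0] * 10))          # 0

# Network outage longer than the 2 s buffer: one stall.
print(count_rebuffers([0.0] * 5))           # 1
```

With a deep buffer the outage above would be absorbed; with a shallow low-latency buffer it stalls, which is why the mid-tier network must sustain delivery velocity rather than rely on client-side buffering.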
Chapter 4: Does it scale? (4:40)
It’s no good reducing latency if the live streaming event doesn’t scale. Mr. Michels describes a recent multi-day event where the low-latency solution was used. He says the system delivered a total of 98M streams, with 69M streams delivered over the first 5 days. Peak concurrent usage was 1.2M streams. He says the solution excelled in three key areas for this event: 100% success in live stream ingestion, availability approaching broadcast quality, and a very low rebuffer rate (less than 1%).