
nScreenNoise – Economically achieving TV scale with live online

nScreenMedia Video Podcast

How will online live events scale to television-sized audiences? Akamai’s Jim Ewaskiew tells us how the company is co-opting client devices to share the load and helping to save service providers money in the process.

Chapter 1: Delivering scale with quality (0:40)

Mr. Ewaskiew discusses how Akamai is using alternative transport protocols such as UDP to improve quality and reduce buffering in video playback. The company is also leveraging congestion control technology to help video streams avoid congestion in the network. Together, these two approaches are delivering a 10-15% improvement in throughput along with reductions in buffering.
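As a rough illustration of why UDP delivery needs this extra machinery, here is a minimal sketch of a sender that numbers its chunks and adapts its pacing from receiver loss reports. It uses Node.js's dgram module purely as a stand-in transport; Akamai's actual protocol, packet format, and congestion-control algorithm are not public, so the ports, thresholds, and the onReceiverReport hook below are all assumptions for the example.

```typescript
// Hypothetical sketch of UDP-based chunk delivery with application-level
// loss detection and pacing. Node's "dgram" module is only a stand-in
// transport; Akamai's real protocol and congestion control are proprietary.
import { createSocket } from "dgram";

const socket = createSocket("udp4");
const RECEIVER_HOST = "127.0.0.1";   // assumed receiver for this sketch
const RECEIVER_PORT = 41234;         // assumed port

let sequence = 0;
let sendIntervalMs = 20;             // pacing between chunks; adapted below

// Prefix each chunk with a sequence number so the receiver can detect
// loss and reordering, since UDP itself gives no such guarantees.
function sendChunk(payload: Buffer): void {
  const header = Buffer.alloc(4);
  header.writeUInt32BE(sequence++, 0);
  socket.send(Buffer.concat([header, payload]), RECEIVER_PORT, RECEIVER_HOST);
}

// Crude stand-in for congestion control: back off when the receiver
// reports loss, speed up gently when the path looks clean.
// (A real receiver would feed this from its own loss statistics.)
function onReceiverReport(lossRate: number): void {
  sendIntervalMs = lossRate > 0.02 ? sendIntervalMs * 2 : Math.max(10, sendIntervalMs - 1);
}

// Re-schedule each send so changes to sendIntervalMs take effect.
function pump(): void {
  sendChunk(Buffer.from("video-chunk"));
  setTimeout(pump, sendIntervalMs);
}
pump();
```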

Chapter 2: A two-tier architecture to get to scale (1:50)

Most streaming delivery systems rely on a server delivering video directly to a single client. Akamai's new technology leverages multiple sources to deliver the video: an origin server streams the video to a client, and that client can in turn relay the video it is receiving to other clients. This two-tier architecture allows a client to pull video from many different sources.
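A minimal sketch of that fan-out, using invented class and method names rather than anything from Akamai's SDK: an origin delivers a segment to one client, and that client re-serves the same segment to the peers subscribed to it.

```typescript
// In-memory model of the two-tier idea: origin -> first-tier client -> peers.
// All names here are illustrative assumptions, not Akamai's API.

type Segment = { index: number; data: Uint8Array };

class PeerClient {
  private peers: PeerClient[] = [];
  readonly received: Segment[] = [];

  constructor(readonly id: string) {}

  // Another viewer subscribes to segments this client already holds.
  addPeer(peer: PeerClient): void {
    this.peers.push(peer);
  }

  // Receive a segment (from the origin or from another client),
  // buffer it for playback, and fan it out to subscribed peers.
  // A real system would also deduplicate segments to avoid loops.
  receiveSegment(segment: Segment): void {
    this.received.push(segment);
    for (const peer of this.peers) peer.receiveSegment(segment);
  }
}

// The origin streams to one first-tier client...
const tierOne = new PeerClient("tier-1 viewer");
// ...which in turn relays to two more viewers.
const viewerA = new PeerClient("viewer A");
const viewerB = new PeerClient("viewer B");
tierOne.addPeer(viewerA);
tierOne.addPeer(viewerB);

tierOne.receiveSegment({ index: 0, data: new Uint8Array([1, 2, 3]) });
console.log(viewerA.received.length, viewerB.received.length); // 1 1
```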

The client technology Akamai is using to stream video is called WebRTC. Most web browsers support this protocol.
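For concreteness, here is a hedged browser-side sketch of two WebRTC peers exchanging segment bytes over a data channel. Both peers live in one page so the offer/answer handshake can be shown inline; in a real deployment that exchange would go through a signaling service, and how Akamai discovers and matches peers is not described in the interview.

```typescript
// Sketch of one browser peer sending segment bytes to another over a
// WebRTC data channel. Both peers are created in the same page purely
// to keep the example self-contained.

async function connectPeers(): Promise<void> {
  const sender = new RTCPeerConnection();
  const receiver = new RTCPeerConnection();

  // With both peers in one page, ICE candidates can be handed across directly.
  sender.onicecandidate = (e) => e.candidate && receiver.addIceCandidate(e.candidate);
  receiver.onicecandidate = (e) => e.candidate && sender.addIceCandidate(e.candidate);

  // A data channel carrying raw segment bytes, not a live media track.
  const channel = sender.createDataChannel("video-segments");
  channel.onopen = () => channel.send(new Uint8Array([0xde, 0xad, 0xbe, 0xef]));

  receiver.ondatachannel = (event) => {
    event.channel.binaryType = "arraybuffer";
    event.channel.onmessage = (msg) =>
      console.log("segment bytes received:", new Uint8Array(msg.data));
  };

  // Standard WebRTC offer/answer handshake.
  const offer = await sender.createOffer();
  await sender.setLocalDescription(offer);
  await receiver.setRemoteDescription(offer);
  const answer = await receiver.createAnswer();
  await receiver.setLocalDescription(answer);
  await sender.setRemoteDescription(answer);
}

connectPeers();
```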

Chapter 3: Improving the economics of streaming (3:30)

Leveraging the clients to share the streaming load means Akamai’s network does not have to deliver as much video. Mr. Ewaskiew says that Akamai will give customers a discount on their streaming charges if they elect to use the company’s multisource solution.
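To see why that offload can translate into a discount, a back-of-the-envelope calculation helps. Every figure below is invented for illustration; the interview gives no offload rates or pricing.

```typescript
// Back-of-the-envelope illustration only; all numbers are assumptions.
const viewers = 1_000_000;        // assumed concurrent live audience
const bitrateMbps = 5;            // assumed per-viewer stream bitrate
const peerOffloadShare = 0.4;     // assumed share of bytes served peer-to-peer

const totalGbps = (viewers * bitrateMbps) / 1000;    // 5,000 Gbps of total delivery
const cdnGbps = totalGbps * (1 - peerOffloadShare);  // 3,000 Gbps from the CDN

console.log(`Total delivery: ${totalGbps} Gbps`);
console.log(`Served by CDN:  ${cdnGbps} Gbps`);
// The 2,000 Gbps difference is capacity the CDN never has to serve, which is
// what leaves room for a discount on multisource customers' bills.
```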

This browser-based solution is already in beta and should be in full production in time for the 2018 Olympics.


(3) Comments

  1. Colin: Good Akamai interview on multi-source delivery. This is an old idea, of course, but now using WebRTC. The question is, what is the quality sacrifice when I get my stream from other (many, chained?) unreliable clients (intermittent, congested, turning on/off…)? It seems a nightmare to get consistent high quality (4K rates) over many hours. Just wondering if you delved into this. I like the QUIC technology from Google; Chrome uses it every day. Thanks, Al (Santa Clara)

    • You’re right, Al. The protocol is multi-sourced, not point-to-point. So, I think a client actually gets video packets from multiple other clients and assembles them into the stream. In other words, a single client isn’t streaming the video to another client; multiple clients are each sending pieces of the stream to that client. It sounds messy but works pretty well, and it works better the more clients join the live stream, a very desirable characteristic when you’re trying to get to TV scale. It has the added advantage of being quite resilient: when one client drops out, another one is quickly added to fill the gap.

  2. Colin, thanks. The multi-sender technique has interesting trade-offs. The receiver needs more buffering to account for the many ways things can go wrong. Plus, each receiver needs to maintain connections to many sources and may retrieve data it ends up discarding, which is a bandwidth overhead penalty. I do wonder what the buffer latency increase is for this method. Testing it seems non-trivial, since there are so many different ways for the data to be blocked or slowed, and only receiver buffering can hide loss, especially under worst-case conditions. Al
