Last week, Beam launched the first public iteration of our fully HTML5 Light Player. It’s an industry first, and it embodies a significant amount of technical innovation, allowing us to further customize and enhance the viewing experience.
This platform will enable us to integrate video streams more deeply with other components such as chat, polls, and other secrets, in ways that would have been very difficult with Flash. We’ll be able to continue reducing stream delay while using even fewer resources.
A Tale of Two Protocols
Since the dawn of computing, two protocols have ruled the live streaming world: RTMP and HLS.
Before being acquired by Adobe, Macromedia developed RTMP as a proprietary protocol for streaming audio and video over “raw” sockets. In recent years, Adobe has released an incomplete specification that has prompted wider adoption of the protocol. RTMP is what was used to serve video on Beam up until now.
HLS is a more modern release from Apple and is the format used by Twitch and many of our competitors. It works by sending the stream as a sequence of HTTP downloads, similar to how you would download a regular video. If you open your developer tools (F12 on most browsers), you can watch it downloading chunks of video.
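To make that chunking concrete, here is a minimal sketch (in TypeScript, using a hypothetical playlist URL) of what an HLS player’s network loop boils down to: download a playlist, then download each media segment it lists over plain HTTP. Real players layer buffering, retries, and adaptive bitrate selection on top of this.

```typescript
// Minimal sketch of the HLS download loop, assuming a hypothetical playlist URL.
const PLAYLIST_URL = "https://example.com/live/stream.m3u8"; // hypothetical

async function pollHlsSegments(): Promise<void> {
  // 1. Download the playlist, which lists the most recent chunks of video.
  const playlist = await (await fetch(PLAYLIST_URL)).text();

  // 2. Every non-comment line in the playlist is the URI of a media segment.
  const segmentUris = playlist
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.length > 0 && !line.startsWith("#"));

  // 3. Download each chunk over plain HTTP -- these are the requests you see
  //    piling up in the network tab of your developer tools.
  for (const uri of segmentUris) {
    const segment = await fetch(new URL(uri, PLAYLIST_URL));
    const bytes = await segment.arrayBuffer();
    console.log(`fetched ${uri}: ${bytes.byteLength} bytes`);
  }
}

pollHlsSegments().catch(console.error);
```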
Neither of these protocols is without faults. RTMP allows for a short delay, but its proprietary nature, combined with the need for low-level socket work, means that browser plugins like Flash are required to play it. HLS doesn’t suffer from these restrictions. It’s fairly easy to make a native player that runs HLS, or its close cousin, MPEG-DASH, but chunking video and serving it over HTTP introduces an inherent delay and increases bandwidth usage.
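For contrast, here is a rough sketch of just the opening RTMP handshake, assuming a hypothetical ingest host and omitting the chunk stream protocol that follows. Even this first step means writing raw bytes to a TCP socket, which is exactly the kind of low-level access browsers only expose through plugins like Flash.

```typescript
import { connect } from "node:net";
import { randomBytes } from "node:crypto";

// Rough sketch of the RTMP "simple" handshake over a raw TCP socket.
// The host is a hypothetical ingest endpoint, not a real Beam server.
const socket = connect(1935, "ingest.example.com", () => {
  // C0: a single protocol version byte.
  socket.write(Buffer.from([0x03]));

  // C1: 4-byte timestamp, 4 zero bytes, then 1528 bytes of random filler.
  const c1 = Buffer.concat([Buffer.alloc(8), randomBytes(1528)]);
  socket.write(c1);
});

let received = Buffer.alloc(0);
socket.on("data", (chunk) => {
  received = Buffer.concat([received, chunk]);
  // Once S0 + S1 (1 + 1536 bytes) have arrived, echo S1 back as C2.
  if (received.length >= 1 + 1536) {
    const s1 = received.subarray(1, 1 + 1536);
    socket.write(s1);
    console.log("Handshake sent; RTMP chunk streams would follow from here.");
    socket.end();
  }
});
```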
Enter Light
Light currently consists of two parts: the Light Player (what you see in your browser) and Light Dist (a server that packages video to ship to you). It’s made possible through a number of new technologies, namely Media Source Extensions and binary WebSockets.
The Light protocol works by streaming raw MPEG atoms down a binary WebSocket. The player recognizes those atoms and appends them to a buffer that can be used to play video. This method allows us to combine the advantages of HLS (namely native browser support) with lower latency than RTMP. We’ll have blog posts over the following weeks discussing some of the exciting technology and networking that makes this happen, and where we’re going from here.
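As a rough illustration of the player side (with a hypothetical WebSocket endpoint and codec string, not Beam’s actual player code), the browser half of that loop looks something like this: atoms arrive over a binary WebSocket and are appended straight into a Media Source Extensions buffer that the video element plays from.

```typescript
// Minimal sketch: feeding a <video> element from a binary WebSocket via
// Media Source Extensions. Endpoint and codec string are assumptions.
const video = document.querySelector("video") as HTMLVideoElement;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  // The codec string here is an assumption for illustration.
  const buffer = mediaSource.addSourceBuffer(
    'video/mp4; codecs="avc1.42E01E, mp4a.40.2"'
  );
  const queue: ArrayBuffer[] = [];

  const socket = new WebSocket("wss://example.com/light"); // hypothetical URL
  socket.binaryType = "arraybuffer";

  // Each message carries a chunk of MPEG atoms; queue it if an append is
  // still in progress.
  socket.onmessage = (event: MessageEvent<ArrayBuffer>) => {
    if (buffer.updating || queue.length > 0) {
      queue.push(event.data);
    } else {
      buffer.appendBuffer(event.data);
    }
  };

  // Drain the queue once the previous append finishes.
  buffer.addEventListener("updateend", () => {
    if (!buffer.updating && queue.length > 0) {
      buffer.appendBuffer(queue.shift()!);
    }
  });
});
```

Because the video is pushed continuously over one socket rather than pulled as discrete HTTP requests, there is no per-chunk round trip or playlist polling to pad out the delay.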
Light is yet young, but here are some cool stats about our player and server:
Player:
- Browser latency: 120 ms
- CPU time: 2.8%
- Average loading time: 0.9 seconds
Server:
- Additional latency: 0.30 ms (ffmpeg currently adds extra ‘external’ delay, but we’re removing our reliance on it)
- RAM usage per 10,000 viewers: 1.6 GB
- Supported concurrent viewers: 50,000 per CPU core per server
From Here…
Most significantly, we now have 100% control of video from the time it leaves the streamer’s computer to the time it plays in the browser. Expect to see rapid development of the player and interface, reduced delay, and tighter integration of the video stream with Beam site features (hint, hint, co-streaming).
We’d love to hear from you regarding any issues or suggestions you may have; just tweet to us at @WatchBeam. You’re also welcome to join us here over the coming weeks for blog posts going into greater depth about the technology that makes Light happen.