Let's talk about HTML5 live streaming. There are two competing technologies for it: DASH and HLS. They are terrible.

When both were implemented, there already existed several established, simple, standardized, and broadly supported options for live streaming video. Browsers wanted none of it.

Instead, Apple came up with HLS. It splits your video and audio into separate streams, cuts them up into N-second chunks (usually 5-30 seconds), then uses JavaScript to poll a playlist file for the list of upcoming segments and munge them into a non-live <video> and <audio> element in real time.
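To make the absurdity concrete: the client-side loop is "fetch the playlist, figure out which segments are new, download them, repeat forever." A rough Python sketch of that polling logic (tag handling heavily simplified, playlist contents invented):

```python
# Sketch of what an HLS client does behind the scenes: poll the media
# playlist, diff it against segments we've already seen, queue the rest.

def parse_media_playlist(text):
    """Return (first_sequence_number, [segment URIs]) from an m3u8 playlist."""
    media_sequence = 0
    segments = []
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-MEDIA-SEQUENCE:"):
            media_sequence = int(line.split(":", 1)[1])
        elif line and not line.startswith("#"):
            segments.append(line)  # non-tag lines are segment URIs
    return media_sequence, segments

def new_segments(seen, playlist_text):
    """Yield (sequence_number, uri) pairs not yet in `seen`."""
    first_seq, uris = parse_media_playlist(playlist_text)
    for offset, uri in enumerate(uris):
        seq = first_seq + offset
        if seq not in seen:
            seen.add(seq)
            yield seq, uri

# An invented media playlist, as a server might return it mid-stream:
PLAYLIST = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:1042
#EXTINF:6.0,
seg1042.ts
#EXTINF:6.0,
seg1043.ts
#EXTINF:6.0,
seg1044.ts
"""

seen = set()
for seq, uri in new_segments(seen, PLAYLIST):
    print(seq, uri)
```

A real client does this every few seconds, for audio and video separately, while also downloading segments and feeding them into the decoder. Hence the reliability and synchronization problems.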

This garbage is patent encumbered.

Therefore, Google made basically the exact same thing except they used XML, fucking XML, for the manifest.
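For a taste of the XML: a minimal live DASH manifest (MPD) looks roughly like this. Everything below is invented for illustration (IDs, durations, URLs) and is not a complete, valid manifest:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011"
     type="dynamic"
     profiles="urn:mpeg:dash:profile:isoff-live:2011"
     availabilityStartTime="1970-01-01T00:00:00Z"
     minimumUpdatePeriod="PT6S"
     minBufferTime="PT2S">
  <Period id="1" start="PT0S">
    <AdaptationSet mimeType="video/mp4" segmentAlignment="true">
      <SegmentTemplate media="chunk-$Number$.m4s"
                       initialization="init.m4s"
                       duration="6" startNumber="1042"/>
      <Representation id="720p" codecs="avc1.64001f"
                      width="1280" height="720" bandwidth="3000000"/>
    </AdaptationSet>
  </Period>
</MPD>
```

Same segment-polling design as HLS, just with an XML schema where HLS has a text playlist.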

This is what's happening behind the scenes of all live video you watch on the internet. This is also in no small part responsible for livestreaming video murdering any low-powered devices you try to play it on. Because JavaScript is king and standards are for losers.

Just imagine, for a brief moment, the sheer scale of the reliability and synchronization problems inherent in this design.

@sir Well HTML is dead as a standard since they made it into a "Living Standard".

@lanodan @sir my favourite thing about the html5 living standard is that code written against it is not forward compatible!

@Wolf480pl @sir More like necromancy (as in dead puppeteer) but I guess it depends on your definition of zombie.
@sir We actually had use for the caching/CDN properties of it all.

@sir You missed the reason why they exist - because web people wanted to use their existing caching infrastructure for streaming.
Guess what - they don't use files at all anymore and keep all chunks in memory, since otherwise the latency is too great. Something that could have easily been done with existing UDP streaming setups long ago - just cache GOPs.
So now the reason why streaming sucks is purely historical.
Oh, and here's a depressing bit of news: Twitch are planning to replace RTMP for user->server streaming with WebRTC. Something that's not XML- or m3u-based, but is just as bad. Not something rational like RIST.

@lynne @sir
How does beam^Wmixer's FTL protocol compare to this mess?

@Wolf480pl @sir FTL is weird. First, on the streamer side: they've replaced RTMP with a minimal container that does not support reordering or DTS, hence the "Mixer doesn't support bframes. So force them off." comment in the OBS source. That's a lot of compression gains wasted for a negligible decrease in latency. And they hardcode H264, even though one of the main reasons to move away from RTMP was to support non-H264 codecs. However, they also hardcode Opus instead of AAC, which is nice. So it's weird, and I don't like it: no specs, hardcoded codecs, and it doesn't give you a few bytes to put a DTS in to support frame reordering (they'll say that's a feature).
I don't believe there is a client-side FTL protocol, just streamer-side. For the client side they probably do the same thing Twitch does and stream HLS with Twitch's extension for low latency.

@lynne @sir
>stream HLS with Twitch's extension for low latency

That doesn't make sense: beam was the first to have low latency, twitch only caught up later. So either the extension you're talking about isn't Twitch's, or beam's advantage came from neither the streamer-side nor the viewer-side protocol, but from their CDN...

@Wolf480pl @sir I called it the Twitch HLS extension because their name is literally on it: "#EXT-X-TWITCH-PREFETCH", not because they were the first.
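(For reference, the extension just appends upcoming, not-yet-complete segments to the end of the media playlist as extra tags, so the player can request them early. Roughly, with segment names invented:

```
#EXTINF:2.000,
seg1044.ts
#EXT-X-TWITCH-PREFETCH:seg1045.ts
#EXT-X-TWITCH-PREFETCH:seg1046.ts
```
)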

@lynne @sir
So I guess beam's initial advantage must've been in their CDN?

@Wolf480pl @sir Correction: it's actually WebRTC with some extensions.
How does it compare? It's WebRTC.
I would have noticed if any streamer I checked had been using it, but everyone I looked at today was serving typical HLS traffic, and I hadn't spotted the huge library included as an OBS git submodule.

@sir Yeah, unfortunately I'm forced to use HLS for streaming because my upload is very unstable and HLS buffers are long... long... oh so so long.

@sir @tindall not that it makes it that much better but most browsers have native support for the HLS m3u8 playlists now, so there’s no need for the JavaScript munging.

But what makes it worse again is how adaptive-bitrate playlists were implemented.
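(What that looks like in practice: a second layer of playlist, the "master" playlist, listing one media playlist per quality so the player can hop between them mid-stream. All URIs and numbers below are invented:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=3000000,RESOLUTION=1280x720,CODECS="avc1.64001f,mp4a.40.2"
720p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1000000,RESOLUTION=854x480,CODECS="avc1.64001e,mp4a.40.2"
480p/index.m3u8
```
)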

@sir i’m giving up on anything being sane in current year. i guess i should pursue NDN and other research tech and build stuff on that: fuck interop with this insanity tbfh

@xj9 @sir I think you can still do sane things but just… stay as far as possible from the web, which isn't really something new.

@lanodan @sir word. i’m trying to move away from web tech professionally. i used to be a pretty big web enthusiast, but the more i’ve studied CS and tried to git gud at computer the less i care for it.

@sir There is a niche third option:
Continuous data stream. It's been around since the 90s.
Icecast implements it and currently officially supports WebM (VP8 or VP9; Vorbis or Opus) and Ogg (Theora + Vorbis).
Most browsers should actually be able to play those if pointed to in a <video> element.
No JS needed. Admittedly has some downsides, like not dealing gracefully with connectivity changes or bandwidth fluctuation.
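For illustration, playing such a stream can be as simple as (mountpoint URL invented):

```html
<video controls>
  <source src="https://icecast.example.org/stream.webm" type="video/webm">
</video>
```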

disclosure: I work on Icecast

@sir @jalcine RTSP may have been the nicer design on paper, but in practice, I’ve found HLS to be far more reliable. This probably comes down to fast, cacheable and non-firewalled HTTP being broadly available.

I wouldn’t blame the browser folks for this one. _Maybe_ admins/IT, who have a long-standing “everything that isn’t port 80/443 is poorly understood by me and therefore bad” culture that has ruined many protocols.

HLS and DASH are pragmatic.
