- From: Benjamin Gruenbaum <notifications@github.com>
- Date: Thu, 23 Jun 2022 05:06:58 -0700
- To: whatwg/streams <streams@noreply.github.com>
- Cc: Subscribed <subscribed@noreply.github.com>
- Message-ID: <whatwg/streams/issues/1235/1164325839@github.com>
> If it's true that this is a server-focused problem and doesn't matter for browsers, and that solving it requires such a large and complicated bunch of machinery, then I am not personally optimistic about finding much engagement from the browser engineers you are pinging.

First of all, thanks for looking into this. I've personally run into this a bunch when working on video in browsers. That was before ReadableStream was common enough to use (so we had to roll our own), but code that touches video in browsers can definitely run into this sort of issue. I actually haven't used ReadableStream enough on servers to ever hit it, and performance-sensitive server code often can't tee at all for performance reasons. On the client, though, the amount of backpressure determines buffering/downloads, which is really important for video.

For a concrete (and rather common) example, let's look at HLS/DASH playback (as on YouTube, Netflix, or most video sites). Most of the code doing this doesn't use ReadableStream (yet), but this case is common:

- I have a server giving me video. How much data I buffer is very important, because it balances quality of service for users, the amount of data downloaded (especially on metered connections), and processing.
- I have client code that takes the HLS/DASH video and demuxes it (since most browsers don't natively play HLS), changing the container format from .ts to fragmented mp4. Using Media Source Extensions, it then pushes the video fragments to the video tag. Two examples of popular libraries that do this for HLS/DASH are hls.js and Shaka Player. One can think of this as a streaming transformation, and the user actually consuming the video should drive backpressure and buffering.
- I also want to look at the stream for analytics (how many frames, time watched, etc.).
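A minimal sketch of this scenario, assuming a WHATWG Streams implementation (global in modern browsers and in Node.js 18+). The names here are illustrative only, and the `remux` transform is a trivial stand-in for a real .ts-to-fmp4 demuxer like the ones in hls.js or Shaka Player:

```javascript
// Sketch: tee a network stream into a slow playback branch and a
// fast analytics branch. Because tee() couples both branches to one
// source, the fast branch alone drives how much is pulled.
async function demo() {
  let pulls = 0;
  const network = new ReadableStream({
    pull(controller) {
      pulls++; // each pull models one segment download
      controller.enqueue(`ts-${pulls}`);
      if (pulls === 10) controller.close();
    },
  }, { highWaterMark: 1 });

  const [playback, analytics] = network.tee();

  // Analytics branch: cheap parsing, so it drains immediately.
  const fast = analytics.getReader();
  let framesSeen = 0;
  while (!(await fast.read()).done) framesSeen++;

  // At this point the fast branch alone has pulled all 10 segments
  // off the network: the slow playback branch's backpressure never
  // reached the source, and every segment sits in its internal queue.
  console.log(pulls); // 10

  // Playback branch: the streaming transformation (.ts -> fmp4)
  // whose "real" pace should be the user watching the video.
  const remux = new TransformStream({
    transform(seg, controller) {
      controller.enqueue(seg.replace("ts", "fmp4"));
    },
  });
  const appended = [];
  await playback.pipeThrough(remux).pipeTo(new WritableStream({
    write(frag) { appended.push(frag); }, // stand-in for MSE appendBuffer()
  }));

  return { pulls, framesSeen, appended };
}
```

Without the tee, `pipeThrough`/`pipeTo` would propagate the sink's pace all the way back to the network source; with it, the analytics branch sets the download rate regardless of playback.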
In this case, for ReadableStream to be viable, backpressure would need to work. Buffering data (which may be thrown away if the rendition changes) just because the analytics parsing is fast wouldn't work in terms of quality of service, since the "real" pace is the user watching the video plus the cost of demuxing it.

There is a good picture illustrating Chrome's buffering algorithm for mp4 (which again predates ReadableStream) - I really like that picture and use it when giving talks about watermarks and backpressure:

![image](https://user-images.githubusercontent.com/1315533/175294498-62aa8612-382c-4b73-a2eb-d0a081ad82a8.png)

(From https://www.chromium.org/audio-video/#how-the-does-buffering-work )

-----

I am not arguing this is a bigger issue on the client, or even that it's a very common one - but it's one I personally ran into when working on video-related tasks in browsers, it's one browsers have to solve internally, and I suspect it's slowing adoption of ReadableStream in that space (and I believe video is one of the most important use cases for ReadableStream).

--
Reply to this email directly or view it on GitHub:
https://github.com/whatwg/streams/issues/1235#issuecomment-1164325839
Received on Thursday, 23 June 2022 12:07:11 UTC