- From: Martin Lasak <notifications@github.com>
- Date: Fri, 14 May 2021 01:45:39 -0700
- To: whatwg/streams <streams@noreply.github.com>
- Cc: Subscribed <subscribed@noreply.github.com>
- Message-ID: <whatwg/streams/issues/1126/841106318@github.com>
@ricea that was the motivation for raising this issue. On the client side it should be possible to detect whether a transmission is ongoing or idle. My hope (and recommendation/suggestion) is that this missing piece, in the form of an event (see below), will be added to the Streams spec.

@MattiasBuelens Thank you for confirming the current (not very satisfying) situation. Our use case is exactly the one you mentioned ;) The group I'm working for is the current maintainer of dash.js. My task is to validate and improve the throughput calculation in low-latency streaming, because we see that current implementations have issues. I know the paper you mentioned very well; interesting work. However, that approach fails at throughput estimation in my simple code example above, since all chunks are of equal size. This is the case when an encoder produces chunks at equidistant times, which in my opinion is very likely in ULL. Moreover, the authors of the paper themselves point out the missing piece in the Fetch API. The authors actually write

> with the standard HTTP protocol we have no means to determine the value for b

but in fact they mean the Fetch API/Streams API, since the HTTP/1.1 standard does specify the size of the chunk to be sent [1]. So why not fix this Streams spec issue to allow for simple and exact measurement in the future? An event announcing the start of chunk transmission would reduce the problem back to the simple formula _transferred_bits/transmission_duration_. Interestingly, this event has existed in Node.js for a while; it is called ```readable``` [2] and offers exactly this missing piece (note: I've changed the example above to use this event name instead).
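To make the formula concrete, here is a minimal sketch of the computation it implies. The helper name `throughputBps` and the per-chunk sample shape `{ bytes, durationMs }` are my own invention for illustration; `durationMs` stands for the time between a hypothetical "transmission started" signal and the chunk's arrival.

```javascript
// Sketch of throughput = transferred_bits / transmission_duration,
// assuming each sample is { bytes, durationMs } where durationMs covers
// only actual transmission time (not sender idle time).
function throughputBps(samples) {
  const totalBits = samples.reduce((sum, s) => sum + s.bytes * 8, 0);
  const totalSeconds = samples.reduce((sum, s) => sum + s.durationMs, 0) / 1000;
  return totalSeconds > 0 ? totalBits / totalSeconds : 0;
}

// Two 1 KiB chunks, each transmitted in 10 ms:
console.log(throughputBps([
  { bytes: 1024, durationMs: 10 },
  { bytes: 1024, durationMs: 10 },
])); // → 819200 bits/s
```

Without the start-of-transmission signal, `durationMs` can only be approximated by inter-arrival time, which silently includes the encoder's idle gaps.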
Here is a Node.js example consumer that allows for exact throughput measurement, without the need for any sophisticated calculation or prediction (make sure the sender from the earlier example is running: https://github.com/whatwg/streams/issues/1126#issuecomment-839151920):

```javascript
const http = require('http');

let timeMark = Date.now();
let chunkCount = 0;

http.get('http://localhost:3000/data', (res) => {
  // https://nodejs.org/dist/latest-v14.x/docs/api/stream.html#stream_event_readable
  res.on('readable', () => {
    console.log(`readable`);
    timeMark = Date.now();
    res.read();
  });
  res.on('data', (chunk) => {
    console.log(`got ${++chunkCount}. chunk with ${chunk.length} bytes, in ${Date.now() - timeMark} ms`);
  });
  res.on('end', () => {
    console.log('got all chunks');
  });
}).on('error', (e) => {
  console.error(`Got error: ${e.message}`);
});
```

The result is what we expect and desire:

```
readable
got 1. chunk with 1024 bytes, in 0 ms
readable
got 2. chunk with 1024 bytes, in 0 ms
...
readable
got 80. chunk with 1024 bytes, in 1 ms
...
readable
got 99. chunk with 1024 bytes, in 0 ms
readable
got 100. chunk with 1024 bytes, in 0 ms
readable
got all chunks
```

Making something like this available in Web browsers would have huge benefits. Wdyt?

[1] https://datatracker.ietf.org/doc/html/rfc7230#section-4.1
[2] https://nodejs.org/dist/latest-v14.x/docs/api/stream.html#stream_event_readable

--
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/whatwg/streams/issues/1126#issuecomment-841106318
Received on Friday, 14 May 2021 08:45:52 UTC