- From: Gregory Terzian <notifications@github.com>
- Date: Fri, 29 Nov 2019 05:38:29 -0800
- To: whatwg/fetch <fetch@noreply.github.com>
- Cc: Subscribed <subscribed@noreply.github.com>
- Message-ID: <whatwg/fetch/issues/976@github.com>
I have a question regarding the use of streams in fetch, specifically in [`transmit the body`](https://fetch.spec.whatwg.org/#concept-request-transmit-body) of a request.

As far as I can tell, whenever "transmit the body" is called we are in "in parallel" steps of the spec, and so we shouldn't be directly interacting with JavaScript objects, as described in https://html.spec.whatwg.org/multipage/#event-loop-for-spec-authors:in-parallel.

So I'm wondering how, when transmitting the body of a request from parallel steps, the spec can then say to:

1. [Get a reader for the stream](https://fetch.spec.whatwg.org/#concept-get-reader).
2. Start [reading chunks](https://fetch.spec.whatwg.org/#concept-read-chunk-from-readablestream) from it until it is done or errored.

Wouldn't this require interacting with the corresponding JavaScript stream, which could consist of an arbitrary number of other streams piped/transformed into it? For example, say the JS wants to upload some data to a server, and that data is the end result of a pipeline of streams (as sketched below). When "transmitting the body" of the request, we would effectively have to pull chunks from that pipeline, and therefore run the corresponding JavaScript on an event loop, correct?

Would it perhaps be more correct to queue a task on the event loop which would start reading chunks from the stream, and for each chunk queue "in parallel" steps back on fetch that would transmit those chunks over the network (roughly as in the second sketch below)?
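For concreteness, here is a minimal sketch of the kind of pipeline I have in mind. The URL and the pass-through transform are made up for illustration, and it assumes an implementation that accepts a `ReadableStream` as a request body:

```js
// Illustrative only: the URL and the pass-through transform are made up,
// and this assumes an implementation that accepts a ReadableStream as a
// request body.
const source = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode("some data to upload"));
    controller.close();
  },
});

// An author-supplied transform in the middle of the pipeline; pulling a
// chunk through it means running this JavaScript on the event loop.
const passThrough = new TransformStream({
  transform(chunk, controller) {
    // imagine some per-chunk processing here
    controller.enqueue(chunk);
  },
});

const body = source.pipeThrough(passThrough);

// "Transmitting the body" of this request means pulling chunks out of
// `body`, which in turn drives the transform() callback above.
fetch("https://example.com/upload", { method: "POST", body });
```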
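And roughly what I mean by the alternative, with the reading done by a task queued on the event loop and each chunk handed back to parallel steps. `transmitChunkInParallel` here is only a placeholder for "queue in-parallel fetch steps that send these bytes over the network":

```js
// Hypothetical sketch of the alternative: read on the event loop,
// transmit in parallel. `transmitChunkInParallel` is a placeholder for
// whatever mechanism hands the bytes back to fetch's parallel steps.
async function readBodyOnEventLoop(body, transmitChunkInParallel) {
  const reader = body.getReader();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;                 // stream finished; signal end of body
    transmitChunkInParallel(value);  // hand the chunk off to parallel steps
  }
}
```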
Received on Friday, 29 November 2019 13:38:31 UTC