Re: [whatwg/fetch] Odd format for fetch callbacks (#536)

> It's useful that fetch can be invoked from the main thread as a bunch of state 

Yes, and I think we can rely on the distinction between calling the [`fetch()` method](https://fetch.spec.whatwg.org/#dom-global-fetch) and invoking the [fetch algorithm](https://fetch.spec.whatwg.org/#concept-fetch).

In fact, the DOM `fetch()` method already invokes the fetch algorithm from "in parallel" steps (see step 9).
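
Roughly this shape, as a sketch only (every name here — `internalFetch`, `inParallel`, `SketchResponse`, `domFetch` — is invented for illustration and is not spec text):

```ts
interface SketchResponse {
  status: number;
}

// Stand-in for the internal fetch algorithm
// (https://fetch.spec.whatwg.org/#concept-fetch).
async function internalFetch(url: string): Promise<SketchResponse> {
  return { status: 200 }; // placeholder for the actual networking
}

// Stand-in for "run these steps in parallel"; real parallel steps run
// off the event loop entirely, this just gets them off the current task.
function inParallel(steps: () => void): void {
  setTimeout(steps, 0);
}

// Rough shape of the fetch() method
// (https://fetch.spec.whatwg.org/#dom-global-fetch).
function domFetch(url: string): Promise<SketchResponse> {
  return new Promise((resolve, reject) => {
    // ...steps 1–8: build the request, handle aborts, etc.
    // Step 9: hand the request to the fetch algorithm in parallel.
    inParallel(() => {
      internalFetch(url).then(resolve, reject);
    });
  });
}
```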

Similarly, the background fetch spec defines another DOM method that eventually calls into fetch from parallel steps, after having set up some data "linking" the parallel steps with state back on the main thread, for example via a "background fetch record".

I think it's very similar to how the Service Worker spec manipulates certain data, like a "service worker", from the parallel job queue steps and then ties it back to the `ServiceWorker` DOM object by queuing tasks and so on.
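
A minimal sketch of that tie-back pattern, with invented names (this is not what either spec literally says): parallel steps only touch a plain record, and the DOM-facing object is only updated from a queued task on its event loop.

```ts
interface ServiceWorkerRecordSketch {
  state: "installing" | "installed" | "activated";
}

class ServiceWorkerSketch {
  state: ServiceWorkerRecordSketch["state"] = "installing";
  onstatechange: (() => void) | null = null;
}

// Stand-in for "queue a task on the relevant event loop".
const queueTask = (task: () => void) => setTimeout(task, 0);

// Called from the parallel job queue steps.
function updateState(
  record: ServiceWorkerRecordSketch,
  domObject: ServiceWorkerSketch,
  newState: ServiceWorkerRecordSketch["state"],
): void {
  record.state = newState; // parallel-side data
  queueTask(() => {
    domObject.state = record.state; // DOM object only touched from a task
    domObject.onstatechange?.();
  });
}
```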

> If we want to support full duplex the model where you go in parallel, fetch, and then wait for the response, the body, and the trailers, doesn't work, as you wouldn't be able to send the request body and request trailers at the same time.

I think it could work based on the above, since the DOM `Request` and the body stream live on an event loop, whereas parallel steps could call into fetch "blockingly": the parallel steps block, but the event loop does not. I think this is essentially what the background fetch spec does (I hope I'm understanding correctly what you mean by a "full duplex" model).
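
To show the idea, a sketch of parallel steps that upload the body and wait for the response head concurrently; `sendChunk` and `receiveHead` are invented stand-ins for the actual networking, not anything from the spec:

```ts
async function sendChunk(_chunk: Uint8Array): Promise<void> {
  /* network write would go here */
}

async function receiveHead(): Promise<{ status: number }> {
  return { status: 200 }; // placeholder: the head may arrive before the upload ends
}

// Imagine this running as the parallel steps of a fetch.
async function parallelFetchSteps(body: ReadableStream<Uint8Array>): Promise<void> {
  const reader = body.getReader();

  const upload = (async () => {
    for (;;) {
      // Chunks are produced by the event loop that owns the Request...
      const { value, done } = await reader.read();
      if (done || value === undefined) break;
      await sendChunk(value);
    }
  })();

  // ...while these steps also wait for the response head; only the
  // parallel steps "block", never the event loop.
  await Promise.all([upload, receiveHead()]);
}
```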

> the reason to use a parallel queue I guess is to ensure that you do not process the response body before the response headers? It might work I suppose, but it would have to be flushed out a bit to ensure the right information is passed around to be able to queue tasks from there and such. (Tasks need a bit more information these days.)

I was mostly thinking that the queue is a way to make clear that those callbacks are asynchronous from the perspective of the fetch; in a way it's about fetch passing a message to another algorithm. Currently that's expressed by queuing a task, but if we expressed it by enqueuing steps on a parallel queue, it would be a better fit for the background fetch spec.
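
Something like this, where `ParallelQueueSketch` is just an invented FIFO whose steps run one after another (so "process response" can't be overtaken by "process response body"), not the spec's actual parallel queue machinery:

```ts
class ParallelQueueSketch {
  private chain: Promise<void> = Promise.resolve();
  enqueueSteps(steps: () => void | Promise<void>): void {
    this.chain = this.chain.then(steps);
  }
}

type ProcessResponse = (status: number) => void;

// fetch "passing a message" to the caller's algorithm: instead of
// queueing a task on an event loop, enqueue steps on the caller's
// parallel queue.
function reportResponse(
  queue: ParallelQueueSketch,
  processResponse: ProcessResponse,
  status: number,
): void {
  queue.enqueueSteps(() => processResponse(status));
}
```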

> ensure the right information is passed around to be able to queue tasks from there and such. (Tasks need a bit more information these days.)

I think this specifically can be addressed, as in background fetch, by having the "other parallel steps" make some "non-DOM" data available to callbacks like `process response`, which is then used to queue the appropriate tasks. For example, see https://wicg.github.io/background-fetch/#update-background-fetch-instances, which is called from within the `process response` callback defined by https://wicg.github.io/background-fetch/#complete-a-record (which runs "in parallel").
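
The shape I have in mind, again with invented names rather than the spec's: the parallel steps that set up the fetch hand a plain record to the callback, and the callback uses it to queue whatever tasks are appropriate, much as "update background fetch instances" does for each registration.

```ts
interface RecordSketch {
  downloaded: number;
  instancesToUpdate: Array<{ onprogress: (() => void) | null }>;
}

const queueATask = (task: () => void) => setTimeout(task, 0);

// The callback handed to fetch ("process response body"-style), created
// by the same parallel steps that created the record.
function makeProcessCallback(record: RecordSketch) {
  return (bytes: number): void => {
    record.downloaded += bytes; // non-DOM data, safe to touch here
    for (const instance of record.instancesToUpdate) {
      queueATask(() => instance.onprogress?.()); // DOM side only via tasks
    }
  };
}
```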

https://github.com/whatwg/fetch/issues/536#issuecomment-640447517

Received on Monday, 8 June 2020 08:21:41 UTC