- From: Jimmy Wärting <notifications@github.com>
- Date: Sat, 09 Sep 2023 07:19:58 -0700
- To: whatwg/fetch <fetch@noreply.github.com>
- Cc: Subscribed <subscribed@noreply.github.com>
- Message-ID: <whatwg/fetch/issues/1524/1712522651@github.com>
Another use case could be a partial download of something that is encoded and also supports range requests.
Say I want to download something really large, where `content-encoding` and `content-length` are provided along with an `accept-ranges` response header.
I initiate a request:
```js
const response = await fetch(url, {
  method: 'GET',
  raw: true,
  headers: {
    'accept-encoding': 'gzip, deflate'
  }
})
```
From then on I will know:
- how much data needs to be downloaded over the network overall (thanks to `content-length`)
- whether or not the server supports range requests
- exactly how much raw data I have downloaded thus far from the network
  - this way I can give the user a progress monitor of how much has actually been downloaded; the decompressed data does not reflect that, since it can exceed the `content-length`

But I will not know what the actual data is unless I pipe it through a `new DecompressionStream('gzip')` (or `'deflate'`, depending on the `content-encoding`):
```js
const progress = document.querySelector('progress')
const chunks = [] // ultra simple store

for await (const rawChunk of response.body) {
  // show how much has been downloaded (not how much has been decompressed)
  progress.value += rawChunk.byteLength
  // store the chunks somewhere
  chunks.push(rawChunk)
}
```
With this in place I can provide a good solution for failed downloads:
by keeping track of exactly how many raw bytes I have downloaded,
I can make a range request and continue from where I left off when the connection failed.
This would also be a good solution for pausing / resuming a download.
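A rough sketch of what resuming could look like, reusing the hypothetical `raw: true` option and the `chunks` / `progress` variables from above (the headers are real, the resumption logic is only illustrative):
```js
// Sum of raw (still-compressed) bytes received before the connection failed.
const downloadedBytes = chunks.reduce((total, chunk) => total + chunk.byteLength, 0)

// Ask the server for the remaining raw bytes only.
const resumed = await fetch(url, {
  raw: true, // hypothetical option from this proposal
  headers: {
    'accept-encoding': 'gzip, deflate',
    'range': `bytes=${downloadedBytes}-`
  }
})

// 206 Partial Content means the server honored the range request.
if (resumed.status === 206) {
  for await (const rawChunk of resumed.body) {
    progress.value += rawChunk.byteLength
    chunks.push(rawChunk)
  }
}
```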
Now that I have all the chunks, I can go ahead and decompress them using `DecompressionStream`.
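Decompressing the stored chunks could then look something like this (a minimal sketch, assuming the response was gzip-encoded):
```js
// Turn the stored raw chunks back into a stream and decompress them.
const rawStream = new Blob(chunks).stream()
const decoded = rawStream.pipeThrough(new DecompressionStream('gzip'))

// Hand the decompressed stream to Response to get text()/json()/arrayBuffer() back.
const text = await new Response(decoded).text()
```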
Unfortunately we lose some very useful stuff with this raw option: we can't use brotli decoding (due to lack of support in `DecompressionStream`), and `text()`, `json()`, `arrayBuffer()` and `response.body` are not so useful anymore because they require more work afterwards.
Another option would be to be able to hook in and inspect the data somehow before it's decompressed, so an alternative solution could be to do something like:
```js
const response = await fetch(url, {
  onRawData (chunk) {
    // ...
  }
})
```

```js
// alternative consideration
const response = await fetch(url)
const clone = response.clone()
response.json().then(done, fail)
clone.rawBody.pipeThrough(monitor).pipeTo(storage) // consuming the rawBody makes `clone.body` locked and unusable
```
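For illustration, the `monitor` in that second alternative could be a plain `TransformStream` that counts raw bytes as they pass through (`monitor` and `storage` are just placeholder names from the snippet, not existing APIs):
```js
// Hypothetical byte-counting monitor for the rawBody alternative above.
const monitor = new TransformStream({
  transform (chunk, controller) {
    progress.value += chunk.byteLength // count raw (still-compressed) bytes
    controller.enqueue(chunk)          // pass the chunk along unchanged
  }
})
```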
So I can say that I have found two additional use cases besides a server proxy: 1) progress monitoring, 2) resumable downloads.
--
Reply to this email directly or view it on GitHub:
https://github.com/whatwg/fetch/issues/1524#issuecomment-1712522651