- From: Boris Zbarsky <bzbarsky@MIT.EDU>
- Date: Sat, 15 Mar 2014 00:04:57 -0400
- To: Devdatta Akhawe <dev.akhawe@gmail.com>
- CC: Mark Nottingham <mnot@mnot.net>, "public-webappsec@w3.org" <public-webappsec@w3.org>
On 3/14/14 10:50 PM, Devdatta Akhawe wrote:
>> So the browser needs to both be streaming the compressed data to disk and
>> teeing it off to a decompression stream which computes the hash as it goes,
>> right? Or alternately decompressing and then recompressing?
>
> Well, we aren't supporting progressive hashes right now.

I'd like to understand what that means. Does that mean the hash can't be
computed in a streaming fashion, but actually needs the entire decompressed
data in a single chunk (in memory?) to compute the hash?

I'm really hoping I'm just misunderstanding this point....

> How about a hybrid with Mark's idea, only for the downloads case? We
> default to undoing content-encodings, and the developer can opt in to
> computing the hash on the gzip'ed file. In most cases, the programmer
> won't need to specify anything extra, but for a .tar.gz file the
> programmer will need to add an "encoding=gzip" to the anchor link. The
> spec can allow browsers to just delete the download if they don't want
> to unzip the document and the programmer forgot to give the hash with
> encoding=gzip.

This seems reasonable, with one caveat: I would prefer there be no optional
behavior here. What the non-optional behavior should be depends on the above
question about streaming vs. not.

-Boris
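[Editor's note: the "teeing" arrangement quoted above — saving the compressed bytes to disk while feeding them through a decompressor that updates the hash incrementally — can be expressed as a streaming computation. The following is a minimal Python sketch, not anything from the thread; the helper names, the sha256 choice, and the gzip content-encoding are assumptions.]

```python
import hashlib
import zlib

def stream_download(compressed_chunks, out_file):
    """Write gzip'ed chunks to disk while hashing the *decoded* bytes.

    compressed_chunks: an iterable of bytes as received off the network.
    out_file: a binary file object the raw (still compressed) body is saved to.
    Returns the hex digest of the decompressed content.
    """
    # wbits=47 (32 + 15) lets zlib auto-detect gzip or zlib framing.
    decompressor = zlib.decompressobj(wbits=47)
    hasher = hashlib.sha256()  # hash algorithm is an assumption

    for chunk in compressed_chunks:
        out_file.write(chunk)                            # tee: compressed bytes go to disk as-is
        hasher.update(decompressor.decompress(chunk))    # hash the decoded bytes as they stream
    hasher.update(decompressor.flush())                  # any bytes buffered in the decompressor

    return hasher.hexdigest()
```

Because the hash is updated chunk by chunk, nothing beyond the decompressor's internal window needs to be held in memory — which is the "streaming fashion" being asked about.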
Received on Saturday, 15 March 2014 04:05:29 UTC