W3C home > Mailing lists > Public > whatwg@whatwg.org > March 2010

[whatwg] Multiple file download

From: Philip Taylor <excors+whatwg@gmail.com>
Date: Wed, 10 Mar 2010 18:32:23 +0000
Message-ID: <ea09c0d11003101032g524e6a6ak398a7f5e0631c162@mail.gmail.com>
On Wed, Mar 10, 2010 at 5:51 PM, Eric Uhrhane <ericu at google.com> wrote:
> On Wed, Mar 10, 2010 at 12:28 AM, timeless <timeless at gmail.com> wrote:
>> http://www.pkware.com/documents/casestudies/APPNOTE.TXT V. General
>> Format of a .ZIP file
>> the zip format is fairly streaming friendly, the directory is at the
>> end of the file. And if you're actually generating a file which has so
>> many records that you can't remember all of them, you're probably
>> trying to attack my user agent, so I'm quite happy that you'd fail.
> Isn't a format that has its directory at the end about as
> streaming-UNfriendly as you can get? You need to pull the whole thing
> down before you can take it apart. With a .tar.gz, you can unpack
> files as they arrive.

Each file's compressed data is preceded by a local header with enough
information to decompress it (filename, compression method, etc.), and
that information is duplicated in the central directory at the end, so
I believe you can still do streaming decompression (as well as random
access once you've got the directory). And you can still do streaming
compression without buffering even a single file, by setting a flag
bit and moving part of the local header (the lengths and checksum)
into a data descriptor just after the compressed file data.
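To illustrate the streaming-decompression point, here's a small
standard-library Python sketch (names and the 30-byte header layout
are from APPNOTE section V; the sample filename and payload are my
own). It builds a zip in memory, then decompresses the first entry by
reading only the local file header at offset 0 - the central directory
at the end of the file is never consulted:

```python
import io
import struct
import zipfile
import zlib

# Build a small DEFLATE-compressed zip entirely in memory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
    z.writestr("hello.txt", b"hello streaming zip " * 50)
data = buf.getvalue()

# Local file header (APPNOTE sec. V): signature, version needed, flags,
# method, mod time, mod date, CRC-32, compressed size, uncompressed
# size, filename length, extra length -- 30 bytes, little-endian.
(sig, ver, flags, method, mtime, mdate,
 crc, csize, usize, nlen, xlen) = struct.unpack_from("<IHHHHHIIIHH", data, 0)
assert sig == 0x04034B50          # local file header signature "PK\x03\x04"
assert flags & 0x08 == 0          # sizes known up front, no data descriptor

name = data[30:30 + nlen].decode()
body = 30 + nlen + xlen           # compressed bytes start right here

# Decompress using only the bytes seen so far (raw DEFLATE, wbits=-15).
raw = zlib.decompress(data[body:body + csize], wbits=-15)
print(name, len(raw))
```

A streaming compressor does the reverse of the `flags & 0x08` check
above: it sets that flag bit, writes zeros for the CRC and sizes in
the header, and emits the real values in a trailing data descriptor
once the entry's data has all been written.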

(But I never understood why pkunzip asked me to put in the last floppy
disk of a multi-disk zip before it would start decompressing the first
- maybe there's some reason that streaming decompression doesn't quite
work perfectly in practice?)

Philip Taylor
excors at gmail.com
Received on Wednesday, 10 March 2010 10:32:23 UTC

This archive was generated by hypermail 2.4.0 : Wednesday, 22 January 2020 16:59:21 UTC