
[Bug 12569] "Resource" Package Support

From: <bugzilla@jessica.w3.org>
Date: Fri, 29 Apr 2011 22:27:12 +0000
To: public-html-bugzilla@w3.org
Message-Id: <E1QFw9Y-0006ZR-1W@jessica.w3.org>

--- Comment #4 from Kyle Simpson <w3c@getify.myspamkiller.com> 2011-04-29 22:27:11 UTC ---
(In reply to comment #3)

> Also, the problem with waiting for MBs
> to download is easily solved by processing the package as it downloads, though
> that does not help if the main CSS file is placed at the end of the package. 

This doesn't at all address the fact that the ZIP file has to be transferred
serially, byte-by-byte, meaning the files are in fact NOT downloaded in
parallel. That creates a situation where the page will load MUCH slower than
it normally would, because all parallel downloading has been disabled.
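
To make the contrast concrete, here's a rough Python sketch (hypothetical
example.com URLs, not any real site): separate resources can be fetched over
parallel connections, while a packaged ZIP arrives as one serial byte stream.

  # Rough sketch contrasting parallel fetches of separate resources with one
  # serial fetch of a packaged ZIP. Hypothetical URLs; not a real benchmark.
  import urllib.request
  from concurrent.futures import ThreadPoolExecutor

  SEPARATE = [
      "https://example.com/app.css",    # hypothetical resources
      "https://example.com/app.js",
      "https://example.com/logo.png",
  ]
  PACKAGED = "https://example.com/resources.zip"   # hypothetical package

  def fetch(url):
      with urllib.request.urlopen(url) as resp:
          return url, len(resp.read())

  # Parallel: each resource rides its own connection, so downloads overlap.
  with ThreadPoolExecutor(max_workers=len(SEPARATE)) as pool:
      for url, size in pool.map(fetch, SEPARATE):
          print(f"{url}: {size} bytes")

  # Serial: the package arrives byte-by-byte on a single stream; nothing
  # inside it is usable until its bytes have come down the wire.
  print(fetch(PACKAGED))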

As for your idea of having the files download in parallel somehow... the whole
point of having a single file is to prevent multiple files from being
delivered. I don't think you can have it both ways.

Now, if you're suggesting that some sort of "manifest" file could be delivered
to the browser to tell it what all the files in the ZIP are, sure... but how
will that help at all, if the browser still has to wait for each file to show
up as part of the single ZIP file stream?

What we'd REALLY need for an idea like this to not hamper performance is for
the browser (and server) to do some sort of parallel chunked downloading of a
single file, similar to a bit-torrent kind of thing, where a single giant ZIP
file could be delivered in simultaneous/parallel chunks, bit-by-bit. If you
want to argue for THAT, sure, go ahead. That'd be awesome. But it's a LONG way
from being possible, as all web servers and all web browsers would have to
support it. If either the web server or the browser didn't, and the HTML
served up suggested that single manifest ZIP file, then this view of the site
would be excruciatingly slow, because all resources would fall back to serial
loading, worse than the IE 3.0 days.
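
For what it's worth, that kind of parallel chunk fetching of a single file is
already possible at the protocol level via HTTP Range requests (it's how
download accelerators work), but only when the server supports byte ranges,
and browsers don't do it for ordinary subresources. A rough Python sketch,
with a hypothetical URL and an assumed Accept-Ranges: bytes server:

  # Sketch of "bit-torrent-ish" parallel chunk fetching of one file using
  # HTTP Range requests (RFC 7233). Only works if the server advertises
  # Accept-Ranges: bytes; URL and chunk count are illustrative assumptions.
  import urllib.request
  from concurrent.futures import ThreadPoolExecutor

  URL = "https://example.com/resources.zip"   # hypothetical package
  CHUNKS = 4

  def total_size(url):
      req = urllib.request.Request(url, method="HEAD")
      with urllib.request.urlopen(req) as resp:
          return int(resp.headers["Content-Length"])

  def fetch_range(args):
      url, start, end = args
      req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
      with urllib.request.urlopen(req) as resp:
          return start, resp.read()

  size = total_size(URL)
  step = size // CHUNKS
  ranges = [(URL, i * step, size - 1 if i == CHUNKS - 1 else (i + 1) * step - 1)
            for i in range(CHUNKS)]

  # Fetch all chunks in parallel, then stitch them back together in order.
  with ThreadPoolExecutor(max_workers=CHUNKS) as pool:
      parts = sorted(pool.map(fetch_range, ranges))
  body = b"".join(data for _, data in parts)
  print(f"reassembled {len(body)} of {size} bytes")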

Moreover, separate cacheability is a concern. If I have a big manifest ZIP
file with all my resources in it, and I change one tiny file in that package,
doesn't the entire ZIP file change? (Its file signature/size/timestamp
certainly do.) So the browser has to re-download the entire ZIP file. Lots
and lots of wasted download of stuff that didn't change.
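
A toy illustration of that invalidation (made-up file names, built in memory
with Python's zipfile): change one small member, and the whole archive's
bytes, and therefore any ETag/hash a cache would key on, change with it.

  # Touching one small member changes the bytes of the whole ZIP, so a client
  # caching the package by URL must re-fetch everything. Names are made up.
  import hashlib
  import io
  import zipfile

  def build_zip(files):
      buf = io.BytesIO()
      with zipfile.ZipFile(buf, "w") as zf:
          for name, data in files.items():
              zf.writestr(name, data)
      return buf.getvalue()

  files = {"app.css": "body { color: black }",
           "app.js": "console.log('hi');",
           "big-lib.js": "x" * 500_000}         # the part that never changes

  before = build_zip(files)
  files["app.css"] = "body { color: blue }"     # one tiny edit
  after = build_zip(files)

  print("zip size:   ", len(after), "bytes")
  print("hash before:", hashlib.sha256(before).hexdigest()[:16])
  print("hash after: ", hashlib.sha256(after).hexdigest()[:16])
  # The hashes differ, so a cache keyed on the package URL pulls the full
  # ~500 KB again, even though only app.css actually changed.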

Unless you're really suggesting that the browser and server need to be able to
negotiate a single streaming pipeline connection, wherein all resources can
be piped through the one connection. But then we're back to serial loading,
which is *much* slower (even though you save the overhead of the additional
HTTP requests/responses).

All in all, I think this idea is flawed from the start. I think it has no
chance of actually working to improve performance in the real world. There
are just too many fundamental problems with the paradigm of packaging files
up together: it loses the huge performance benefit of files being separately
loaded in parallel and separately cacheable. Any paradigm where you lose
those two things is just not going to work.

NOW, if we're merely talking about saving the TCP overhead of establishing a
new connection for each file (with files still being requested individually,
and in parallel), then that IS something valuable. And it already exists and
is in widespread use. It's called "Keep-Alive".
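
For example, a minimal Python sketch of HTTP/1.1 persistent connections
(placeholder host and paths): several requests reuse one already-open TCP
socket, skipping the extra handshakes. A browser does the same thing across
its pool of parallel connections.

  # Several requests over one persistent ("Keep-Alive") connection; each
  # request after the first skips a fresh TCP handshake. Placeholder paths.
  import http.client

  conn = http.client.HTTPSConnection("example.com")
  for path in ("/app.css", "/app.js", "/logo.png"):
      conn.request("GET", path)
      resp = conn.getresponse()
      body = resp.read()        # must drain the body before reusing the socket
      print(path, resp.status, len(body), "bytes")
  conn.close()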
