Re: [OT] Suggestion: http request bundle

> Since browsers are currently configured to load HTML pages in a rather
> slow and piecemeal manner, one can obtain performance improvements by
> reducing the number of JS and CSS files required on initial load.
> 
> So my comment was simply that Doug's suggestion about reducing the
> number of individual files an HTML file is dependent on actually has
> performance benefits.

His proposal was to do this at the HTTP level, in a way which would
require every browser, proxy and server to change.  I was pointing out
that, if the developers thought it important enough, they could achieve
this with existing servers and clients, although HTTP/1.1 proxies are
relatively thin on the ground.
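
To make that concrete, here is a rough sketch of the existing-client
approach: several small resources fetched over one persistent HTTP/1.1
connection, so each extra file costs a request/response round trip but
no new connection set-up.  The host name and paths are invented.

# Rough sketch: reuse one persistent HTTP/1.1 connection for several
# small resources (host and paths are invented for illustration).
import http.client

conn = http.client.HTTPConnection("www.example.org")
for path in ("/style.css", "/lib.js", "/logo.png"):
    conn.request("GET", path)
    response = conn.getresponse()
    body = response.read()   # drain the body before issuing the next request
    print(path, response.status, len(body), "bytes")
conn.close()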

> These benefits to HTML page loading seem both on-topic, and don't
> require reconfiguring every server, proxy and browser...which you seem
> to be suggesting is "all you need to do"! :)

As proposed, it would require not only configuring them, but modifying
their code.

One could conceive of a <link rel="resourcelibrary"> element that provided
an archive of the static resources for the page, and metadata to allow
the browser to determine when they became out of date.  If you used .zip
type formats, where the metadata is uncompressed, that wouldn't offer much
performance advantage over a proper implementation of HTTP/1.1, and it
would significantly increase the overhead on proxies: they would have
to transfer the complete library whenever any component changed, and,
unless they were modified to understand the format (which would be a
layering violation of HTTP), they would also have to store copies of the
individual resources, many of which would be images and therefore already
compressed.
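
A back-of-the-envelope illustration of that proxy cost, with invented
sizes:

# Invented sizes: when one component of a bundled library changes, the
# whole archive has to be transferred again, whereas individually cached
# resources only cost the changed file.
sizes = {"lib.js": 80_000, "style.css": 12_000, "logo.png": 30_000}
changed = "style.css"
print("re-fetch as a bundle:  ", sum(sizes.values()), "bytes")
print("re-fetch individually: ", sizes[changed], "bytes")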

The ZIP format itself isn't particularly satisfactory, as the main
location for the metadata (the central directory) is at the very end of
the file, so one would have to rely on the per-entry local headers, which
are really there for recovery, if one wanted to display images as soon
as possible (which, given the general failure of authors to use alt
attributes properly, tends to be quite important).
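
If a bundle-aware client did want to start rendering before the end of
the archive had arrived, it would have to walk those per-entry local
headers itself; a rough sketch, ignoring data descriptors, ZIP64 and the
other complications a real archive may use:

# Rough sketch of reading a .zip stream front to back via the per-entry
# local headers, rather than seeking to the central directory at the end.
# Simplified: assumes sizes are present up front (no data descriptors)
# and leaves entries compressed (method 8 = deflate, which
# zlib.decompressobj(-15) would inflate).
import struct

LOCAL_HEADER_SIG = b"PK\x03\x04"

def stream_zip_entries(fp):
    while True:
        if fp.read(4) != LOCAL_HEADER_SIG:
            return  # reached the central directory, or something unexpected
        (version, flags, method, mtime, mdate, crc, csize, usize,
         name_len, extra_len) = struct.unpack("<HHHHHIIIHH", fp.read(26))
        name = fp.read(name_len).decode("cp437", "replace")
        fp.read(extra_len)        # skip the extra field
        yield name, method, fp.read(csize)

With that, an image entry can be handed to the renderer as soon as its
bytes have arrived, which is roughly what plain HTTP already gives you
for free.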

Also, my impression is that images and out-of-line scripts often
considerably exceed the main content page in size.  About the only reason
for this not being true is that very few authors actually separate
behaviour and styling from content, so content pages are often huge.

Generally, though, given web authors' endemic, and often pointless, use
of techniques that frustrate caching, and their other inefficient
practices, I don't believe that authors are concerned about download
time.

Until authors stop serving cookies with every image, embedding large
amounts of styling in every element start tag, starting every page with a
huge ECMAScript library, and adding meta lines to every HTML file with the
intention, and probably the effect, though often without any standardised
meaning, of frustrating caching, I don't think it is worth considering
other browser and server changes to optimise downloads.
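
These practices are easy enough to observe; something along these lines
prints the headers a shared cache looks at for any resource you care to
try (the URL is only an example):

# Print the headers that decide whether a shared cache can reuse a
# response: Cache-Control/Expires/ETag encourage caching, while Set-Cookie
# on static resources tends to defeat it in practice.  The URL is invented.
import urllib.request

with urllib.request.urlopen("http://www.example.org/logo.png") as resp:
    for header in ("Cache-Control", "Expires", "Last-Modified",
                   "ETag", "Set-Cookie", "Pragma"):
        print(header + ":", resp.headers.get(header, "(absent)"))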

(There is also nothing to stop pages being in .mhtml format, although
I've not checked to see whether this works for remote resources.)
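
For anyone who wants to experiment, an .mhtml bundle is just a
multipart/related MIME message in which each part carries a
Content-Location header; a sketch, with the page text, image data and
URLs invented, and with the remote-resource question above left open:

# Sketch of producing an .mhtml bundle as a multipart/related MIME message
# with a Content-Location per part.  Content and URLs are invented; whether
# a browser resolves remote references against the bundled parts is the
# open question mentioned above.
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.image import MIMEImage

def make_mhtml(html_text, images):
    bundle = MIMEMultipart("related", type="text/html")
    page = MIMEText(html_text, "html")
    page["Content-Location"] = "http://www.example.org/page.html"
    bundle.attach(page)
    for url, png_bytes in images.items():    # url -> raw PNG data
        part = MIMEImage(png_bytes, "png")
        part["Content-Location"] = url
        bundle.attach(part)
    return bundle.as_bytes()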

Received on Sunday, 5 November 2006 14:02:43 UTC