
Re: Multi-GET, extreme compression?

From: Nico Williams <nico@cryptonector.com>
Date: Mon, 18 Feb 2013 22:24:14 -0600
Message-ID: <CAK3OfOhExEv53m535CxKJsdsSuXCazYxyUgTNxnwiz+r+njfCw@mail.gmail.com>
To: Mark Baker <distobj@acm.org>
Cc: Phillip Hallam-Baker <hallam@gmail.com>, "ietf-http-wg@w3.org Group" <ietf-http-wg@w3.org>

On Mon, Feb 18, 2013 at 9:53 PM, Mark Baker <distobj@acm.org> wrote:
> On Sat, Feb 16, 2013 at 1:01 PM, Phillip Hallam-Baker <hallam@gmail.com> wrote:
>> HTTP 1.1 has a request/response pattern. This covers 90% of needs but means
>> that if the protocol is followed correctly forces a round trip delay on each
>> content request.
> It doesn't force it; you can pipeline the requests.

In the context of web pages, the first GET is for the HTML of the
page, and only after the browser begins (or even completes) parsing of
that do the subsequent GETs get issued.

That's a round trip just to get started.

Why couldn't a browser do a GET of the page and also say "also send me
all the stylesheets, scripts, and images you have that this resource
refers to"?  Seems utterly reasonable to me.

All we need is a way to classify which related resources to fetch
immediately, without further requests, and which not to.  For web pages
the classification is probably something like: stylesheets, scripts,
images, audio, and video -- add resource size thresholds too, or
perhaps a way of saying "send me any of these types of resources that
you think I'll need".  We may need a way to express profiles instead
(text-based browser, sight-impaired user, hearing-impaired user,
scripts disabled, ...).  These MGET profiles might be very
HTML-specific, but we might be able to generalize them enough too.
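
Purely for illustration -- the header name and parameters below are
invented for this sketch, not proposed syntax -- such a request and
its multi-part response might look like:

    GET /index.html HTTP/1.1
    Host: example.com
    MGET-Related: stylesheet, script, image; max-size=65536

    HTTP/1.1 200 OK
    Content-Type: multipart/related; boundary=rsrc

    --rsrc
    Content-Type: text/html
    ...page body...
    --rsrc
    Content-Location: /style.css
    Content-Type: text/css
    ...
    --rsrc--

i.e., the client states which classes of related resources it wants
pushed, and the server bundles the page plus any qualifying resources
into one response, eliminating the extra round trip.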

Received on Tuesday, 19 February 2013 04:24:39 UTC
