Re: Multi-GET, extreme compression?

On Mon, Feb 18, 2013 at 9:53 PM, Mark Baker <distobj@acm.org> wrote:
> On Sat, Feb 16, 2013 at 1:01 PM, Phillip Hallam-Baker <hallam@gmail.com> wrote:
>> HTTP 1.1 has a request/response pattern. This covers 90% of needs but
>> means that, if the protocol is followed correctly, a round trip delay
>> is forced on each content request.
>
> It doesn't force it; you can pipeline the requests.

In the context of web pages, the first GET fetches the HTML of the
page, and only after the browser begins (or even completes) parsing
it are the subsequent GETs issued.

That's a round trip just to get started.

Why couldn't a browser do a GET of the page and also say "also send me
all the stylesheets, scripts, and images you have that this resource
refers to"?  Seems utterly reasonable to me.

All we need is a way to classify which related resources to fetch
immediately without further requests and which not to.  For web pages
the classification is probably something like: stylesheets, scripts,
images, audio, and video -- plus resource size thresholds, or perhaps
a way of saying "send me any of these types of resources that you
think I'll need".  We may instead need a way to express profiles
(text-based browser, sight-impaired user, hearing-impaired user,
scripts disabled, ...); see the sketch below.  These MGET profiles
might be very HTML-specific, but we might be able to generalize them
enough.
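
A profile-based variant might instead look like this (again, the
Prefetch-Profile header and the profile tokens are hypothetical
placeholders, not a worked-out syntax):

    MGET /index.html HTTP/1.1
    Host: example.com
    Prefetch-Profile: text-only; scripts=disabled

The server would use the profile to decide which related resources are
worth sending at all.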

Nico
--
