Re: Multi-GET, extreme compression?

in the spirit of producing an ID that answers the big questions - I'm
a bit confused about what problem an mget proposal is trying to solve
beyond what we've got in front of us now.. is it improved compression?
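
To make the trade-off concrete, here's roughly the shape I understand
an mget to take, next to ordinary per-resource requests. Every field
name and value below is made up for illustration; none of it comes
from an actual draft:

  # Ordinary requests: each resource carries its own (redundant) headers.
  ordinary_gets = [
      {":method": "GET", ":path": "/style.css", "accept": "text/css",  "cookie": "sid=abc"},
      {":method": "GET", ":path": "/logo.png",  "accept": "image/png", "cookie": "sid=abc"},
      {":method": "GET", ":path": "/app.js",    "accept": "*/*",       "cookie": "sid=abc"},
  ]

  # A hypothetical mget: one shared header block plus a list of paths,
  # which forces every resource in the set to share identical headers.
  hypothetical_mget = {
      ":method": "MGET",
      "shared-headers": {"accept": "*/*", "cookie": "sid=abc"},
      "paths": ["/style.css", "/logo.png", "/app.js"],
  }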

We already have (a few) schemes that do quite well on that count so it
isn't an existential problem that needs solving at any cost - and this
has a high cost associated with it. Frankly, it's not clear to me that
the compression it gives would even be competitive with the delta
schemes - I'd like to see its proponents prove that out.
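
For the sake of discussion, the core idea of the delta schemes can be
sketched in a few lines: keep the header set from the previous request
and put only the changed fields on the wire. This is a toy
illustration of the principle - no tokenization, static tables, or
eviction - not the encoding of any particular proposal:

  def delta_encode(prev, cur):
      """Emit only the header fields that differ from the previous request."""
      removed = {name: None for name in prev.keys() - cur.keys()}
      changed = {name: value for name, value in cur.items()
                 if prev.get(name) != value}
      return {**removed, **changed}

  prev = {"accept": "text/css",  "cookie": "sid=abc", "user-agent": "UA/1.0"}
  cur  = {"accept": "image/png", "cookie": "sid=abc", "user-agent": "UA/1.0"}

  print(delta_encode(prev, cur))   # {'accept': 'image/png'} - one field, not three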

But beyond that it has costs derived from:
* added latency to determine resource sets. That's pretty much a
non-starter for me because it conflicts with my core goals for the new
protocol
* reduced flexibility.. right now each resource has its own set of
headers, which contain a lot of redundancy. The delta schemes preserve
that property while exploiting the redundancy. The mget scheme
requires that all the resources in a set have the same headers, but in
truth small variations exist and are quite useful: Accept headers vary
by media type, Cookies vary, etc. (see the sketch after this list).
I'd hope an ID would look at page loads that use those patterns.
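
To illustrate that last point, feed a few made-up page-load requests
through the same toy delta coder: the redundancy is still squeezed
out, without requiring the headers to be identical across the set:

  # Same toy delta coder as in the earlier sketch (removals omitted for brevity).
  def delta_encode(prev, cur):
      return {k: v for k, v in cur.items() if prev.get(k) != v}

  # Made-up headers from a hypothetical page load: note the small,
  # useful per-resource variations an mget's shared header block can't express.
  page_load = [
      {"accept": "text/html", "cookie": "sid=abc"},
      {"accept": "text/css",  "cookie": "sid=abc"},
      {"accept": "image/png", "cookie": "sid=abc; img=1"},
  ]

  prev = {}
  for cur in page_load:
      print(delta_encode(prev, cur))   # only the fields that changed go out
      prev = cur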

Received on Monday, 18 February 2013 14:18:32 UTC