
Re: Suggestion for NEW Issue: Pipelining problems

From: Travis Snoozy <ai2097@users.sourceforge.net>
Date: Tue, 17 Jul 2007 08:54:29 -0700
To: Eric Lawrence <ericlaw@exchange.microsoft.com>
Cc: "Yngve N. Pettersen (Developer Opera Software ASA)" <yngve@opera.com>, "ietf-http-wg@w3.org" <ietf-http-wg@w3.org>
Message-ID: <20070717085429.75b59596@localhost>

On Tue, 17 Jul 2007 08:31:14 -0700, Eric Lawrence
<ericlaw@exchange.microsoft.com> wrote:

> I think Yngve is primarily pointing out that a significant body of
> the deployed base of HTTP servers do not support pipelining.
> Unfortunately, there's no reliable method, a priori, to determine
> whether or not the remote server is going to handle pipelining
> correctly, or whether it will either crash or deliver incorrect
> results.
> I don't know that ambiguities or inaccuracies in the RFC are to
> blame-- I think the problem is that most servers never bothered to
> try to handle pipelining correctly.


> Proposed modifications I've seen to help reduce the need for client
> heuristics include:


Marking a message "HTTP/1.1" should already signal pipelining support
-- clearly, that hasn't stopped folks from doing an abysmal job of
implementing this and other features. Likewise, nothing prevents this
exact same problem from recurring, with badly-behaving servers claiming
to support pipelining via any new identification mechanism proposed[1].
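To make the detection problem concrete: about the best a client can do is
probe -- send two requests back-to-back on one connection and see whether
two well-formed responses come back. Below is a minimal sketch of such a
heuristic in Python; the toy in-process server and all names are
illustrative assumptions, not anything from this thread.

```python
import socket
import threading

def tiny_server(srv):
    # Toy HTTP/1.1 server that answers pipelined requests in order.
    # (Illustrative only -- a deliberately well-behaved endpoint.)
    conn, _ = srv.accept()
    data, handled = b"", 0
    while handled < 2:
        data += conn.recv(4096)
        while b"\r\n\r\n" in data:
            _, data = data.split(b"\r\n\r\n", 1)  # discard one request
            body = b"ok"
            conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: %d\r\n\r\n%s"
                         % (len(body), body))
            handled += 1
    conn.close()

def probe_pipelining(host, port):
    """Send two pipelined GETs on one connection and count the status
    lines that come back. Two responses suggest pipelining worked;
    fewer, or garbage, suggests it did not. Note this is exactly the
    kind of unreliable heuristic the thread is complaining about."""
    with socket.create_connection((host, port), timeout=5) as s:
        req = (b"GET / HTTP/1.1\r\nHost: %s\r\n\r\n" % host.encode()) * 2
        s.sendall(req)  # both requests before reading any response
        data = b""
        while data.count(b"HTTP/1.1 200") < 2:
            chunk = s.recv(4096)
            if not chunk:
                break
            data += chunk
    return data.count(b"HTTP/1.1 200")

# Demo against the local toy server (a real probe would hit a remote host).
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
t = threading.Thread(target=tiny_server, args=(srv,))
t.start()
responses = probe_pipelining("127.0.0.1", port)
t.join()
print(responses)
```

Against a conforming server this prints 2; against a broken one the probe
may hang until timeout or count fewer responses -- which is why such
heuristics cannot be relied on a priori.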

Really, I think that the best solution for this (and many other
HTTP-related problems) is to have a certification process, and
(ideally) a reference implementation. For server validation, at
least, it could even be an automated service (like the W3C (X)HTML
validator).


[1] The suggestion does, however, open up the interesting notion of
modularizing HTTP -- which would be nifty, but probably something
more suited for a 2.0 revision.
Received on Tuesday, 17 July 2007 15:54:51 UTC
