
RE: Interest in standardizing Batch methods?

From: Lisa Dusseault <lisa@xythos.com>
Date: Tue, 8 Jan 2002 21:39:06 -0800
To: "Julian Reschke" <julian.reschke@gmx.de>, "Greg Stein" <gstein@lyra.org>
Cc: "WebDAV" <w3c-dist-auth@w3.org>
Message-ID: <HPELJFCBPHIPBEJDHKGKGENLDDAA.lisa@xythos.com>

> Yet, the *server* will still see separate requests, and at this point it
> would be hard to actually detect that all these requests have something in
> common and may be internally optimized (for instance by doing just one
> instead of many calls to the database).

The HTTP/1.1 spec (RFC 2616) isn't very clear on how servers are supposed to
handle pipelined requests, other than that they must return the responses in
the same order as the requests.  However, it would seem very dangerous to try to
detect whether a bunch of pipelined requests can be optimized.  Might that
not have a different end-result than if they were handled one-by-one?  Could
the first PROPPATCH in a pipeline trigger an event which causes a later
PROPPATCH in the same pipeline to behave differently, were it to be handled
independently?  This is not just my worry -- it's also in Krishnamurthy and
Rexford (Web Protocols and Practice).
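
To make the concern concrete, here is a minimal sketch of what pipelining
looks like on the wire (the host, paths, and bodies below are invented for
illustration, not taken from any real server): two PROPPATCH requests written
back-to-back on one connection.  The server just sees one byte stream; nothing
in it marks the requests as a batch, which is why detecting a safely
optimizable group is hard.

```python
def build_proppatch(path, body):
    """Frame one PROPPATCH request (illustrative, minimal headers)."""
    return (
        f"PROPPATCH {path} HTTP/1.1\r\n"
        f"Host: example.com\r\n"          # hypothetical host
        f"Content-Type: text/xml\r\n"
        f"Content-Length: {len(body)}\r\n"
        f"\r\n"
        f"{body}"
    ).encode("ascii")

# Pipelining: the client writes both requests back-to-back on one
# connection, without waiting for the first response...
pipeline = (build_proppatch("/a.txt", "<propertyupdate/>")
            + build_proppatch("/b.txt", "<propertyupdate/>"))

# ...and RFC 2616 only requires that the responses come back in the same
# order.  It says nothing about whether the server may coalesce the two
# updates, which is exactly the ambiguity discussed above.
```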

WP&P also points out that
 - pipelined requests suffer from head-of-line blocking, while multiple
requests over separate connections do not
 - a closed connection is more severe when pipelining is used, because the
client must remember much more state to recover from it

Lisa
Received on Wednesday, 9 January 2002 00:40:54 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Tuesday, 2 June 2009 18:43:59 GMT