- From: Larry Masinter <LM@att.com>
- Date: Fri, 21 Jan 2000 11:11:51 -0800
- To: HTTP Working Group <http-wg@hplb.hpl.hp.com>
There's no point in restricting side effects in general; the idea was to restrict side effects that somehow mattered! And whether or not a side effect matters depends on the application. For the traditional web browsing application, clicking on a link twice, refetching a web page, or even prospectively fetching a page that you think someone *might* want, using GET to mirror a site, etc. ... those shouldn't have significant side effects.

As for interactions between subsequent requests and pipelining: since it is the client that decides whether to pipeline, it is also the client that has the responsibility for deciding whether pipelining matters. For web browsing, there's no real problem: the only state change where consistency matters is the interaction between the POST from a form being filled in and the subsequent GETs triggered by URLs embedded in the content returned from that POST.

For other applications being built on top of HTTP, there must be some agreement between client and back-end application as to what the dependencies and transaction semantics must be; the HTTP server needn't enforce these, since the clients have complete control over whether they ATTEMPT pipelining.

I suppose you might want to note that intermediaries shouldn't introduce pipelining (e.g., by prospectively guessing what URLs might appear in subsequent content and prefetching them!), but otherwise this is a client, not a server, responsibility (IMO). Servers need not serialize the processing of pipelined requests; clients that care shouldn't pipeline.

Larry
--
http://larry.masinter.net
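The client-side responsibility described above can be sketched in code. The following is a minimal, hypothetical illustration (the function and names are not from the message or any standard library): a client groups consecutive safe requests (GET/HEAD) into batches it may pipeline on one connection, while an unsafe request such as the form POST gets its own batch, so the client waits for its response before issuing the GETs that depend on the returned content.

```python
# Hypothetical sketch of the client deciding what to pipeline.
# Safe methods (no significant side effects) may be sent back-to-back;
# anything else (e.g. POST) acts as a pipelining barrier.
SAFE_METHODS = {"GET", "HEAD"}

def pipeline_batches(requests):
    """Split (method, url) pairs into batches the client may pipeline.

    Consecutive safe requests form one batch, sent without waiting for
    intermediate responses. An unsafe request becomes a singleton batch:
    the client must read its response before continuing, since later
    GETs may target URLs embedded in that response.
    """
    batches, current = [], []
    for method, url in requests:
        if method in SAFE_METHODS:
            current.append((method, url))
        else:
            if current:
                batches.append(current)
                current = []
            batches.append([(method, url)])  # barrier: wait for response
    if current:
        batches.append(current)
    return batches
```

For example, a page fetch, a form submission, and a follow-up image fetch would yield three batches: the leading GETs (pipelined), the POST (alone), then the GET that depends on the POST's result. Note this is purely client logic; per the message, the server is free to process pipelined requests however it likes.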
Received on Friday, 21 January 2000 11:16:10 UTC