Concurrent updates and proxies

I'm working on a system that has a REST interface used to exchange, 
among other things, configuration documents.

The interface used to be a CGI; for many reasons I wrote a bare-bones 
HTTP server, and I can now attempt to make the interface more RESTful.

Specifically, the CGI environment doesn't let me deal with ETags and 
preconditions (If-Match, If-None-Match), so the configuration documents 
in my system carry a sort of CRC of the document to solve the 
concurrent update problem: if the CRC in the submitted document doesn't 
match the stored version, the update is rejected.
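
A rough sketch of the kind of check I mean, in Python (the store and 
the names are made up for illustration, not my actual code):

    import zlib

    stored_docs = {}  # doc_id -> (crc, body); stands in for the real store

    def try_update(doc_id, submitted_body, submitted_crc):
        # Reject the update when the CRC the client sends back doesn't
        # match the CRC of the version currently stored.
        stored_crc, _ = stored_docs[doc_id]
        if submitted_crc != stored_crc:
            return False  # concurrent update detected, reject
        stored_docs[doc_id] = (zlib.crc32(submitted_body), submitted_body)
        return True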

With my web server I'm planning on using ETags and preconditions, as 
outlined in:

http://www.w3.org/1999/04/Editing/

I think I have the mechanics clear: the client GETs a document, 
modifies it, then PUTs it back with an If-Match header carrying the 
original ETag. The server rejects the PUT unless the value supplied 
through If-Match actually matches the document's current ETag.
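
Roughly, the server-side check I have in mind looks like this (just a 
sketch, assuming a strong ETag computed from the body; load, store and 
the headers dict are placeholders):

    import hashlib

    def compute_etag(body):
        # Strong ETag derived from the representation's bytes.
        return '"%s"' % hashlib.sha1(body).hexdigest()

    def handle_put(path, headers, new_body, load, store):
        current = load(path)
        if_match = headers.get("If-Match")
        # A complete implementation would also handle If-Match: * and
        # comma-separated lists of entity tags.
        if if_match is not None and if_match != compute_etag(current):
            return 412, b"Precondition Failed"
        store(path, new_body)
        return 204, b""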

I have recently been bitten by Squid downgrading a request to 
HTTP/1.0: because of a bug in my code I would honour the 
"Expect: 100-continue" header and emit a 100 even though the incoming 
request was HTTP/1.0, which Squid apparently doesn't really know what 
to do with, confusing the original client.
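
The fix on my side boils down to only honouring Expect when the 
request itself is HTTP/1.1 or later, something like (sketch; the 
request object and send_line are hypothetical):

    def maybe_send_100_continue(request, send_line):
        # RFC 2616 says a server MUST NOT send 100 (Continue) in reply
        # to an HTTP/1.0 (or earlier) request.
        expects_continue = (
            request.headers.get("Expect", "").lower() == "100-continue"
        )
        if expects_continue and request.http_version >= (1, 1):
            send_line(b"HTTP/1.1 100 Continue\r\n\r\n")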

Now, if my configuration document update process requires and relies on 
HTTP/1.1 features like If-Match, what should I do with proxies like 
Squid that downgrade the connection to HTTP/1.0?

Squid would probably leave the If-Match header in place, as it left 
the Expect header. Can I rely on this, use If-Match anyway, and 
simulate the precondition failure with a generic 400 after the PUT 
(instead of a PUT-aborting 412)? Or will some HTTP/1.0 proxy 
resynthesize the headers and drop the unknown ones, thus breaking an 
HTTP/1.1-specific application?
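
To make the question concrete, the fallback I'm considering is 
something like this (sketch, reusing the hypothetical handle_put 
above): the If-Match check stays the same, only the status changes.

    def precondition_status(http_version):
        # 412 is the HTTP/1.1 answer to a failed If-Match; for a request
        # that reached us downgraded to HTTP/1.0 the same failure would
        # be reported as a generic 400, after the body has been read.
        return 412 if http_version >= (1, 1) else 400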


I'm a newbie here; let me know if this question is more appropriate 
for a REST-specific mailing list.

Thanks,
Duncan
