Re: Proposal for issue #101 (strong/weak validators).

From: Jamie Lokier <jamie@shareable.org>
Date: Sat, 15 Nov 2008 09:11:22 +0000
To: Henrik Nordstrom <henrik@henriknordstrom.net>
Cc: Werner Baumann <werner.baumann@onlinehome.de>, Yves Lafon <ylafon@w3.org>, ietf-http-wg@w3.org
Message-ID: <20081115091122.GA24047@shareable.org>

Henrik Nordstrom wrote:
> Yes, there are some cases where the simple algorithm used by Apache will
> fail and emit the same weak ETag for two quite different objects, but in
> real-life use those are quite rare.

Rare, but the point of cache validators is to do what they are
specified to do all the time, not most of the time.

Imho, though, Apache's transmission of weak validators is reasonable
here; it's just an unusual interpretation of "semantic equivalence".
Just as long as nobody _uses_ them :-)
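To illustrate the collision being discussed, here is a sketch (in Python, not from the thread) of a validator built only from file size and mtime, roughly in the spirit of Apache's FileETag defaults; the helper name and file contents are hypothetical. Two different bodies of the same length, written within the same timestamp, yield the same weak ETag:

```python
import os
import tempfile

def weak_etag(path):
    """Weak ETag from size and mtime only (roughly like Apache's
    'FileETag MTime Size'); the file contents are never hashed."""
    st = os.stat(path)
    return 'W/"%x-%x"' % (st.st_size, int(st.st_mtime))

# Two different bodies, same length, forced to the same mtime:
with tempfile.TemporaryDirectory() as d:
    p = os.path.join(d, "page.html")
    with open(p, "w") as f:
        f.write("hello")
    t = os.stat(p).st_mtime
    tag1 = weak_etag(p)
    with open(p, "w") as f:
        f.write("world")          # different content, same length
    os.utime(p, (t, t))           # collapse the mtime difference
    tag2 = weak_etag(p)
    assert tag1 == tag2           # same weak ETag, different body
```

A content hash (e.g. of the full representation) would avoid the collision, but, as the quoted text notes, that requires reading or buffering the whole representation before the ETag can be emitted.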

> In fact I would argue that it's probably more likely the content
> gets updated while being sent, making even their strong ETags
> "worthless", and the same for any server on any OS where files may
> be updated while read by another application unless you buffer the
> whole selected representation to calculate the ETag.

Sensible applications, on unix, replace files by renaming over the
original.  When a file is served, it is always the old contents or the
new contents, never a mix.
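The rename-over pattern can be sketched as follows (a minimal Python version for POSIX systems; the function name is made up for illustration):

```python
import os
import tempfile

def replace_atomically(path, data):
    """Write new contents to a temp file in the same directory,
    then rename it over the original.  On POSIX, rename() is
    atomic, so a concurrent reader opening `path` sees either the
    complete old file or the complete new one, never a mix."""
    dirname = os.path.dirname(path) or "."
    fd, tmp = tempfile.mkstemp(dir=dirname)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())   # flush data before the rename
        os.rename(tmp, path)       # atomic replacement on POSIX
    except BaseException:
        os.unlink(tmp)             # clean up the temp file on failure
        raise
```

The temp file must live in the same directory as the target, because rename() is only atomic within a single filesystem.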

Not all applications are sensible, though, and you can't do this on Windows.

-- Jamie
Received on Saturday, 15 November 2008 09:11:58 UTC