
Re: PROPOSAL: Weak Validator definition [i101]

From: Robert Siemer <Robert.Siemer-httpwg@backsla.sh>
Date: Mon, 17 Mar 2008 03:44:29 +0100
To: Werner Baumann <werner.baumann@onlinehome.de>
Cc: ietf-http-wg@w3.org
Message-ID: <20080317024429.GA2405@polar.elf12.net>

On Sun, Mar 16, 2008 at 10:35:33PM +0100, Werner Baumann wrote:
> Robert Siemer wrote:

> >There are. At least some of my CGI scripts use them. - I would not 
> >rule out that many other CGIs do the same.
> >
> >That there are no useful weak etag implementations in the static 
> >file serving code of common servers does not surprise me at all. - 
> >How should they know about semantic equivalence?
> >
> >I still don't know why this mechanism has to be an illusion.
> >
> I don't say it *has to be* an illusion. I say it *is* an illusion 
> when confronted with current practice. And the spec is 
> self-contradictory, because it contains two mutually exclusive 
> definitions of weak etags.
> You can resolve this to either side. But the only realistic way seems 
> to be to adapt the spec to current practice.

Current practice is to deliver weak etags that never match later on. 
These are based on "weak last-modified" dates. I hope that this 
useless practice ("we always generate ETags") never makes it into the 
spec.
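For reference, RFC 2616 (section 13.3.3) defines the two entity-tag 
comparison functions this discussion hinges on. A minimal sketch in 
Python (the function names are mine, not from the spec):

```python
def parse_etag(value):
    """Split an entity tag into (is_weak, opaque_tag)."""
    weak = value.startswith("W/")
    opaque = value[2:] if weak else value
    return weak, opaque

def weak_compare(a, b):
    # Weak comparison: the opaque tags must match; the weakness
    # indicator "W/" is ignored on both sides.
    return parse_etag(a)[1] == parse_etag(b)[1]

def strong_compare(a, b):
    # Strong comparison: both tags must be strong AND identical.
    (wa, oa), (wb, ob) = parse_etag(a), parse_etag(b)
    return not wa and not wb and oa == ob
```

So a weak etag that "never matches later on" defeats exactly the case 
weak comparison was meant to serve.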

Since clients will behave the same regardless of which side we pull 
toward, I don't see that "semantic equivalence" is already lost.
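To illustrate the kind of CGI script mentioned above: one way a script 
can produce a *useful* weak etag is to hash only the semantically 
significant parts of the response and skip presentation details. This 
is a hypothetical sketch (the `weak_etag_for` helper and the record 
fields are made up for illustration):

```python
import hashlib

def weak_etag_for(record):
    # Hash only the fields that define semantic equivalence.
    # Presentation details (e.g. a "generated at" timestamp) are
    # deliberately excluded, so two byte-for-byte different responses
    # can still carry the same weak validator.
    significant = (record["title"], record["body"])
    digest = hashlib.sha1(repr(significant).encode()).hexdigest()
    return 'W/"%s"' % digest
```

Two responses that differ only in, say, a generation timestamp would 
then share one weak etag, which is exactly what weak comparison is for.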

Received on Monday, 17 March 2008 02:43:44 UTC
