
Re: PROPOSAL: Weak Validator definition [i101]

From: Mark Nottingham <mnot@mnot.net>
Date: Mon, 17 Mar 2008 17:37:20 +1100
Cc: Werner Baumann <werner.baumann@onlinehome.de>, ietf-http-wg@w3.org
Message-Id: <6890A2BC-C096-49F6-A0F9-F41254B4BFE5@mnot.net>
To: Robert Siemer <Robert.Siemer-httpwg@backsla.sh>

Hmm. Doesn't resolving it on the side of keeping semantic equivalence  
beg the question of what semantic equivalence is -- and is there any  
way to define it except in a server-specific fashion?
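(For reference, the comparison rules at issue can be sketched roughly as follows. This is a minimal illustration of the strong/weak entity-tag comparison functions described in RFC 2616 section 13.3.3; the function names and structure are mine, not the spec's:)

```python
# Sketch of the two entity-tag comparison functions from RFC 2616 sec. 13.3.3.
# The weak comparison ignores the W/ prefix, so a server that emits W/"..."
# vouches only for "semantic equivalence", not byte-for-byte identity.

def parse_etag(etag):
    """Split an entity-tag into (is_weak, opaque_value)."""
    if etag.startswith('W/'):
        return True, etag[2:]
    return False, etag

def strong_compare(a, b):
    """Match only if both tags are strong and the opaque values are identical."""
    weak_a, value_a = parse_etag(a)
    weak_b, value_b = parse_etag(b)
    return not weak_a and not weak_b and value_a == value_b

def weak_compare(a, b):
    """Match if the opaque values are identical, ignoring weakness."""
    return parse_etag(a)[1] == parse_etag(b)[1]

# A weak tag never matches under the strong comparison ...
assert not strong_compare('W/"v1"', 'W/"v1"')
# ... but does under the weak comparison (e.g. If-None-Match on a GET).
assert weak_compare('W/"v1"', '"v1"')
```

What "semantic equivalence" means is then entirely up to the origin server; the protocol only defines how the tags are compared, not when a server may legitimately reuse one.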

On 17/03/2008, at 1:44 PM, Robert Siemer wrote:

> On Sun, Mar 16, 2008 at 10:35:33PM +0100, Werner Baumann wrote:
>> Robert Siemer wrote:
>>> There are. At least some of my CGI scripts use them. - I would not
>>> rule out that many other CGIs do the same.
>>> That there are no useful weak etag implementations in the static-file
>>> serving code of common servers does not surprise me at all. - How
>>> should they know about semantic equivalence?
>>> I still don't know why this mechanism has to be an illusion.
>> I don't say it *has to be* an illusion. I say it *is* an illusion
>> when confronted with current practice. And the spec is
>> self-contradictory, because it contains two mutually exclusive
>> definitions of weak etags.
>> You can resolve this to either side. But the only realistic way  
>> seems to
>> be to adapt the spec to current practice.
> Current practice is to deliver weak etags that never match later on.
> These are based on "weak last-modified" dates. I hope that this
> useless practice ("we always generate ETags") never makes it into the
> spec!
> As clients will behave the same regardless of which side we pull
> toward, I don't see "semantic equivalence" as already lost.
> Robert

Mark Nottingham     http://www.mnot.net/
Received on Monday, 17 March 2008 06:38:20 UTC
