
Re: Revisiting Authoritative Metadata (was: The failure of Appendix C as a transition technique)

From: Robin Berjon <robin@w3.org>
Date: Tue, 26 Feb 2013 00:22:00 +0100
Message-ID: <512BF218.6020308@w3.org>
To: "Eric J. Bowman" <eric@bisonsystems.net>
CC: "www-tag@w3.org List" <www-tag@w3.org>

Hi Eric,

On 25/02/2013 19:13, Eric J. Bowman wrote:
> I'm referring to REST, a peer-reviewed thesis which forms the accepted
> science on this matter.  You're correct that stating "that's wrong" is
> hardly an argument; better if you could refer to falsification of said
> thesis?  I'm open-minded, but...

The REST thesis is a 180-page document, and while I admit that it's 
been a while since I last read it, I'm pretty sure that it says a few 
things beyond some variation on "authoritative media types are really 
good". With that in mind, if falsifying REST is what you expect me to 
do, it would be most helpfully urbane of you to point me to a specific 
section.

This discussion is taking place within the specific context of the 
Authoritative Metadata TAG finding, which contains two paragraphs about 
the proclaimed superiority of authoritative metadata over embedded typing:

     http://www.w3.org/2001/tag/doc/mime-respect#embedded

As explained in detail here:

     http://lists.w3.org/Archives/Public/www-tag/2013Feb/0130.html

neither of those paragraphs is in any way substantiated. They simply 
proclaim their position with no supporting evidence.

> Without falsification of REST, claims that it's wrong have about as
> much credibility with me as the intelligent design folks' denial of
> evolution.

That's an interesting claim. I happen to be particularly interested in 
how architectural/constitutional rules orient an ecosystem towards 
certain stable situations (and how we can fix such rules rather than 
treat the symptoms). So allow me to turn it this way: if the 
architectural principle you are defending is indeed conducive to robust 
protocols, how do you explain the persistence of sniffing as an 
evolutionarily stable strategy throughout the ecosystem, as amply 
evidenced in the (not so) fossil record?
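For concreteness, here is a minimal sketch of the kind of sniffing we're talking about: guess a media type from the leading bytes and let that guess override the authoritative metadata. The `sniff` function and its three byte-pattern rules are my own simplification for illustration; real clients implement a far longer table (the behaviour now being codified in the WHATWG MIME Sniffing standard).

```python
# A deliberately simplified sniffer: guess a media type from leading
# bytes, falling back to the declared Content-Type. Real clients use a
# much longer rule table; this is an illustrative sketch only.

def sniff(body: bytes, declared: str) -> str:
    """Return the media type a permissive client would act on."""
    head = body.lstrip()[:64].lower()
    if head.startswith(b"%pdf-"):
        return "application/pdf"
    if head.startswith((b"<!doctype html", b"<html")):
        return "text/html"
    if body.startswith(b"\x89PNG\r\n\x1a\n"):  # PNG magic number
        return "image/png"
    return declared  # defer to the authoritative metadata

# A server that mislabels HTML as plain text still gets it rendered:
print(sniff(b"<!DOCTYPE html><html>...</html>", "text/plain"))
# text/html
```

The point being: once even one popular client ships something like this, servers that mislabel content stop being punished for it.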

> If you want to convince me, you'll need to resort to
> the methods and language of science.

I'll cite a few notions that I believe are useful:

• Ruby's Postulate:

The accuracy of metadata is inversely proportional to the square of the 
distance between the data and the metadata.

• Robin's Law of Error Correction:

When the cost of error is borne primarily by the client rather than the 
server, error-correcting clients will come to dominate the ecosystem 
over time.

• Robin's Law of No Web Police (I'm on a roll with the law tonight):

In the absence of a Web Police we have no choice but to build rules that 
contain within themselves the incentives to be followed.


I'm not sure that I can prove Ruby's Postulate, but would you disagree 
that it's borne out by ample experimentation?

I'm actually rather confident that given the time I could formally prove 
the Error Correction one. It's a little bit complex because it doesn't 
boil down to a simple iterated two-player game, but rather is the 
interaction of two different playing-the-field games (roughly in the 
sense that Maynard Smith uses in _Evolution and the Theory of Games_ but 
modulo the fact that I'm not entirely convinced yet that 
playing-the-field games should be handled as two-player games).

Anyway, to put it informally: on one side you have a client. It's 
competing for users (its fitness) with other clients. When the server 
sends an error and the client doesn't correct it, the client loses a 
lot (whereas it wins both when the server sends correct data and when 
it corrects errors). On the other side of the relationship, the server 
incurs losses for sending erroneous content that are proportional to 
the population of clients that interpret that content as an error.

So to put it simply: clients have a constant incentive to error-correct, 
whereas servers have an incentive to produce correct content that 
decreases with the population of clients holding a strict 
interpretation. I'd love to spend a day or so properly proving this, but 
I think it's nevertheless intuitively obvious how such a system will evolve.
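To make that intuition concrete, here is a crude replicator-dynamics sketch of the two coupled games. Every number in it is an assumption of mine, purely for illustration: the `mistake` rate (even diligent servers mislabel some responses), the `effort` cost of diligence, and the small influx of newcomers trying both strategies. It is a back-of-the-envelope toy, not the formal proof mentioned above.

```python
# Toy replicator dynamics for the client/server game sketched above.
# All payoffs are illustrative assumptions, not derived values.

def step(p, f_a, f_b, mu=0.01):
    """One replicator step for a two-strategy population.
    p: share playing strategy A; f_a, f_b: fitness of A and B;
    mu: small influx of newcomers who try both strategies."""
    mean = p * f_a + (1 - p) * f_b
    p = p * f_a / mean
    return (1 - mu) * p + mu * 0.5

def simulate(steps=500, corr=0.1, diligent=0.9,
             mistake=0.05,   # assumed: diligent servers still mislabel 5%
             effort=0.1):    # assumed: cost of being diligent about metadata
    for _ in range(steps):
        good = diligent * (1 - mistake)   # responses a strict client can use
        # Correcting clients render everything; strict ones only the good share.
        corr = step(corr, 1.0, good)
        # Sloppy servers lose only the strict share of the client population.
        diligent = step(diligent, 1.0 - effort, corr)
    return corr, diligent

corr, diligent = simulate()
print(f"error-correcting clients: {corr:.2f}, diligent servers: {diligent:.2f}")
```

Under these assumed payoffs, error-correcting clients take over, after which diligent servers are no longer worth the effort and collapse towards the newcomer floor, which is exactly the ratchet I'm describing.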

As for No Web Police, I believe it is Rule Zero for the evaluation of an 
architecture. People don't break things (on a large scale) for the fun 
of it. At a population level, systematic breakage is systemic breakage. 
It is always tempting to blame the actors, be it a Vast Browser-Wing 
Conspiracy, Eternally Dumb PHP Developers, or the perennial Uninformed 
Voter. Where browsers or web developers systematically break the rules, 
there's an architectural problem. When PHP developers create yet another 
SQL injection, there's a language design problem. As for voters, well, 
that's for another mailing list :)


I can return to your other points later, but since the replies all stem 
from the above let's look at this first.

-- 
Robin Berjon - http://berjon.com/ - @robinberjon
Received on Monday, 25 February 2013 23:22:05 GMT
