Re: [widgets] How to divorce widgets-digsig from Elliptic Curve PAG?

From: Charles McCathieNevile <chaals@opera.com>
Date: Tue, 20 Dec 2011 18:49:05 +0100
Cc: public-webapps@w3.org
To: "Jean-Claude Dufourd" <jean-claude.dufourd@telecom-paristech.fr>, "Marcos Caceres" <w3c@marcosc.com>
Message-ID: <op.v6suf3o2wxe0ny@widsith-3.local>
TL;DR: JC and Leonard are right.

Pointing to a moving target makes any statement about conformance pretty
much unusable in the real world, which is significantly worse than having
a statement of conformance to something known to contain errors and bugs.

Browsers don't implement "living standards". They implement something  
undefined for an open environment where people are continually innovating,  
and they make considerable and expensive efforts to tell the community  
what it is they implement.

Browsers like Opera who have commercial enterprise customers work with  
those customers to establish sensible conformance requirements that take  
into account the evolutionary nature of the web (rather than arbitrary  
stacks of requirements that happened to be easy to test), but neither side  
wants a Statement of Work that doesn't actually clarify what is expected  
to be delivered under a legally binding contract.

A lack of stability and precision about which version of a specification  
is meant also makes it harder to get a decent royalty-free license.

Details below...

On Mon, 19 Dec 2011 14:03:58 +0100, Marcos Caceres <w3c@marcosc.com> wrote:

> I'm sorry, but Leonard is not correct: this is the W3C, not ISO.

Yes, so far so good...

> ISO is a real "standards body" (i.e., can be legally binding for  
> governments). W3C is a business/community "consortium" (i.e., not a  
> legal standards body and specs are not legally binding): W3C makes  
> "recommendations", which are not (and should not be) legally binding.

So what? In practice, people make contracts (legally binding documents)
that include such terms as "conformance to...".

>> In ISO specs, undated references are forbidden. There is a team of
>> people (called ITTF) whose job includes checking these things and
>> bugging spec editors to fix them.
>
> Yes, but this is not ISO. And just because they operate in that manner,  
> it also doesn't mean that ISO is right.

True, but it also doesn't mean they are wrong.

>> There is such a thing as certification. It is impossible to do if the
>> spec is not fixed, including references.
>
> What if there is a bug in the spec? or a test is wrong and it's fixed  
> after someone has claimed compliance?

This happens all the time, and has done for decades.

What happens is that by dating explicit versions, you can deal with a known
state. Things that conform to the earlier, buggy version, or to the
later, better (but almost certainly not bug-free, as you seem to freely
acknowledge) version, have a set of known properties. This is helpful when
you are trying to include them as *parts* of an ecosystem.

>> What you are advocating is entirely counterproductive given the source
>> of the discussion (= a PAG): if the spec has undated references, you
>> cannot make sure it is royalty-free.
>
> Yes you can: the /latest/ always points to the latest REC. REC is  
> royalty free.

No, that is not guaranteed. Under the current process any substantive
change between last call and Recommendation means, as far as I can tell
from carefully reading what gets licensed, that the change cannot be
assumed to be covered by the patent license unless it happened to be in
the first public working draft. Even in that situation I can see a legal
argument that the technology is not covered by any license. And of course
*until* Recommendation, there is no license.

>> If the scope of one reference
>> changes, there is a new risk. It is not only a problem of conformance
>> testing.
>
> Not if the /latest/ always points to a REC (or a periodical snapshot  
> where IPR commitments to RF have been made).

If you always ensure you are pointing to a particular version of a
reference, over which you can guarantee that a commitment has been made,
then you are correct. I haven't heard that condition even suggested (let
alone specified in a practical manner) for any "living standard" before.

>> Your vision of "fluid" standards is completely unmanageable in practice.
>
> Yet, somehow, every browser vendor manages? Seems like an enigma.

Actually, I dispute that *any* browser vendor has achieved this. After
years of effort and refinement, most browsers have pretty good coverage of
CSS 2.1 - precisely because it stopped being a "living standard" and
became a stable target. By contrast, anyone who claims they conform to
HTML5 is either stupid, since that is a meaningless claim while the
definition of HTML5 is known to be in flux, or thinks we are stupid enough
to believe it. I don't believe any browser makes, nor could make without
being laughed at, such a claim.

What browser vendors do is implement pieces of what they think HTML5 will
be, changing them as the spec changes in an effort to converge on the
emergent patterns set by other browser developers and by content producers
with big enough market share to insist on people doing what they want.
They hope (forlornly in many cases) that the developers who relied on what
happened before will actually change their code, so the browser can
minimise the multiple compatibility paths it has to maintain in order to
work on the Web.

By comparison, conformance to XML in its various editions is a
well-defined state. It is true that for conformance to a given version of
XML, some things are uncertain. The uncertainty is sufficiently annoying
that people spend the money to fix the spec and make a new version, but it
is also generally sufficiently bounded that people can build
zillion-dollar businesses on top of the existing stack.

I think it makes a lot of sense to recognise that many pieces of the
platform need to be upgraded periodically until we decide to jettison
them, or they are so deeply encrusted into the infrastructure that
changing them is no longer feasible. But a model that provides fluid
changeover of dependencies actually implies a pretty high cost of
management, and I don't see that it is justified. A model that does it in
a more step-wise fashion is what we already have, and naturally
re-examining the particular conditions and mechanisms for changing the
steps is a sensible thing to do, but while I conclude from your premises
that we should be doing that, it seems to me that you conclude we should
simply abandon the idea of this crabwise march and let it all flow.

cheers

Chaals

-- 
Charles 'chaals' McCathieNevile  Opera Software, Standards Group
       je parle français -- hablo español -- jeg kan litt norsk
http://my.opera.com/chaals       Try Opera: http://www.opera.com
Received on Tuesday, 20 December 2011 17:49:48 GMT