Re: Spec organizations and prioritization

On Thu, 22 Mar 2012 20:46:10 +0100, Marcos Caceres <w3c@marcosc.com> wrote:

>
>
> On Thursday, 22 March 2012 at 19:15, Jeff Jaffe wrote:
>
>> > My money is still on 2022 :)
>>
>> While you jest; many a truth is said in jest so I need to respond.
>>
>> We as a developer community owe it to our stakeholders to get a stable
>> HTML 5 to them.
>
> Stability of specification is not the issue here. The spec may be rock  
> solid today, but if no one can pass the test suite, or a test suite does  
> not exist, then the spec is not going anywhere on the REC track. Consider  
> also that HTML5 sits on top of two specs under development: WebIDL and  
> DOM4. Those specs will need to progress quickly, and also require test  
> suites, etc… then they too have dependencies, and so on.

I think that's a bit backwards. HTML5 was stabilising and being  
implemented long before the DOM4 spec started to appear. DOM4 and WebIDL  
are handy and helpful, but HTML5's dependency on them is voluntary - it  
uses them because they are better, not because they are absolutely  
indispensable.

Whether you use numbered versions released step-wise, or a "living  
standard" model where whatever is in there today is the only acceptable  
truth for all eternity (unless the spec changes tomorrow, in which case  
*that* is the only ...), if you get something wrong, you can fix it.  
Whether you do that in a single monster "the web platform" spec, in  
individual specs for each HTML element and each API, or in some middle  
ground, is more a question of convenience - for editors, implementors of  
all kinds of tools (if CMS developers don't know what browsers do and  
don't think it matters, then the browsers continue to reverse-engineer  
what the CMS does, and we return to the *beginning* of HTML5), educators,  
and content producers.

The W3C Process is meant to enable us to make sensible and effective  
decisions so we can most efficiently choose the compromises needed. Often  
it does. The question here is how we deal with the cases where it doesn't,  
in such a way that we don't end up worse off than we started.

> Let me cite from the WHATWG FAQ:
>
> "For a spec to become a REC today, it requires two 100% complete and  
> fully interoperable implementations, which is proven by each  
> successfully passing literally thousands of test cases (20,000 tests for  
> the whole spec would probably be a conservative estimate).

Actually, while it is generally understood that this is what you should  
have, I don't know where the hard requirement is written.

An idea that has been floated, in line with the "snapshots + living  
standard" approach, is to allow the development of a REC which defines  
the stable stuff, and acknowledges that edge cases are still waiting to  
be sorted out.

That would give a stable explanation of how to handle paragraphs, lists,  
headings, tables, images, and a bunch of other stuff that is much more  
developed than the HTML 4.01 version, plus a truer picture of where there  
still be dragons - without markedly interfering with the process of  
slaying those dragons.

...
> Hence, WHATWG HTML, at least, is not concerned with reaching a status.

Right. That is one of the reasons so many key stakeholders were so very  
happy when W3C started working on HTML5 - because they *are* concerned  
with a status.

>> HTML is a living technology. So there certainly needs
>> to be continued enhancement which I assume we will call HTML.next or 5.1
>> or 6. But it would be irresponsible not to provide something until 2022.

> It would be more irresponsible to do another HTML4.01 - or to violate  
> the W3C process to meet some arbitrary date.

Hmm. The suggestion I refer to above doesn't, as far as I can tell,  
violate W3C process. And at worst, it can be used to force a publication  
on some arbitrary date (much as it is equally possible to abuse the normal  
process to make it very difficult for working groups to publish - a tactic  
which has been used in practice to then point to them and say "those guys  
are too slow").

I think what you mean is that we should not stop working towards very high  
quality end products - and if so I quite agree.

> We have the Process Document in place for a reason

Yes, it is part of the agreement between the members and the W3C staff on  
how W3C will work. Its goal is to ensure that members - and, coincidentally,  
other stakeholders, since W3C's broad consensus is that it should serve the  
needs of the world and not just the people who actually make the  
organisation work - have a clear set of expectations about how W3C makes  
standards, ensuring transparency in the decisions on quality, speed, and  
completeness.

> (and having _at least_ two independent implementations passing tests for
> every feature is what makes W3C RECs of high quality).

Not at all. It is a common indicator, but having a low-quality spec with  
two implementations is easy to achieve, and having a high-quality spec  
without getting two implementations is also quite feasible.

The two-implementations convention relies on the idea that if you have two  
*willing* independent implementors, and they independently come up with  
the same result when each sits in a corner with the spec and codes it up,  
that's a good sign that it works well.

HTML5 has regularly broken that model by insisting on talking to all the  
implementors sufficiently that they are not really independently coding -  
just building on a different codebase. Nonetheless, there is a lot about  
the definition of HTML5 that is of an extremely high quality.

cheers

-- 
Charles 'chaals' McCathieNevile  Opera Software, Standards Group
     je parle français -- hablo español -- jeg kan litt norsk
http://my.opera.com/chaals       Try Opera: http://www.opera.com

Received on Friday, 23 March 2012 10:18:19 UTC