
Re: review process [was: identify...]

From: Tim Berners-Lee <timbl@w3.org>
Date: Thu, 17 Feb 2000 22:57:48 -0500
Message-ID: <043201bf79c4$5212d4e0$a60a1712@col.w3.org>
To: "Murray Altheim" <altheim@eng.sun.com>
Cc: <www-html@w3.org>


>> I believe it refers to consensus among whatever community is asked
>> to review a spec; in the case of last call, that's the whole world.
>
>To which I have repeatedly responded, quite correctly:
>> > [...] The W3C does *not* seek public consensus on its
>> > specifications, for better or worse.

Not quite correctly. The W3C does seek as wide a consensus as it can.
The chairs and Advisory Board have frequently thought about how to make
the process more open and still effective, and the process has become
progressively more open.

Last call in particular is the time when public consensus is sought:
http://www.w3.org/Consortium/Process/Process-19991111/tr.html#last-call

"[...] Associated activities
The Working Group solicits and responds to review and comments from W3C
Working and Coordination Groups and external sources.
 [..]
 External feedback is also encouraged. [...]
 Once the last call period has ended, all issues raised during the last call
period resolved, [...]"

This means that a working group can't go on to CR or PR with an outstanding
problem, whoever introduced it.


>> The W3C does seek consensus for its specifications, including public
>> input.
>> It does also have ways of ensuring that the process terminates and is not
>> attacked by "denial of service" attacks for example.
>
>Dan has been making the claim that the public has the same right to
>consensus-building as W3C members, which is patently false.

Well, he didn't put it quite like that. But you are right that W3C members
have earlier access to the specs, and the right, for example, to advise on
how W3C resources are spent, and the right to review a Proposed
Recommendation.

However, at last call, the public has the right to review the spec.

During CR, anyone who has seriously implemented a spec has valuable
information.

The process document isn't a game to see how to restrict the input to a
spec. It is a game to try to get as much high-quality input as possible
and still complete.

>The W3C
>is unlike the IETF in that while it may solicit public input (as you
>say) it is under no obligation to form public consensus on issues,
>as I have said, for better or worse.

By explicitly putting that in the process, it has put itself under that
obligation.  This is partly its duty to behave well as a consortium among
many peer consortia.

>Worse in that consensus is only
>among vendor members (although this is comprised of many experts from
>around the world), but better in that the process is much quicker,
>as both you as Director and the W3C WG chairs can simply declare
>'consensus' at whatever point you deem appropriate, regardless of
>true consensus.

There is currently a lot of trust that neither I as Director nor any chair
will abuse the role of judging consensus.  The Advisory Board is working on
ways to add appeals to the process so that the judgment of an individual
cannot in the end alone derail the process.

I hope that you do not feel that consensus has been declared when it has not
in reality been present.

> It's not as egalitarian as the IETF, and pragmatically
>this has both its benefits and its drawbacks.


To make a good comparison between the organizations one has to look at
who has the power to make decisions. In the case of the IETF, the power
to overrule a working group is given to the Area Directors. Every
organization needs some element of executive power, with its checks and
balances.

>Dan wrote [0121]:
>> As with anybody else, the WG's obligation to me is to
>>       -- convince me to withdraw
>>       -- accept my suggestion, or
>>       -- escalate the issue
>
>Dan is (a) pretending to be a member of the public ('as with anybody
>else') and not a W3C staff member, and (b) implying that any member of
>the public can stand in the way of a specification's forward movement by
>simply making a comment and refusing to budge, or filibustering.

Filibustering can be ruled out. The review process has to be able to
decide what is a technically valid comment and what is a political one.
To first order that is a judgement matter too, but we have the constraint
that when a minority view has been overruled and not retracted, it must be
taken to the highest level.

You have accused Dan of trying to simply delay the specification. It is
better to review the content of the comment, which weighs more than its
source.

> He
>claims he must be convinced to withdraw, the WG must accept his position,
>or the issue must be escalated. He forgets the fourth option: the issue
>is discussed within the WG and proper fora within the W3C, and a decision
>is made. The HTML WG has already discussed this issue ad nauseum and
>believes the current status is the best course.


If that option were always open to a working group, then we would have
no consistency of architecture across multiple working groups.
It is necessary that you look at this comment with an open mind.
That is what open review requires.  In fact, in this case, we are
talking about a clash of architectural decisions from different worlds:
FPIs from SGML and URIs from the web.  It is worth discussing.
________________________________________________________

The technical point

>We (a) wish to continue using public identifiers for XHTML DTDs,

Clearly.  This has been SGML practice.  However, the alternative is
to look at the way the rest of the web architecture has worked.

>and
>(b) believe that since the W3C has not endorsed use of SGML Open catalogs


That is not surprising. A central catalog is the equivalent of the
"/etc/hosts" file which the Internet used to relate names to IP addresses
before DNS was designed to provide that function. DNS is now the Internet
community's established distributed system for name lookup. It is that
which allows HTTP names to be looked up in real time. It has associated
with it a large social system to ensure its stability which, while a
victim of its success and not untroubled, is much more advanced and
developed than anything else.

>nor provided an alternative indirection mechanism,

Many alternative redirection mechanisms exist. HTTP provides one.
Many local cache systems allow software to operate on local data
while using its definitive HTTP name.  HTTP provides proxy operation
to allow this, and three forms of redirect in the protocol.  On most web
servers a one-line configuration entry is all that is needed to establish
such a redirection.
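As an illustration of how small that configuration is (assuming an
Apache-style server; the paths and target here are hypothetical, not an
actual W3C arrangement):

```
# Send requests for the old DTD location to its definitive URI
Redirect permanent /DTD/xhtml1-strict.dtd http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd
```

One such line per relocated resource is all the indirection most sites need.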

> use of absolute URLs (let's be clear: these are not URNs

That is not clear. In fact the difference between URL and URN is largely
whether you want to praise or insult an identifier's longevity. That is why
I use the term URI to cover all such things.
http://www.w3.org/DesignIssues/NameMyth

The persistence of a name depends on the contractual obligations
and commitment of the organizations involved.

If the group felt that the ICANN-rooted system which currently provides
redirection for two billion web pages is not going to have the
organizational support to match the SGML Open catalog, then the
architecturally correct thing would be to define a new URI prefix called
"fpi:" and map the existing FPI syntax onto standard URI syntax. Then the
fpi space would be usable by all those systems which needed whatever it
provided and other systems did not.  It is a fundamental point of web
architecture that the design of the namespace and of data formats are
decoupled. Therefore, while XML brings a legacy need to be able to hold
FPIs, to use anything other than URIs in the long term is architecturally
anomalous.
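A minimal sketch of such a mapping, purely as an illustration: the "fpi:"
scheme is the hypothetical prefix proposed above (not a registered scheme),
and the encoding rules here are my own assumption, not a defined standard.

```python
from urllib.parse import quote

def fpi_to_uri(fpi: str) -> str:
    """Map a formal public identifier onto a hypothetical fpi: URI.

    An FPI such as "-//W3C//DTD XHTML 1.0 Strict//EN" is split on its
    "//" field separators; each field is percent-encoded and the fields
    are rejoined as URI path segments.
    """
    fields = fpi.split("//")
    return "fpi:" + "/".join(quote(f, safe="") for f in fields)

print(fpi_to_uri("-//W3C//DTD XHTML 1.0 Strict//EN"))
```

The point of the exercise is only that FPI syntax fits mechanically into
URI syntax, so existing FPIs need not be abandoned to adopt URIs.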

> within DTDs with no allowance for
>modification to local needs is problematic and would cause all XHTML
>documents to attempt to locate the XHTML DTDs on the W3C site,

Not at all.  You have refused to view a URI as a name. Try it. Then realize
that the local system can be organized to dereference it according to any
system it cares to use.  However, it is the responsibility of that system
to ensure the integrity of the dereferencing: that when the URI is
dereferenced you do indeed still get the DTD in appendix B.   (It could of
course do this by copying appendix B.)
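A sketch of that arrangement: the URI remains the canonical name, and a
local catalog decides where the bytes actually come from. The dict and the
local file path below are hypothetical stand-ins for a real catalog.

```python
# Illustrative local catalog: canonical URI -> local copy of the resource.
LOCAL_CATALOG = {
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd":
        "/usr/local/share/xhtml/xhtml1-strict.dtd",
}

def dereference(uri: str) -> str:
    """Return the location to read for a URI, preferring a local copy.

    Unknown URIs fall through to their network name, so the URI still
    works as a name everywhere; only the dereferencing is local.
    """
    return LOCAL_CATALOG.get(uri, uri)

print(dereference("http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"))
```

The integrity requirement above then amounts to keeping the local copy
identical to what the canonical URI would return.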

> which
>might very well cause a 'denial-of-service' problem,

(To the same extent as the SGML catalog may get worn out by frequent use. :)

>and more importantly
>also disallows the real users of the DTD to use them effectively in local
>environs.

Nothing is disallowed in that sense. Keeping a local copy -- using a URI
as you would an FPI -- is quite reasonable.

>The HTML WG has rightly decided to use relative URLs for system
>identifiers, and allowed these to be altered to suit local environments.


On the web, this doesn't work, as there is no local environment.  SGML was
not built for the web, so the assumptions of SGML tools may typically be
those of the normal batch processing environment, in which a catalog
established the relationship between virtual and real datasets before
execution. However, a web document stands on its own: if it needs any
environment, it must point to it.

>We reserve the public identifier as the canonical identifier for the DTD,
>as has been done in every other HTML DTD and indeed in most SGML and XML
>DTDs I've ever seen.


That is true. And so the anomaly persists in the web architecture. With
namespaces and schemas it will dissolve: and that may be the way to go.
But in the meantime, to ensure that those systems which do work using URIs
will work, defining an absolute definitive URI in the DOCTYPE is worth
considering.
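Concretely, that would mean keeping the FPI as the public identifier while
giving an absolute URI as the system identifier, as the XHTML 1.0
Recommendation's Strict DOCTYPE does:

```
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
```

FPI-based systems use the first string; URI-based systems use the second.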

>If there were a completely functional URN system in place plus a W3C-
>endorsed indirection mechanism, I don't think we'd have a problem with
>simply using a URN system identifier. But this is not the case.


I think you will find HTTP to provide both.

>But this seems to be more an issue about process and the scope of
>required consensus, doesn't it?


I believe at the end of the day we are here to do good technology.
We have talked technology and process. Both are interesting and important.
But the technology is primary. That is what we are after.  If we can find a
way to make a solid, simple, scalable system for the next 500 years then we
have a duty to do that.


I have argued both of these.  In each case I have agreed with Dan Connolly.
This is not surprising: we work together, and he has been around the web
scene for a long time. (Actually he was around SGML a long time too.) But I
understand that you and many others are well versed in the needs for FPIs,
and I respect that, and try to take it into account.

I think that FPIs can probably be handled in a way which will be a
compromise allowing people on both sides to be happy.   I am worried about
other similar problems: not least the antagonism which some have shown to
schemas -- modularization not invented here.

There is a tradeoff between the effort it takes to work together
and the speed allowed by working alone.  I feel we all need to work together
more here.

Tim

>Murray
>
Received on Thursday, 17 February 2000 22:58:05 GMT
