
Re: ISSUE-41/ACTION-97 decentralized-extensibility

From: Jonas Sicking <jonas@sicking.cc>
Date: Thu, 15 Oct 2009 15:49:27 -0700
Message-ID: <63df84f0910151549h1890bc10mbf4a02a4018431cb@mail.gmail.com>
To: Tony Ross <tross@microsoft.com>, Adrian Bateman <adrianba@microsoft.com>
Cc: "public-html@w3.org" <public-html@w3.org>
On Tue, Oct 13, 2009 at 9:57 PM, Tony Ross <tross@microsoft.com> wrote:
> Using Namespaces:
> Beyond allowing extended markup to be valid within HTML documents, a couple of other motivations contribute to the desire to utilize namespaces as a solution.
> The first of these is greater consistency with XML-based documents. Ideally the experience here would be as close to that experienced in XHTML as possible. This is particularly relevant with the introduction of SVG and MathML into HTML, since we can fully expect content to be directly pasted in from these document types. Without namespaces, pieces of content that aren't native to SVG or MathML won't behave as expected when accessed from script.

I will note that HTML has been wildly more successful than XML when it
comes to web pages, so following XML isn't obviously the right thing
to do.

> The second motivation is to allow developers to quickly target groups of related extensions without introducing a host of new APIs. Thus a developer can now use getElementsByTagNameNS or CSS namespace selectors to target large swaths of extended content. This ties in even further with the first motivation since this matches the experience a developer would have in XHTML.

I would actually say that this is even more true for a solution like
prefix-based naming, e.g. <example_com_myelement>. That is, using
getElementsByTagName and namespace-less selectors is even more
familiar to developers than their namespaced counterparts.
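
To make the comparison concrete, here is a browser-console sketch of
the two lookup styles (the namespace URI and element names are made up
for illustration; this is not runnable outside a browser):

```javascript
// Namespace-based extension, XHTML-style markup such as:
//   <ext:myelement xmlns:ext="http://example.com/ext">...</ext:myelement>
// is targeted with the namespace-aware API:
const nsElements = document.getElementsByTagNameNS(
  'http://example.com/ext', 'myelement');

// Prefix-based naming, markup such as:
//   <example_com_myelement>...</example_com_myelement>
// works with the familiar namespace-less API:
const prefixedElements =
  document.getElementsByTagName('example_com_myelement');

// In CSS, the namespaced form needs an @namespace rule in the stylesheet:
//   @namespace ext url("http://example.com/ext");
//   ext|myelement { ... }
// while the prefix-based form works with a plain type selector:
const viaSelector = document.querySelectorAll('example_com_myelement');
```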

> Compatibility:
> Many have expressed the opinion that the proposal as stated may or will break the web. I agree that this outcome is a possibility. Rather than rejecting the proposal outright, however, I would prefer to discuss how it can be tweaked to reduce such risk. One possible approach I can see is to scale back the base proposal to be even more like what IE does today.
> I would also like to come to some consensus on what the tolerance for breakage is. One page? 100 pages? 10,000 pages? Of the billions of pages on the web, certainly any change will break some of them.

I think it's very hard to put an exact number on this, or indeed to
measure an exact number. I do definitely agree that some breakage is
acceptable, though. I'm personally often in the camp that considers
breakage more acceptable than others do. My strategy is generally to
deploy a desired change in alpha and beta releases and see if people
complain.

My experience has actually been that Microsoft has been more
conservative here, though I'm very happy if that is not the case (or
no longer is the case). Is Microsoft OK with this breakage in the
default "compatibility mode"? Or only if, for example, the document
uses the doctype specified by HTML5, i.e. <!DOCTYPE html>?

/ Jonas

PS. If you are OK with breakage, can we all please remove the ability
to set document.domain as well as the WindowProxy gunk? ;) (Yes, I'm
unfortunately mostly kidding.)
Received on Thursday, 15 October 2009 22:50:23 UTC
