RE: Andrew Layman and Don Box Analysis of XML Optimization Techniques

From: www-tag-request@w3.org [mailto:www-tag-request@w3.org] On Behalf Of
noah_mendelsohn@us.ibm.com

Len Bullard wrote:

>> Is this a disruptive technology?  Likely yes.
>> Incumbents never like those and do whatever they
>> can, from proposing overloaded requirements, humor
>> pieces designed by showmen to muddy discussions,
>> and FUD, to slow those down until they have
>> their own products ready.

>Well, I think we should tone down this whole discussion.  As I am partly 
>responsible for heating it up a bit, I apologize for that.  I do think the 
>above somewhat oversimplifies the reasons for my concern.

Possibly.  The humor pieces don't exactly give me comfort that the concerned 
parties take the customers seriously, but I'm guilty of that sort of 
thing too.  Still, we're on the company clock here, so a somber tone 
is needed. :-)

>I suspect you meant "disruptive" in the Clayton Christensen sense [1] of a 
>technology that percolates up from the bottom, is initially not fully 
>robust or general, but is compelling for certain applications.   Such 
>innovations often mature to overwhelm established technologies and along 
>with them, established business models of "incumbents".  Fair enough.

That's a good definition, although I am unsure of the maturity level of 
Fast Infoset (FI), except that it has been developed, benchmarked, 
and is on track for ISO standardization.  So it may already have 
passed those signposts.

>My concern is that Binary XML is disruptive in another less positive 
>sense.  Part of the value of XML is its nearly universal interoperability. 

Yes.  As an ancient markup wonk, I can see that.  On the other hand, as 
Jonathan Schwartz points out in his blog and at the innovation summit, 
the tension between interoperability and innovation is well understood. 
We have periods of interoperability punctuated by innovations that take 
some time to be absorbed before the interoperability peak is regained. 
This is part of the lifecycle of commoditization, which includes 
transparent pricing, product value differentiation, fragmented products 
that force the customer to become an IT department, simplified 
interfaces for complex systems, perpetual ubiquitous demand, network 
effects that tip markets to lowest common denominators, and customer 
interactions that drive those network effects.

Interoperability isn't a state; it's a phase.  It is good, but it must 
be balanced against an active ecosystem of purposes and solutions.

>XML data can be repurposed over and over again, sometimes for uses not 
>originally anticipated.  

If the binary is lossless, it is the information, the data, that is 
repurposed.  The binary is an encoding of that data.  The reusability of 
XML as defined lies in the tools we use to reuse the data.  XML is 
reusable.  YMMV for reusing the information within it.  We don't 
know how to measure that in all dimensions simultaneously (subject 
to quantum effects in the general Heisenbergian sense).
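The lossless-encoding point can be sketched in a few lines. Here gzip 
stands in for a binary XML encoding (a hypothetical stand-in only; Fast 
Infoset uses its own ISO-standardized format, not gzip): the text form 
and the binary form carry the identical information, and the data 
survives the round trip intact.

```python
import gzip
import xml.etree.ElementTree as ET

# The text-form XML document: this carries the information.
text_xml = '<order id="42"><item sku="A1">widget</item></order>'

# A stand-in "binary XML" encoding: gzip of the UTF-8 bytes.
# (Hypothetical illustration; real Fast Infoset is a different format.)
binary = gzip.compress(text_xml.encode("utf-8"))

# A consumer of the binary form recovers the identical document,
# so the information is repurposable either way.
recovered = gzip.decompress(binary).decode("utf-8")
assert recovered == text_xml

# The parsed data is the same regardless of encoding.
root = ET.fromstring(recovered)
print(root.get("id"))          # -> 42
print(root.find("item").text)  # -> widget
```

The point is only that the encoding and the information are separable: 
any lossless binary form preserves everything the text form carries.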

>While in principle one could re-release all the software 
>that's already out there to include new drivers for binary XML, in 
>practice there will for years be software that only understands the text 
>form.  Even if binary is successful, we will bear for the indefinite 
>future the cost of conversion between the two, e.g. when editing in Emacs 
>is desired.  So, there is a downside.

Yes.  That is the market at work.  We bear that cost for every innovation. 
We would bear that cost, and possibly worse, by taking TimBL's suggestion 
to simplify XML.  No change means no cost, but also no benefit.

>I'm not personally against Binary XML.  I do think the disruptions in the 
>2nd sense are sufficiently troublesome to those who benefit from XML today 
>that we should set the bar fairly high in justifying Binary XML. 

Let the customer decide what they can afford.  The race is to the swift. 
If they don't need it, it's just another 'dead parrot'.

>I'm just not yet convinced 
>that we know which use cases we can really address, and what factors in 
>space or time to expect in return for the investment.  I'd like to know 
>that before signing on.

We tried that kind of thinking when HTML and RSS first appeared.  The 
market handed our heads to us for the privilege.  It isn't a question of 
whether one exists or will be sold.  That's a done deal.  It is now, as 
I said, a question of choosing whether to create a W3C standard and what 
goes into it.  As Norm points out, ISO has moved on this, and perhaps 
that is sufficient for getting sanctioned paper, but it also means that 
those customers whose procurement guidance is to work only with W3C 
specs will have to write waivers, and will do so as needed.  So if 
additional benchmarking is needed, Sun has made what one needs to start 
it available as a contribution.  Applaud that and use it.

>And, FWIW, I think that any time we have an opportunity to introduce the 
>Navier-Stokes equations into the design of XML, we should leap at the 
>chance. 

I'm personally hoping to use von Neumann and Birkhoff's quantum logic 
(originally, math by Hermann Grassmann) to implement the mustUnderstand 
attribute.  It seems to me the math designed for extensibility 
(the Ausdehnungslehre) should work quite well for the Extensible Markup
Language.

len

Received on Thursday, 7 April 2005 17:46:40 UTC