Orthogonality is not the answer.

During the Plenary Day proceedings on Wednesday, Mark Birbeck alluded to "late binding": defer the binding to concrete details when the processors downstream have the capability to bind what you leave unbound.

This captures most accessibility concerns in a nutshell: "Don't eliminate the possibility of client-side adaptation of the user experience."
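To make that concrete, here is a minimal sketch -- my own illustration in Python, with made-up control and capability names, not anything from an actual spec -- of a document that leaves its interaction abstract and a client that binds it late, according to its own capabilities:

# Hypothetical sketch: the authored document records intent
# ("select one of these choices") without naming a widget; each
# client binds that intent to a concrete rendering it can deliver.

ABSTRACT_DOCUMENT = {
    "control": "select-one",   # authored intent; no widget named
    "label": "Shipping method",
    "choices": ["Standard", "Express", "Pickup"],
}

def bind_control(control, client_capabilities):
    """Late binding: pick a concrete rendering the client supports."""
    if {"pointer", "screen"} <= client_capabilities:
        return ("radio-group", control["choices"])
    if "speech-output" in client_capabilities:
        return ("spoken-menu", control["choices"])
    # Fallback that any text-only client can present.
    return ("numbered-text-menu", control["choices"])

# The same source binds differently on a desktop browser and a voice client.
print(bind_control(ABSTRACT_DOCUMENT, {"pointer", "screen"}))
print(bind_control(ABSTRACT_DOCUMENT, {"speech-output"}))

The point is not the particular widgets; it is that the author's markup leaves the last binding open, so client-side adaptation is still possible.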

It is also a far more accurate model of what can be done with Web media than "separation of content and presentation."  Once the presentation has been made orthogonal to the content, the information that couples them has been lost.  That's basic information theory.  So a progressive-refinement or progressive-concretion model is appropriate and an orthogonal-subspaces model is not.
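A hedged way to state the information-theory point (my framing, not anything presented this week): treat the content and its intended presentation as jointly distributed variables C and P.  The joint entropy decomposes as

\[
H(C, P) \;=\; H(C) + H(P) - I(C; P)
\]

A model that treats C and P as orthogonal (independent) components carries only H(C) and H(P); the coupling term I(C;P) -- precisely the knowledge of how this content wants to be presented -- is the information that is lost, and when I(C;P) > 0 it cannot be recovered from the separated parts.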

Back when the Web was started and the name of the game was consolidating resources from Gopher, FTP, etc., 'orthogonality' and 'unicity of the URI namespace' were suitable battle cries.

Nowadays the challenge is integrating multi-tier service-delivery architectures, not capturing the Gopher market share for HTTP.

We need to stop talking about 'orthogonality', which is not true of web operations at present, and is not "within the capture radius" -- that is, readily achievable -- either.

In our architecture we should articulate constraints on practice that are feasible and effective.  'Orthogonality' is not going to be effective because it is not going to be a fact.

The history of the last ten years is that the ideas in this document ceased to describe Web practice, and the Web grew explosively anyway.  We need a better theory.

We also need a better theory to avoid continuity-of-operation trainwrecks like the one that Noah laid out in his Lightning Talk.  These trainwrecks have been all too frequent over the last ten years.  It's time we started doing something more effective about this situation.

The kind of approach that David Orchard alluded to -- formally tracking the degree and nature of change between revisions of specifications -- could be used to maintain much more continuity of interoperability in the face of change.
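A minimal sketch of what formally tracking that change could look like -- the feature names and the compatibility rule below are my own illustration, not anything David actually proposed:

# Hypothetical sketch: model each revision of a specification as the
# set of features it defines, and measure the change between revisions.

REV_1_0 = {"core-syntax", "inline-style", "basic-forms"}
REV_1_1 = {"core-syntax", "inline-style", "basic-forms", "events"}
REV_2_0 = {"core-syntax", "events", "advanced-forms"}

def measure_change(old, new):
    """Degree and nature of the change: what was added, what was removed."""
    return {"added": new - old, "removed": old - new}

def backward_compatible(old, new):
    """A purely additive revision keeps old content processable."""
    return not measure_change(old, new)["removed"]

print(measure_change(REV_1_0, REV_1_1))       # {'added': {'events'}, 'removed': set()}
print(backward_compatible(REV_1_0, REV_1_1))  # True: additive revision
print(backward_compatible(REV_1_1, REV_2_0))  # False: features were dropped

Even a record this crude tells a downstream processor whether content authored to one revision is still within reach of another.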

I don't work closely enough with XSD to say specifically that Henry was wrong, but I wouldn't say his remarks to the effect of "damn the migration, define a schema language" should be taken without question, either.

The main point is that there are good classical models, such as the dominated convergence theorem in Real Analysis, that show how dominance relations (the mathematical class to which David Orchard's "measured partial conformance" belongs) can be built into a system that sustains continuity of operation in the presence of change and local optimizations.
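For the record, the classical statement (standard real analysis, nothing Web-specific):

\[
f_n \to f \ \text{pointwise}, \qquad |f_n| \le g \ \text{for all } n, \qquad \int g < \infty
\quad\Longrightarrow\quad
\lim_{n \to \infty} \int f_n \;=\; \int f .
\]

The dominance condition |f_n| <= g is what licenses exchanging the limit with the integral.  The analogy I am drawing -- and it is only an analogy -- is that a fixed dominating bound on how far any revision may deviate is what could license continuity of operation across a whole sequence of revisions.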

This kind of systematic organization based on partial orderings can describe both specification migration and progressive concretion of the web experience.  We haven't done the astronomy-to-microscopy layers of binding to show how this applies in the Web case.  But at least the game is worth the candle.
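As one sketch of what organization around a partial ordering could look like -- the states and field names below are invented for illustration:

# Hypothetical sketch: progressive concretion as a partial ordering.
# A state maps open decisions to values; None means "still undecided".
# y refines x iff y agrees with x on every decision x has already made.

def refines(x, y):
    """Partial order: y is at least as concrete as x."""
    return all(v is None or y.get(k) == v for k, v in x.items())

abstract = {"control": "select-one", "widget": None,          "font": None}
styled   = {"control": "select-one", "widget": None,          "font": "serif"}
bound    = {"control": "select-one", "widget": "radio-group", "font": "serif"}

print(refines(abstract, styled), refines(styled, bound))  # True True
print(refines(bound, styled))                             # False: cannot un-decide

The same ordering can be read both ways: along the delivery path it is progressive concretion of the experience, and between revisions of a specification it is the kind of dominance relation discussed above.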

Orthogonal decomposition is simply not going to be available in the domain of web media.  Human perceptual response is too gestalt.  The optimization factors for web media couple the markup aspects without fail.  The progressive concretization of the web experience is too critical a piece of the web architecture of tomorrow for us to sustain the illusion that the web architecture can be built on a Gram-Schmidt decomposition of the Web domain.  Forget about it.  Let's look at what actually describes where we can get -- as represented by the excellent work of XForms, MMI, DI, et al., as displayed this week -- and frame an architectural document that has a prayer of describing the problem domain.

Al
