Re: XHTML Applications and XML Processors [was Re: xhtml 2.0 noscript]

From: Bjoern Hoehrmann <derhoermi@gmx.net>
Date: Sat, 29 Jul 2006 20:48:02 +0200
To: mark.birbeck@x-port.net
Cc: www-html@w3.org
Message-ID: <bhanc2psc028qflgvmqruh7b8n8j8701lt@hive.bjoern.hoehrmann.de>

* Mark Birbeck wrote:
>I was using the term 'load' in a general sense--I don't think I talked
>about the onload event as such. To be more precise I'd see the process
>*conceptually* as follows:
>* the XML processor passes a complete DOM to the XHTML processor;

XML processors do not know about the "DOM"; they just report information
that a higher-level application can use to provide objects and their
interfaces to some yet higher-level application.
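The layering can be illustrated with Python's SAX interface (a sketch only; the TreeBuilder class and its tuple representation are invented for this example, not any particular DOM implementation):

```python
# A sketch of the layering: the XML processor only *reports*
# information via callbacks; it is this higher-level handler that
# turns the reports into objects.
import xml.sax
from xml.sax.handler import ContentHandler

class TreeBuilder(ContentHandler):
    """Higher-level application: builds (tag, children) tuples."""
    def __init__(self):
        super().__init__()
        self.stack = [("#document", [])]

    def startElement(self, name, attrs):
        node = (name, [])
        self.stack[-1][1].append(node)   # attach to current parent
        self.stack.append(node)          # descend

    def endElement(self, name):
        self.stack.pop()                 # ascend

    @property
    def root(self):
        return self.stack[0][1][0]

builder = TreeBuilder()
xml.sax.parseString(b"<html><head/><body/></html>", builder)
print(builder.root)  # ('html', [('head', []), ('body', [])])
```
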

>* the XHTML processor starts using this DOM:
>   * attaching event handlers from XML Events;
>   * running any inline scripts;
>   * loading any images and other resources;
>   * and so on;
>* the 'onload' event is then dispatched, right at the end.

You've picked a processing model and want the world to adapt to it.
That won't work; you have to negotiate a processing model with the
world.

>Of course it's easy enough to implement such a thing, but the point is
>that by doing so we've made the behaviour of a SAX-based and a
>non--SAX-based processor effectively the same. In short, I think we'd
>need to define things in this 'conceptual' way so that we get
>interoperability, even if implementers add optimisations.

Users want web applications to respond to their actions before the
applications have fully loaded, so you need to run scripts before the
document has been fully loaded. Those scripts will change the document,
and how they behave will depend on how much of the document has been
loaded. That implies you cannot make these two models behave in
exactly the same way, and therefore your model is not applicable in the
general case. Responsive web applications are more important than the
"interoperability" in edge cases that you desire.
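To make this concrete, here is a hypothetical sketch using Python's pull parser: the same "script" (here, just counting item elements) gives different answers depending on how much of the document had arrived when it ran. The document and chunk sizes are invented for the illustration.

```python
# Hypothetical illustration: a "script" that runs mid-load sees only
# the part of the document that has been parsed so far, so its result
# depends on where the network happened to pause.
from xml.etree.ElementTree import XMLPullParser

doc = b"<list><item/><item/><item/></list>"

def items_seen_after(first_chunk_size):
    parser = XMLPullParser(events=("end",))
    parser.feed(doc[:first_chunk_size])   # what has arrived so far
    seen = sum(1 for _, el in parser.read_events() if el.tag == "item")
    parser.feed(doc[first_chunk_size:])   # the rest arrives later
    return seen

print(items_seen_after(10))  # 0 -- no complete <item> parsed yet
print(items_seen_after(25))  # 2 -- two items already reported
```
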

>Let's put it the other way round; if I were to implement an XHTML
>viewer by first loading the XML into a DOM and validating it, before
>passing it through to a renderer, what is wrong with that? The problem
>is that if we allow a processor to *choose* to run some of the script
>before the document is completely available (conceptually), then we
>rule out this 'naive' approach.

Nothing is ruled out; the two approaches simply will not behave in
the same way. It's just like implementing progressive processing with
a 4k buffer, a 16k buffer, or a 2MB buffer: in each case you can
construct documents where the behavior differs. This is how most HTML
user agents work today, and few people have any problem with that.

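A rough sketch of the buffer-size point, assuming a processor that inspects its state after each buffer-sized chunk of input (the document and buffer sizes are invented):

```python
# Sketch of the buffer-size argument: a processor that checks its
# state after each buffer-sized chunk observes different intermediate
# states for 64-byte and 256-byte buffers, though the end result agrees.
from xml.etree.ElementTree import XMLPullParser

doc = b"<list>" + b"<item/>" * 40 + b"</list>"

def snapshots(buffer_size):
    """Cumulative count of parsed item elements after each chunk."""
    parser = XMLPullParser(events=("end",))
    seen, states = 0, []
    for i in range(0, len(doc), buffer_size):
        parser.feed(doc[i:i + buffer_size])
        seen += sum(1 for _, el in parser.read_events() if el.tag == "item")
        states.append(seen)
    return states

print(snapshots(64))   # [8, 17, 26, 35, 40]
print(snapshots(256))  # [35, 40]
```
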
Björn Höhrmann · mailto:bjoern@hoehrmann.de · http://bjoern.hoehrmann.de
Weinh. Str. 22 · Telefon: +49(0)621/4309674 · http://www.bjoernsworld.de
68309 Mannheim · PGP Pub. KeyID: 0xA4357E78 · http://www.websitedev.de/ 
Received on Saturday, 29 July 2006 18:48:19 UTC
