
Validation (was Re: XHTML 2.0 - no interest in RDF/XML?)

From: Masayasu Ishikawa <mimasa@w3.org>
Date: Sat, 17 Aug 2002 17:58:36 +0900 (JST)
Message-Id: <20020817.175836.74752307.mimasa@w3.org>
To: www-html@w3.org

William F Hammond <hammond@csc.albany.edu> wrote:

> > effort to accommodate RDF/XML.  DTD is just plainly incapable of
> > handling it appropriately.
> Probably, yes, in this context, but I think slightly overstated as a
> general proposition in that either the attribute suite of an element
> or the pcdata content model of an element can be re-modeled using new
> element names.

It's unlikely to happen for RDF/XML; otherwise we would not have had
these years of discussion.  Even if it's not technically impossible,
there's no point in expecting ordinary people to manage such a hairy
corner case, and if it's that hard, most people just won't add metadata.
If we want to encourage people to add metadata, it ought to be simple
and must not require deep knowledge of the cracks between the technologies.
I believe it's the WG's responsibility to work out a technical solution,
and they should not transfer that burden to content providers.
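To make the difficulty concrete, here is a hypothetical sketch of RDF/XML
embedded in an XHTML head (the placement and content are illustrative, not
taken from any draft); the comment notes why a DTD cannot describe it:

```xml
<html xmlns="http://www.w3.org/1999/xhtml">
  <head>
    <title>Example</title>
    <!-- RDF/XML permits property elements drawn from arbitrary
         vocabularies, so a DTD cannot enumerate the allowed children
         of rdf:Description in advance. -->
    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
             xmlns:dc="http://purl.org/dc/elements/1.1/">
      <rdf:Description rdf:about="">
        <dc:creator>Jane Doe</dc:creator>
        <dc:date>2002-08-17</dc:date>
      </rdf:Description>
    </rdf:RDF>
  </head>
  <body><p>...</p></body>
</html>
```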

> > Validation is no longer a simple process,
> (Has it ever been simple?  I didn't think it was simple back in 1994
> when I was learning how to set up an early HTML validator.)

It's simple enough to set up an XML processor for validation.  A specialized
(X)HTML validation tool could provide additional guidance, but one is not
required for validation per XML 1.0 rules.

> The user community has several segments including (1) server-side tool
> writers, (2) content providers, (3) client-side tool writers, and
> (4) client-side users.
> We place no required burden on client-side users.
> But now just what is the "complex process" in the context of future
> For a given content format there is a specification, however it is
> defined, and client-side tool writers need to know that specification.
> But in the world of client-side XML, don't processors reject content
> at the first sign of error?

At the first sign of a well-formedness error, yes.  But non-validating XML
processors won't throw away invalid documents so long as those are
well-formed.
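A minimal sketch of the distinction, using Python's stdlib ElementTree
(a non-validating XML processor) purely as an illustration:

```python
import xml.etree.ElementTree as ET

# Well-formed but INVALID: the internal DTD subset says <doc> may only
# contain <title>, yet a <bogus> element appears.
invalid_but_well_formed = """<?xml version="1.0"?>
<!DOCTYPE doc [
  <!ELEMENT doc (title)>
  <!ELEMENT title (#PCDATA)>
]>
<doc><title>hello</title><bogus/></doc>"""

# A non-validating processor accepts the document without complaint.
root = ET.fromstring(invalid_but_well_formed)
print(root.tag)  # doc

# A well-formedness error (mismatched tags), by contrast, is fatal.
try:
    ET.fromstring("<doc><title>hello</doc>")
except ET.ParseError as e:
    print("rejected:", e)
```

A validating processor (e.g. one invoked with DTD validation enabled)
would reject the first document as well.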

> So the burden of validation rests on content providers with a derived
> burden on server-side tool writers, more specifically, content generation
> tool writers.
> In this context a rather complex process can take place reliably as
> long as it is broken down into manageable tasks handled by cooperating
> open standard (and I would also say open source) tools.

It is nice to have good tools, but I don't believe relying on
the availability of smart tools is a good design principle.  We
are designing a markup language for the masses.  We're more likely
to get good tools if we strive to keep simple things simple.
Complex things never become popular.

Let's keep XHTML2 simple, and hopefully less stupid.

Masayasu Ishikawa / mimasa@w3.org
W3C - World Wide Web Consortium
Received on Saturday, 17 August 2002 04:58:42 UTC
