Is W3C Technology Fragmented or Unified?

Hi,

Can someone please explain to me, briefly and concisely, the W3C vision of a
unified web... if there is such a thing?  I am approaching this specifically
from a developer's point of view, trying to implement WAI/W3C standards best
practice.

I ask because I would expect the W3C to have an aim, or charter, not to
fragment the web.  But I do not understand how we are supposed to build
systems, such as content management systems, that can deliver content in

HTML (and all its variants... XHTML-Print, XHTML Basic and the rest of them)
XML
VoiceXML
RDF/OWL
etc., etc.

I just want to know how the W3C thinks developers should address the need to
deliver content according to all these requirements.  As far as I can see,
only solutions like Apache Cocoon offer any hope of addressing these issues,
or anything else that generates content from an XML base and runs it through
transformers to negotiate content with the various target devices.

Is the W3C fragmenting the web with all these technologies, or are we missing
something?  Why can't best-practice XHTML/CSS/WAI address most of these
issues?  Someone please help me out here, because I feel things are getting
out of hand and will become unmanageable if the technology evolves without
good sense and purpose.  There is also a sense of the wheel being reinvented
time and time again.

One of the big mistakes I see in many forms of software architecture is that
we often develop a new type of interface, protocol, or API when key
components of the core architecture should have been addressed instead.  That
leads to fragmentation.

Regards
Geoff Deering

Received on Tuesday, 24 August 2004 21:37:24 UTC