- From: Alex Rousskov <rousskov@measurement-factory.com>
- Date: Mon, 13 Jan 2003 16:32:04 -0700 (MST)
- To: W3C Evangelist <public-evangelist@w3.org>
On 13 Jan 2003, Chris Hubick wrote:

> On Mon, 2003-01-13 at 15:40, Alex Rousskov wrote:
> > Indeed, W3C is having a hard time delivering on its promise of a
> > "future proof" markup
>
> I think they are doing just fine. All the existing documents that
> have been marked up semantically can be easily/automatically
> translated to newer standards (either by rewriting the document
> itself, or within the user agent). With documents using non-semantic
> markup, font tags, etc, it would be vastly more difficult to
> understand the meaning of these in terms of "new" semantics.

You are making an implicit assumption that a "document" is something
static that can be easily modified. I bet that most moderately complex
sites today are not implemented as static files but are generated by
scripts/programs. Thus, in most interesting cases we are talking about
modifying _programs_ (server- or client-side), not "documents". That
kind of modification is not easy to automate at all, and "tools" will
be of little help.

You are also probably making an implicit assumption that the majority
of existing documents have been marked up semantically (otherwise, it
would make little sense to justify W3C's actions by pointing to those
documents). I doubt that is true.

Based on my personal experience, it takes a lot of resources to keep
up to date with W3C, and it should not be this way if W3C cares about
small, resource-limited content "publishers".

What does "future proof" stand for? IMO, it means that if I use _any_
standard published by W3C, I should not be pummeled for not switching
to newer standards. I may miss out on some cool new features, of
course, but I should not receive complaints that my site is not
"standard" or not "compliant". So far, the discussions on this list
and other public comments from Web authors make me think that W3C
markup is not "future proof". Moreover, there is no sign of the markup
becoming more future proof with every new standard published (see Mark
Pilgrim's comments, for example).

> ... In the long term I think we will have a cleaner and more rich
> base of documents to work with, and will look back and be glad we
> made the transition now.

I agree that we should try to look 10-25 years ahead. Unfortunately, I
do not see a _transition_ path. Most changes require a _switch_ rather
than a smooth transition (unless you are willing to support N standards
at once, which is even more overhead for the small guys). This factor
alone can jeopardize our best intentions and create a gap that 25 years
of development will not cure. Think Cobol and Fortran, for example.

We should not just imagine how things should work and then
write/enforce a standard describing our imaginary ideal! For example,
most people would agree that communism is great in its final "everybody
is free and happy" stage, but we all know what happens if you try to
jump into that stage without a proper transition. The question is
whether there is a transition path that ends in Markup Heaven, and what
that path is.

Thanks,

Alex.

--
                            | HTTP performance - Web Polygraph benchmark
www.measurement-factory.com | HTTP compliance+ - Co-Advisor test suite
                            | all of the above - PolyBox appliance
Received on Monday, 13 January 2003 18:32:06 UTC