- From: Al Gilman <Alfred.S.Gilman@IEEE.org>
- Date: Fri, 12 Nov 2004 11:58:22 -0500
- To: "David Lieberman AWDSF" <david@awdsf.com>, <www-html@w3.org>
At 8:29 AM -0800 11/12/04, David Lieberman AWDSF wrote:

>I'm trying to make sure that my sites will be backwards compatible
>for old browsers if I use XHTML 2.
>
>From what I understand, I can use XSLT to produce a re-worked XHTML 1
>rendering of the pages I'm using. Then I can use browser sniffing or
>HTTP Accept headers to make sure that older browsers get my normal
>XHTML and XHTML 2-compliant browsers are served the newer XHTML 2
>version of my code.
>
>Would this work?

It could work. You would probably have to follow some house-style
rules in your source so it transforms OK (a sketch of such a
transform follows at the end of this message).

You might actually do better picking up some intentionally
cross-device authoring system, such as those made by the members of
the Device Independence Working Group, and using an XML content
repository that incorporates elements of DISelect [1] as well as
XHTML 2.0 in your source form. Then write your targeted export
writers to XHTML 2.0 and down-levels as you wish. That's not free,
but the export writers cost less and less as you invest in the
quality of the source form.

Al

[1] http://www.w3.org/TR/cselection/

>Would it even be worth my time?

That's a question only you can answer. There is non-zero gain,
non-zero pain, and a feasible path to invest some pain in a way that
will yield some gain. Whether it's worth it has to do with what you
value.

Al

>Thanks,
>David Lieberman
>http://www.awdsf.com
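
A minimal sketch of the XSLT downgrade discussed above, assuming the
XHTML 2.0 namespace URI of the 2004 Working Draft (the URI changed
between drafts, so check the one you actually author against). It maps
two XHTML 2.0 structuring features back to XHTML 1.0: <section> becomes
a classed <div>, and the generic <h> heading takes its level from
section nesting depth; everything else is copied into the XHTML 1.0
namespace unchanged.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Sketch: downgrade a few XHTML 2.0 constructs to XHTML 1.0
         Strict. The x2 namespace URI follows the 2004 Working Draft
         and is an assumption; verify against your source documents. -->
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
        xmlns:x2="http://www.w3.org/2002/06/xhtml2/"
        xmlns="http://www.w3.org/1999/xhtml"
        exclude-result-prefixes="x2">

      <xsl:output method="xml" indent="yes"
          doctype-public="-//W3C//DTD XHTML 1.0 Strict//EN"
          doctype-system="http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"/>

      <!-- Default rule: re-create each element in the XHTML 1.0
           namespace, keeping its attributes and content. -->
      <xsl:template match="x2:*">
        <xsl:element name="{local-name()}">
          <xsl:apply-templates select="@*|node()"/>
        </xsl:element>
      </xsl:template>

      <xsl:template match="@*|text()|comment()">
        <xsl:copy/>
      </xsl:template>

      <!-- <section> has no XHTML 1.0 counterpart; map it to <div>. -->
      <xsl:template match="x2:section">
        <div class="section">
          <xsl:apply-templates select="@*|node()"/>
        </div>
      </xsl:template>

      <!-- The generic <h> heading takes its level from section depth.
           Nesting deeper than six levels is not handled here. -->
      <xsl:template match="x2:h">
        <xsl:element name="h{count(ancestor::x2:section) + 1}">
          <xsl:apply-templates select="@*|node()"/>
        </xsl:element>
      </xsl:template>

    </xsl:stylesheet>

The transformed output would then be served to user agents whose
Accept header does not advertise XHTML 2.0 support, with the source
form going to everyone else, as the original question proposes.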
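And a sketch of the DISelect-annotated source form Al suggests: one
repository document carrying both renderings, with sel:select picking
between them at adaptation time. The sel:* element names follow the
DISelect Working Draft [1]; the dc:supports-xhtml2() function is a
hypothetical stand-in for a real delivery-context test, which the
draft leaves to companion specifications.

    <!-- Sketch of a DISelect-annotated source document. The sel
         namespace URI follows the DISelect Working Draft [1];
         dc:supports-xhtml2() is a made-up placeholder expression. -->
    <html xmlns="http://www.w3.org/2002/06/xhtml2/"
          xmlns:sel="http://www.w3.org/2004/06/diselect">
      <body>
        <sel:select>
          <sel:when expr="dc:supports-xhtml2()">
            <!-- Rich structure for XHTML 2.0 processors. -->
            <section>
              <h>Welcome</h>
              <p>Served to XHTML 2.0 user agents.</p>
            </section>
          </sel:when>
          <sel:otherwise>
            <!-- Flattened markup for the down-level export path. -->
            <h1>Welcome</h1>
            <p>Served to legacy browsers.</p>
          </sel:otherwise>
        </sel:select>
      </body>
    </html>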
Received on Friday, 12 November 2004 19:30:35 UTC