- From: Christian Smith <csmith@barebones.com>
- Date: Thu, 29 Jun 2000 04:30:14 -0400
- To: www-html@w3.org
- cc: Chris Croome <chris@webarchitects.co.uk>
On Wednesday, June 28, 2000 at 14:43, chris@webarchitects.co.uk (Chris Croome) wrote:

> I'm still confused by all of this...
>
> As an experiment I have set up this page:
>
> http://c.croome.net/
>
> to be served as UTF-8 and the same page at this address:
>
> http://chris.croome.net/
>
> to be served as ISO-8859-1
>
> Both are valid XHTML according to the WDG and W3 validators,

<span class="broken_record">
While the WDG validator can and does validate XHTML, the W3C validator
currently only does a well-formedness check.
</span>

> however both are invalid on this XML validator:
>
> http://www.stg.brown.edu/service/xmlvalid/

I think there is a problem with this validator. It doesn't seem to like
the XHTML 1.0 DTD.

> I have tested the pages in Netscape 3, 4 and 6 in X11 and everything is
> fine (apart from the fact that in NN4 you get different fonts with the
> different encodings!) and they are also OK in Lynx and in MSIE4 and
> MSIE5 on windoze...
>
> Is there any drawback to using UTF-8 for some browsers/platforms?

Certainly. Some browsers may not support UTF-8 (but all modern ones do).

> And why isn't this page valid XML?

It is, as near as I can tell.

-- 
Christian Smith | csmith@barebones.com | http://web.barebones.com
PGP Fingerprint - 60E5 2216 97D2 1D1A B923 F036 00A9 CEC0 D411 FA89
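
For context: "served as UTF-8" versus "served as ISO-8859-1" comes down to the
charset parameter the server sends on the Content-Type header, optionally echoed
in the document's XML declaration. A minimal sketch of the two setups follows,
using the XHTML 1.0 Strict DOCTYPE purely for illustration (the actual pages may
well use a different DTD, and the exact headers shown are assumed, not copied
from the servers in question):

    UTF-8 version (http://c.croome.net/):

        HTTP/1.1 200 OK
        Content-Type: text/html; charset=utf-8

        <?xml version="1.0" encoding="utf-8"?>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
            "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">

    ISO-8859-1 version (http://chris.croome.net/):

        HTTP/1.1 200 OK
        Content-Type: text/html; charset=iso-8859-1

        <?xml version="1.0" encoding="iso-8859-1"?>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
            "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">

If the HTTP header and the XML declaration disagree, the charset in the HTTP
header takes precedence. Note also that a validating XML parser, unlike one that
only checks well-formedness, has to resolve the DTD named in the DOCTYPE, which
is where an XML validator can stumble over the XHTML 1.0 DTD.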
Received on Thursday, 29 June 2000 04:30:08 UTC