- From: Chris Croome <chris@webarchitects.co.uk>
- Date: Wed, 28 Jun 2000 14:43:47 -0400 (EDT)
- To: Dave J Woolley <DJW@bts.co.uk>
- Cc: www-html@w3.org
- Message-ID: <20000628194358.B12375@webarchitects.co.uk>
Hi

On Wed 28-Jun-2000 at 10:59:56AM +0100, Dave J Woolley wrote:

> > From: Ian Graham [SMTP:igraham@smaug.java.utoronto.ca]
> >
> > <meta http-equiv="content-type" content="text/html; charset=....">
>
> [DJW:] That's a hack that was legitimised by the
> standards after the fact; one should actually specify it
> in the real HTTP header, which always takes precedence.

I'm still confused by all of this...

As an experiment I have set up this page:

    http://c.croome.net/

to be served as UTF-8, and the same page at this address:

    http://chris.croome.net/

to be served as ISO-8859-1.

Both are valid XHTML according to the WDG and W3C validators; however, both are reported invalid by this XML validator:

    http://www.stg.brown.edu/service/xmlvalid/

I have tested the pages in Netscape 3, 4 and 6 under X11 and everything is fine (apart from the fact that in NN4 you get different fonts with the different encodings!), and they are also OK in Lynx and in MSIE 4 and MSIE 5 on windoze...

Is there any drawback to using UTF-8 on some browsers/platforms? And why isn't this page valid XML?

Any ideas anyone?

Chris

-- 
Chris Croome <chris@webarchitects.co.uk>
http://www.webarchitects.co.uk/
http://chris.croome.net/
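P.S. For anyone wanting to reproduce the two-charset setup: this is just a sketch, and other servers will differ, but with Apache the usual way to get the charset into the real HTTP header (rather than relying on the meta element) is something along these lines in httpd.conf or a .htaccess file (the .utf8/.latin1 extensions here are made up for illustration):

    # Send "Content-Type: text/html; charset=utf-8" for all .html files:
    AddType "text/html; charset=utf-8" .html

    # Or, with recent Apache 1.3 versions, map a charset per extension:
    AddCharset utf-8       .utf8
    AddCharset iso-8859-1  .latin1

Either way the response then carries

    Content-Type: text/html; charset=utf-8

in the header itself, which (as Dave says) takes precedence over anything in a meta element.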
Received on Thursday, 29 June 2000 02:01:54 UTC