- From: Peter O.B. Mikes <pom@llnl.gov>
- Date: Tue, 17 Dec 1996 13:21:31 -0800
- To: "Martin J. Duerst" <mduerst@ifi.unizh.ch>
- Cc: Klaus Weide <kweide@tezcat.com>, Larry Masinter <masinter@parc.xerox.com>, www-international@w3.org, http-wg%cuckoo.hpl.hp.com@hplb.hpl.hp.com
Martin J. Duerst wrote:
>
> On Mon, 16 Dec 1996, Klaus Weide wrote:
>
> > Example of a site where documents are provided in several charsets
> > (all for the same language):
> > see <URL: http://www.fee.vutbr.cz/htbin/codepage>.
>
> The list is impressive. It becomes less impressive if you realize
> that all (as far as I have checked) the English pages and some
> of the Czech pages (MS Cyrillic/MS Greek/MS Hebrew, ...) are just
> plain ASCII,

And it is even less impressive when one finds out that not a single page
displays the 'diacritics' correctly, even when one selects Latin-2
encoding in Netscape 3. Why is that? Are the experts building an Edsel?

> > It is certainly much easier to make Web clients able to decode UTF-8
> > to locally available character sets, than to upgrade all client
> > machines so that they have fonts available to display all of the 10646
> > characters.

Besides, a character set is not a function of a language: math, APL, and
music all have different needs for character sets and character-set
mixtures, which can exceed 10646.

> Definitely UTF-8 should be encouraged. But that's not done by
> introducing new protocol complications and requiring the servers
> to deal with unpredictable transliteration issues that can be
> dealt with more easily on the client side.
>
> Regards,   Martin.

--
Peter O.B. Mikes     pom@llnl.gov     http://edprog.llnl.gov/team/pom.html
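[A minimal sketch of the client-side approach discussed above: decode
incoming UTF-8 octets, then map them into a locally available character
set, falling back to a replacement character where the local set has no
equivalent. Python, the sample text, and the choice of ISO 8859-2
(Latin-2) as the local charset are illustrative assumptions, not part of
the original thread.]

    # Sketch: UTF-8 in, local charset out. ISO 8859-2 (Latin-2) stands
    # in for whatever character set the client actually has fonts for;
    # the sample string is hypothetical.
    utf8_bytes = "Vysoké učení technické v Brně α".encode("utf-8")

    # Step 1: decode the UTF-8 octets into abstract characters.
    text = utf8_bytes.decode("utf-8")

    # Step 2: re-encode into the local charset. The Czech diacritics all
    # exist in Latin-2; the Greek alpha does not, so it degrades to '?'
    # instead of aborting.
    local_bytes = text.encode("iso8859-2", errors="replace")

    print(local_bytes.decode("iso8859-2"))
    # -> Vysoké učení technické v Brně ?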
Received on Wednesday, 18 December 1996 00:41:25 UTC