
Re: Determining if Unicode / appropriate glyphs installed on client

From: Bjoern Hoehrmann <derhoermi@gmx.net>
Date: Tue, 09 Jul 2002 13:21:34 +0200
To: "Mustafa Ali \(UrduWord.com\)" <newsletters@urduword.com>
Cc: <www-international@w3.org>
Message-ID: <os5liu8monj9ih76fs2b1a8eg7iovu67sd@4ax.com>

* Mustafa Ali (UrduWord.com) wrote:
>Is there an efficient method to determine if a certain multilingual content
>will display correctly on a browser (considering localized text encoding,
>language font repository, and text-processing algorithms), regardless of the
>user's accept-language preferences? Help in light of both HTML and XHTML
>appreciated.

For HTTP there is the 'Accept-Charset' request header, which in theory
should help determine whether the client can handle a given character
encoding. Note, however, that user agent support for this header is
rather poor. HTTP offers no means to determine the other capabilities;
you might take a look at CC/PP (see http://www.w3.org). In the CSS
rendering model the client should prefer a font that actually contains
the right glyphs over a font that does not.
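To illustrate the header mentioned above, here is a minimal server-side
sketch (not from the original message) of parsing an Accept-Charset
value with its q-parameters, as defined in the HTTP/1.1 specification.
The function name `accepts_charset` is my own invention for the example.

```python
def accepts_charset(header, charset):
    """Return True if `charset` is acceptable according to an HTTP
    Accept-Charset header value, e.g. "utf-8, iso-8859-1;q=0.5".

    A q-value of 0 means "not acceptable"; "*" matches any charset.
    """
    for part in header.split(","):
        fields = part.strip().split(";")
        name = fields[0].strip().lower()
        q = 1.0  # default quality per HTTP/1.1
        for param in fields[1:]:
            key, _, value = param.strip().partition("=")
            if key.strip().lower() == "q":
                try:
                    q = float(value)
                except ValueError:
                    q = 0.0
        if name in (charset.lower(), "*") and q > 0:
            return True
    return False

# accepts_charset("utf-8, iso-8859-1;q=0.5", "UTF-8") -> True
# accepts_charset("iso-8859-1", "utf-8")              -> False
```

Even with such parsing in place, the practical value is limited, since
(as noted above) many user agents send this header poorly or not at all.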

>For example, using Arial Unicode MS and IE6, I can easily read websites in
>Arabic, Japanese, and most other languages. However, the browser does not
>(practically cannot) send the entire range of languages it supports.

The Accept-Language header does not describe client capabilities; it
expresses user preferences.
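To make the distinction concrete, here is a small sketch (my own
illustration, not part of the original reply) that extracts the
preference ordering from an Accept-Language header. Nothing in the
result says the client can actually *render* any of these languages.

```python
def preferred_languages(header):
    """Return language tags from an Accept-Language header value,
    sorted by descending q-value. This reflects what the user *wants*,
    not what fonts or text-processing support the client has.
    """
    prefs = []
    for part in header.split(","):
        fields = part.strip().split(";")
        tag = fields[0].strip()
        q = 1.0  # default quality per HTTP/1.1
        for param in fields[1:]:
            key, _, value = param.strip().partition("=")
            if key.strip().lower() == "q":
                try:
                    q = float(value)
                except ValueError:
                    q = 0.0
        prefs.append((tag, q))
    prefs.sort(key=lambda p: p[1], reverse=True)
    return [tag for tag, q in prefs if q > 0]

# preferred_languages("en;q=0.8, ur, ar;q=0.5") -> ["ur", "en", "ar"]
```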

>Again, is there an efficient method to determine localized text encoding,
>language font repository, and text-processing algorithms supported by the
>client? How does CC/PP handle the issue?

This is a question for www-mobile@w3.org.
Received on Tuesday, 9 July 2002 07:22:00 GMT

This archive was generated by hypermail 2.2.0+W3C-0.50 : Tuesday, 2 June 2009 19:16:59 GMT