- From: Nick Kew <nick@webthing.com>
- Date: Sun, 4 Apr 2004 22:25:37 +0100 (BST)
- To: Jesper Tverskov <jesper.tverskov@mail.tele.dk>
- Cc: w3c-wai-ig@w3.org
On Sun, 4 Apr 2004, Jesper Tverskov wrote:

> I would like to ask the list about potential or already existing
> problems or challenges for accessibility caused by the use of Unicode.
>
> Let us take Google as an example. It returns search results in many
> different languages on the same page, and the result page uses
> Unicode.

And it would be better if it highlighted pages in different languages
more clearly. And that's speaking as a sighted user.

> At the moment change of natural language is not included in the
> mark-up. Since the user can choose to get results in a particular
> language only, it would probably be possible for Google to indicate
> change of natural language automatically, even when many languages
> are used in the same page and the page is generated from many
> different language sources.

Google's record on i18n does them no credit. Just look at what's being
said on their "community", orkut (which they launched as latin-1, not
Unicode - scarcely credible for a supposedly global service today).

> It is probably less realistic to expect smaller or ordinary websites
> and web services to be able to include mark-up for change of natural
> language when documents are generated on the fly from many language
> sources, including interaction with users, like commentary and
> debate, etc.

Why? As a rule, this is a function of awareness, not size. And I expect
awareness is probably widespread outside the English-mother-tongue
world.

> Now consider a modern word processor like MS Word. Even if 10
> different languages are used in 10 paragraphs on the same page, the
> spell checker has no problem identifying the change of natural
> language and applying the right dictionary for each paragraph. No
> indication of change of natural language is needed from the author.

Really? My experience differs: it has enough trouble distinguishing
English from American. Only about a week ago I had email (from an
American) "correcting" my spelling on a webpage. His spellcheck had
evidently diagnosed the language incorrectly, in spite of it being
indicated in the page's markup.

> Maybe it is more realistic in many situations to leave indication of
> change in natural language to user agents than to expect web page
> authors to do the job.

Be conservative in what you expect of others, liberal in what you
accept from them. Everyone should do their best.

> Web page authors should probably still indicate change of natural
> language in web content made by themselves, but it is probably much
> more convenient and realistic to leave this task to user agents for
> many types of generated content.

Developers of authoring and publishing tools should ensure that marking
language changes correctly becomes automatic for authors, and that it
happens when a page is generated dynamically.

> The above is just one example of problems or challenges for
> accessibility arising from or made more common by the use of Unicode.
> I would like to hear of other cases, and whether it is more realistic
> in many situations to leave detection of change in natural language
> to user agents.

It seems to me you've raised two separate issues: you speak of encoding
but advance arguments regarding language. I find that makes it hard to
debate clearly.

-- 
Nick Kew

Nick's manifesto: http://www.htmlhelp.com/~nick/
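To make the point about tool support concrete, here is a minimal sketch
of what a publishing tool that assembles a page from user-contributed
fragments could do: identify each fragment's language and declare it in
the generated markup. The sketch is in Python; the detect_language stub
is hypothetical and only stands in for a real language-identification
step.

import html

# Hypothetical stand-in for a real language identifier (dictionary
# lookup, n-gram model, etc.); here it only tells Danish-looking text
# from English so the example runs on its own.
def detect_language(text: str) -> str:
    return "da" if any(ch in text for ch in "æøå") else "en"

# Assumed input: user-contributed fragments in mixed languages.
fragments = [
    "Accessibility is a quality requirement.",
    "Tilgængelighed er et kvalitetskrav.",
]

# Declare each fragment's language in the markup it is published in,
# e.g. <p lang="da" xml:lang="da">...</p>, instead of leaving user
# agents and assistive technology to guess.
for text in fragments:
    code = detect_language(text)
    print(f'<p lang="{code}" xml:lang="{code}">{html.escape(text)}</p>')

The hard part is of course the detection itself; the point is only that
once a tool knows the language, declaring it in lang/xml:lang attributes
costs next to nothing, and the author never has to think about it.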
Received on Sunday, 4 April 2004 17:26:08 UTC