- From: Charles McCathieNevile <charles@w3.org>
- Date: Wed, 31 May 2000 09:04:17 -0400 (EDT)
- To: pjenkins@us.ibm.com
- cc: w3c-wai-gl@w3.org
Well, one might think so. Unfortunately, the evidence suggests otherwise: with technologies such as JFW and Windows (perhaps the single most common combination discussed on this list), "localised" (i.e. non-English) versions often lack important features for months or years.

I agree with you that having better-quality software is a better solution than requiring authors to jump through too many hoops to compensate for what we have. However, it is important to provide access to real people in the real world, and "until user agents..." (to borrow a phrase) and authoring tools measure up to the point where we can rely on people having appropriate technology, there are problems that can be solved by authors (or not, should they so choose).

The combination of a 486 machine running Linux and Screader or Speakup is not actually so rare in a community where unemployment or underemployment is the majority experience - even in the economically successful USA. On the other hand, in Denmark the government pays for JAWS - and its localised version still lags significantly behind the English-language one.

Not everybody wants English in the first place, and through much of the Middle East, Africa, and Asia the software is not just unable to produce anything like an appropriate rendering (more so with braille than with speech synthesis, since contraction conventions vary and only a handful of languages use the same set of letters as English) - it is outright useless for Arabic, Indic, "CJK" (Chinese/Japanese/Korean), Thai, Greek, Russian, and Serbian scripts, as well as tonal and many other writing systems. A similar problem arises for tonal languages like Vietnamese that use a (heavily accented) Latin alphabet, but where a different intonation means a different word ("ba" can mean grandmother or three, depending on the tone mark).

And adding this kind of functionality does not come cheap. Internationalising a piece of software requires robust design in the first place, good knowledge of the particular language, and skill. The problem is not in knowing how it should be done; the problem is in actual implementation - rewriting masses of code because it assumed that a single keypress could be used for a single character, that text would run left to right, or that accented text could be understood without the accents (a small sketch of that first assumption appears at the end of this message).

cheers

Charles McCN

On Tue, 30 May 2000 pjenkins@us.ibm.com wrote:

Charles wrote:
>I think Phill identifies the issues here fairly clearly, although I would
>note that the question of what is a reasonable approach for the US might not
>work so well in other countries, where the market for and range of assistive
>technologies is much more limited. We should bear this in mind if we are
>writing guidelines for a world wide web consortium

Although the "market for" English versions of assistive technologies may be lower in other countries [not everyone reads or wants English], most if not all assistive technologies available in the U.S. are also available worldwide, just not in the national language. So where are the guidelines for assistive technology developers to "translate" their offerings? Wouldn't it be more practical to "translate" the assistive technologies than to write guidelines that get added to legislation requiring all pages to be usable in the national language without such technologies? The cost of "making accessible" the existing government web pages in each country alone could probably pay for the translation.
Regards,
Phill Jenkins, IBM Accessibility Center - Special Needs Systems
11501 Burnet Rd, Austin TX 78758
http://www.ibm.com/able

--
Charles McCathieNevile    mailto:charles@w3.org    phone: +61 (0) 409 134 136
W3C Web Accessibility Initiative    http://www.w3.org/WAI
Location: I-cubed, 110 Victoria Street, Carlton VIC 3053
Postal: GPO Box 2476V, Melbourne 3001, Australia
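As a minimal sketch of the "one keypress, one character" assumption Charles describes - the Vietnamese example string and its glosses are illustrative, not taken from the thread - the following Python shows how byte-oriented code miscounts multi-byte text, and how stripping accents to fit an ASCII-only pipeline silently changes the word:

```python
import unicodedata

# Hypothetical example (not from the thread): Vietnamese, where the tone
# mark alone distinguishes "ba" (three) from "bà" (grandmother).
text = "bà"

# Code written against English often assumes one byte (or one keypress)
# per character; UTF-8 breaks that assumption for accented letters.
print(len(text))                  # 2 characters
print(len(text.encode("utf-8")))  # 3 bytes - the "à" occupies two bytes

# Stripping the accents to squeeze the text into an ASCII-only pipeline
# silently produces a different word, so any speech or braille rendering
# downstream is wrong.
decomposed = unicodedata.normalize("NFD", text)
stripped = "".join(c for c in decomposed if not unicodedata.combining(c))
print(stripped)                   # "ba" - a different word entirely
```

The same kind of audit applies to the other assumptions named above (left-to-right layout, fixed character sets), which is part of why retrofitting internationalisation is so expensive.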