
Re: visual and auditory navigation: examples needed

From: Charles McCathieNevile <charles@w3.org>
Date: Sun, 31 Mar 2002 18:45:15 -0500 (EST)
To: Al Gilman <asgilman@iamdigex.net>
cc: WAI GL <w3c-wai-gl@w3.org>
Message-ID: <Pine.LNX.4.30.0203311838220.12271-100000@tux.w3.org>
See http://www.mencap.org.uk for a working example, although one where the
user needs to click a button to make a section speak. Applying the same idea
to SVG animation or HTML event triggers should be reasonably straightforward
(although, because HTML's event-triggering system is messy in accessibility
terms, it won't be beautiful...). I will try to find time next week to write
a code example.
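
In the meantime, a minimal sketch of the kind of thing I mean, using
HTML event attributes so a section speaks on mouse-over or on tabbing to
it. The speechSynthesis call is a browser speech API that did not exist
when this was written; it stands in here for whatever speech mechanism
the page or user agent actually provides:

    <p tabindex="0"
       onmouseover="speak(this)"
       onfocus="speak(this)">
      Welcome to our site.
    </p>
    <script>
      // Sketch only: speechSynthesis is assumed to be available;
      // a real page would need to fall back gracefully without it.
      function speak(el) {
        var utterance = new SpeechSynthesisUtterance(el.textContent);
        window.speechSynthesis.cancel(); // stop any speech in progress
        window.speechSynthesis.speak(utterance);
      }
    </script>

Note the tabindex attribute: without it the paragraph is not focusable,
so the onfocus trigger ("read on tab") would never fire.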

One of the issues here is whether this should be done by the author or the
browser - users who already have talking browsers will get interference
between the two speech sources. It seems to me that, in the general case, a
smarter solution is for the user to be given a browser like websound or
homepage reader.



On Sun, 31 Mar 2002, Al Gilman wrote:

  At 08:59 AM 2002-03-31 , jonathan chetwynd wrote:
  >read on action is:
  >read on mouse over, read on tab or other action.
  >I'll be looking into all this over the coming months, but really believe
  >some good examples are needed,  as I'm very unclear about what is possible.
  >Why this is SO important is that many users have multiple impairments.



  For a long, digressive response.



Charles McCathieNevile    http://www.w3.org/People/Charles  phone: +61 409 134 136
W3C Web Accessibility Initiative     http://www.w3.org/WAI  fax: +33 4 92 38 78 22
Location: 21 Mitchell street FOOTSCRAY Vic 3011, Australia
(or W3C INRIA, Route des Lucioles, BP 93, 06902 Sophia Antipolis Cedex, France)
Received on Sunday, 31 March 2002 18:45:16 UTC

This archive was generated by hypermail 2.3.1 : Tuesday, 16 January 2018 15:33:40 UTC