
Re: visual and auditory navigation: examples needed

From: Charles McCathieNevile <charles@w3.org>
Date: Sun, 31 Mar 2002 18:13:55 -0500 (EST)
To: Al Gilman <asgilman@iamdigex.net>
cc: jonathan chetwynd <j.chetwynd@btinternet.com>, <www-archive@w3.org>
Message-ID: <Pine.LNX.4.30.0203311812070.12271-100000@tux.w3.org>
On Sun, 31 Mar 2002, Al Gilman wrote:

  - Dave Bolnik did a multimedia example that did word highlighting in sync
  with the speech, which he showed at CSUN a couple of years back.  This,
  however, is a canned presentation mechanically following a programmed
  timeline, not a timeline derived on the fly from interaction with a user.
  It was done with SAMI.  If I understand the concept of the SMART
  technology proposition, it is to create an industry-consensus SAMI
  workalike, in rough terms.

WebSound and BrailleSurf both do something along these lines already, working
automatically. I think EIAD might too.
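
For anyone who hasn't looked at SAMI: the "programmed timeline" Al describes
is just a list of explicit sync points in the markup. A minimal sketch (the
class name, timings, and cumulative-text trick here are made up for
illustration, not taken from the demo):

  <SAMI>
  <HEAD>
    <STYLE TYPE="text/css"><!--
      P { font-family: Arial; }
      /* one caption class per language, per the SAMI spec */
      .ENUSCC { Name: "English Captions"; lang: en-US; }
    --></STYLE>
  </HEAD>
  <BODY>
    <!-- each SYNC gives a start time in milliseconds on the fixed timeline;
         the player swaps in the new <P> at that moment -->
    <SYNC Start=0><P Class=ENUSCC>The</P></SYNC>
    <SYNC Start=400><P Class=ENUSCC>The quick</P></SYNC>
    <SYNC Start=800><P Class=ENUSCC>The quick fox</P></SYNC>
  </BODY>
  </SAMI>

Since every Start value is fixed when the file is authored, the highlighting
can only follow that pre-set schedule, which is exactly Al's point about it
not being derived from interaction with the user.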
Received on Sunday, 31 March 2002 18:13:56 GMT
