Re: visual and auditory navigation: examples needed

On Sun, 31 Mar 2002, Al Gilman wrote:

  - Dave Bolnik did a multimedia example that did word highlighting in sync
  to the speech that he showed at CSUN a couple years back.  This however
  is a canned presentation mechanically following a programmed timeline,
  not a timeline derived on the fly from interaction with a user.
  This was done with SAMI.  If I understand the concept of the SMART
  technology proposition, it is to create an industry-consensus SAMI
  workalike, in rough terms.
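
For readers unfamiliar with SAMI: it pins caption fragments to fixed time
offsets in a media file, which is what makes the demo above a "programmed
timeline" rather than an interactive one. A rough sketch (the times and
class name here are illustrative, not from the demo):

```html
<SAMI>
<HEAD>
<STYLE TYPE="text/css"><!--
  P { font-family: sans-serif; }
  .ENUSCC { Name: "English captions"; lang: en-US; }
--></STYLE>
</HEAD>
<BODY>
<!-- Start values are milliseconds from the beginning of the media -->
<SYNC Start=0><P Class=ENUSCC>Welcome</P></SYNC>
<SYNC Start=600><P Class=ENUSCC>to</P></SYNC>
<SYNC Start=900><P Class=ENUSCC>the demo</P></SYNC>
</BODY>
</SAMI>
```

Each SYNC block replaces the previous one at its Start time, so word-level
highlighting means authoring one block per word, all timed in advance.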

WebSound and BrailleSurf both do something along these lines already,
working automatically. I think EIAD might too.

Received on Sunday, 31 March 2002 18:13:56 UTC