Experimental system for managing multimodality

I am a researcher in Human-Computer Interaction, recently arrived at 
INRIA, near Paris, from the University of Maryland.

I recently met with Vincent Quint, who advised me to contact you after 
I showed him a demo of work done by a PhD student I advise, Pierre 
Dragicevic.

This work could be useful both for experimenting with multimodal 
interaction using a web browser and, I hope, as a good model for 
managing input.

The system is called ICon and is described at the following URL: 
http://www.emn.fr/dragicevic/ICon/
ICon is a visual language editor for configuring the multimodal input of 
an interactive application, such as -- but not limited to -- a web 
browser.  Currently, ICon has modules that support speech input, 
multiple pointers, some USB input devices, and the X Input extension.  
Designing a new module is quite simple.

A couple of papers are available there, the most complete being "Input 
Device Selection and Interaction Configuration with ICON", which we 
published at the IHM-HCI 2001 conference on human-computer interaction 
(a joint conference of the British BCS-HCI group and the French AFIHM 
group).

If you think ICon could contribute to your working group, I would be 
happy to give a short talk about it at a forthcoming meeting.  I am 
quite interested in participating in the multimodal interaction 
activity at the W3C, as a member of INRIA.

Best regards,
-- 
  Jean-Daniel Fekete      Jean-Daniel.Fekete@inria.fr
  INRIA Futurs, LRI               tel: +33 1 69156625
  Bat 490, Université Paris-Sud   fax: +33 1 69156586
  F91405 ORSAY Cedex, France

Received on Saturday, 11 January 2003 06:19:05 UTC