- From: Max Froumentin <mf@w3.org>
- Date: Fri, 12 Aug 2005 14:40:58 +0100
- To: Wendy Chisholm <wendy@w3.org>
- Cc: Ivo Gonçalves <ivo_mmm@hotmail.com>, public-comments-wcag20@w3.org, tmichel@w3.org, dsr@w3.org
Wendy Chisholm <wendy@w3.org> writes:

> Thank you for the pointer to SALT. I'm not sure about the rest of the WCAG
> Working Group, but I was unaware of this project. Does this overlap with
> W3C work on voice and multimodal applications or timed text? [1, 2, 3] Or
> does this fill a gap between these technologies?

Hi,

A few words on SALT and W3C: SALT was contributed to the Multimodal
Interaction Working Group in 2002. In 2001, the XHTML+Voice [2]
specification was similarly contributed. Developing an authoring language
for multimodal applications, covering not just voice but also other input
modalities such as digital pens or gestures, became a chartered [1] work
item of the Working Group. In order to carry out that work item, SALT,
XHTML+Voice and other multimodal languages are being considered.

[1] MMI WG Charter: <http://www.w3.org/2004/03/mmi-charter.html>
[2] XHTML+Voice Profile: <http://www.w3.org/TR/xhtml+voice/>

Hoping this helps,

Max.
Received on Friday, 12 August 2005 13:40:58 UTC