- From: Deborah Dahl <dahl@conversational-technologies.com>
- Date: Thu, 9 Jul 2015 10:54:01 -0400
- To: <www-multimodal@w3.org>
- Message-ID: <02b801d0ba57$184dd720$48e98560$@conversational-technologies.com>
I am pleased to announce that the W3C Multimodal Interaction Working
Group [1] has been rechartered to continue its work through 2016. As we
interact with technology through more and more diverse devices, the
critical need for standards for multimodal interaction becomes
increasingly clear.
Mobile devices, wearables, smart homes, ambient devices and devices in
the Web of Things all place their own new requirements on interaction
capabilities. We can no longer rely on the traditional modes of
keyboard, screen and pointing, but must consider other modalities,
such as speech, natural language, handwriting, gesture and camera, to
name only a few. Existing standards developed by the Multimodal
Interaction Working Group, such as the Multimodal Architecture and
Interfaces specification [2] and EMMA (Extensible Multimodal
Annotation) [3], support the integration of an open-ended set of
current and future modalities that will be key to future forms of
human-computer interaction.
The major work items to be undertaken in this charter period include:
(1) a new version of EMMA, EMMA 2.0, which will include several major
new features, for example:
a. handling output from the system to the user, including spoken and
typed output as well as other system actions
b. support for a broader range of formats, for example, JSON and RDF
c. the ability to represent incremental results, such as intermediate
speech and handwriting recognition results (a minimal sketch of the
EMMA markup these features build on follows this list)
(2) Work on Discovery and Registration of Modality Components [4] within
the MMI Architecture. This work will support the kinds of
dynamically configurable systems that are required for mobile users.
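To make the EMMA items above more concrete, here is a minimal sketch
of the interpretation markup defined in EMMA 1.0 [3], which EMMA 2.0
builds on. Python is used here only to generate the XML; the
confidence value and the <destination> payload are invented for
illustration, and no EMMA 2.0-specific elements are shown, since the
announcement does not define them.

# A minimal sketch (not from this announcement): building an EMMA 1.0
# interpretation with Python's standard library. The namespace and the
# emma:medium / emma:mode / emma:confidence / emma:tokens attributes come
# from the EMMA 1.0 Recommendation [3]; the confidence value and the
# <destination> payload are hypothetical, for illustration only.
import xml.etree.ElementTree as ET

EMMA_NS = "http://www.w3.org/2003/04/emma"
ET.register_namespace("emma", EMMA_NS)

def emma(name):
    # Clark notation ({namespace}localname) for namespaced names
    return "{%s}%s" % (EMMA_NS, name)

root = ET.Element(emma("emma"), {"version": "1.0"})
interp = ET.SubElement(root, emma("interpretation"), {
    "id": "int1",
    emma("medium"): "acoustic",           # captured on the audio channel
    emma("mode"): "voice",                # spoken input
    emma("confidence"): "0.82",           # recognizer confidence (hypothetical)
    emma("tokens"): "flights to denver",  # recognized words
})
# Application-specific semantics live inside the interpretation element
ET.SubElement(interp, "destination").text = "Denver"

print(ET.tostring(root, encoding="unicode"))

EMMA 2.0's planned output, JSON/RDF, and incremental-result features
would extend this same container model; the concrete markup for them
will be defined in the EMMA 2.0 drafts.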
In addition, the group will maintain the InkML [5], MMI
Architecture [2], and EmotionML [6] specifications as needed by their user
communities.
An important change in this charter period is that the group will
conduct its technical discussions in public. The group will be using
this list to discuss technical issues that arise during the
development of its specifications. By working in public, we hope to
promote understanding of the specifications and encourage discussion.
For more information, including information about joining the group,
please see the new charter [7].
[1] MMI WG home page: http://www.w3.org/2002/mmi/
[2] MMI Architecture and Interfaces: http://www.w3.org/TR/mmi-arch/
[3] EMMA (Extensible Multimodal Annotation): http://www.w3.org/TR/emma/
[4] Discovery and Registration: http://www.w3.org/TR/mmi-mc-discovery/
[5] InkML: http://www.w3.org/TR/InkML/
[6] EmotionML: http://www.w3.org/TR/emotionml/
[7] Charter: http://www.w3.org/2013/10/mmi-charter
Best regards,
Deborah Dahl, MMI Working Group Chair, W3C Invited Expert