- From: Yosuke Funahashi <yfuna@tomo-digi.co.jp>
- Date: Wed, 16 Feb 2011 06:49:02 +0900
- To: Deborah Dahl <dahl@conversational-technologies.com>
- Cc: public-web-and-tv@w3.org
Hi Debbie,

Thank you for kindly introducing the MMI information to us. I've heard that there was also an active discussion at TPAC on the synergy of MMI and DLNA to achieve better integration of multiple devices, including second-screen scenarios. We also have to clarify the dependency and liaison between MMI and 'Web and TV' based on the use cases and requirements related to Web and TV as described in the charter [1].

This IG is still at an early stage, and the co-chairs are now working hard to prepare the tools for effective discussion. In the meantime, I'd like to encourage participants to understand MMI and think about the use cases related to both activities.

Regards,
Yosuke Funahashi
Web and TV Interest Group co-Chair

[1] http://www.w3.org/2010/09/webTVIGcharter.html

On Feb 16, 2011, at 4:15 AM, Deborah Dahl wrote:

> I thought that some of you might be interested in the recent publication of
> the "Multimodal Architecture and Interfaces" specification
> (http://www.w3.org/TR/mmi-arch/), which was published by the Multimodal
> Interaction Working Group a few weeks ago. Kazuyuki may have mentioned it at
> the workshop. I think it's very relevant to anything having to do with the
> TV and different ways of interacting with it, such as voice- or gesture-based
> interaction as well as biometrics.
>
> The specification describes a loosely coupled architecture for multimodal
> user interfaces, which allows for co-resident and distributed
> implementations, and focuses on the role of markup and scripting, and the
> use of well-defined interfaces between its constituents. It takes a big step
> toward making multimodal components interoperable by specifying a common
> means of communication between different modalities. It also very naturally
> supports distributed applications, where different types of modalities are
> processed in different places, whether on one or more local devices, in the
> cloud, or on other servers.
> Finally, we believe it will also provide a
> good basis for a style of interaction called 'nomadic interfaces,' where the
> user interface can move from device to device as the user moves around.
>
> We're very interested in getting comments on the specification. Comments
> can be sent to the MMI public mailing list at www-multimodal@w3.org.
>
> Best regards,
> Debbie Dahl
> Multimodal Interaction Working Group Chair
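The "common means of communication" mentioned above is a set of XML life-cycle events exchanged between an Interaction Manager and its Modality Components. As a rough, non-normative sketch (element and attribute names follow the MMI Architecture specification; all URIs and identifiers below are invented placeholders), a StartRequest event might look like:

```xml
<!-- Illustrative sketch only: the Interaction Manager asks a modality
     component (e.g. a speech recognizer attached to a TV) to start.
     URIs and IDs are placeholders, not real endpoints. -->
<mmi:mmi xmlns:mmi="http://www.w3.org/2008/04/mmi-arch" version="1.0">
  <mmi:StartRequest mmi:Source="http://example.com/im"
                    mmi:Target="http://example.com/modality/speech"
                    mmi:Context="ctx-0001"
                    mmi:RequestID="req-0001">
    <mmi:ContentURL mmi:href="http://example.com/grammars/tv-commands.grxml"/>
  </mmi:StartRequest>
</mmi:mmi>
```

The component would answer with a matching StartResponse carrying the same Context and RequestID; pairing requests and responses this way is what lets loosely coupled components, whether co-resident or distributed across devices and servers, interoperate without knowing each other's internals.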
Received on Tuesday, 15 February 2011 21:50:48 UTC