- From: Kazuyuki Ashimura <ashimura@w3.org>
- Date: Fri, 19 Apr 2013 05:31:16 +0900
- To: www-multimodal@w3.org
Hi www-multimodal,

W3C is pleased to announce the advancement of Emotion Markup Language (EmotionML) 1.0 to Proposed Recommendation on April 16.

As the Web is becoming ubiquitous, interactive, and multimodal, technology increasingly needs to deal with human factors, including emotions. This specification aims to strike a balance between practical applicability and scientific well-foundedness. The language is conceived as a "plug-in" language suitable for use in three different areas: (1) manual annotation of data; (2) automatic recognition of emotion-related states from user behavior; and (3) generation of emotion-related system behavior.

The specification is now available as follows.

This version:
    http://www.w3.org/TR/2013/PR-emotionml-20130416/
Latest version:
    http://www.w3.org/TR/emotionml/
Previous version:
    http://www.w3.org/TR/2012/CR-emotionml-20120510/

Comments are welcome; please send them to <www-multimodal@w3.org> through 14 May.

Learn more about the Multimodal Interaction Activity by visiting the group's public page at:
    http://www.w3.org/2002/mmi/

Thank you,

Kazuyuki Ashimura, for the W3C Multimodal Interaction Working Group Chair

--
Kaz Ashimura, W3C Staff Contact for Web&TV, MMI and Voice
Tel: +81 466 49 1170
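To give a feel for the "plug-in" annotation use case, here is a minimal sketch of an EmotionML 1.0 document parsed with Python's standard-library ElementTree. The EmotionML namespace and the "big6" category vocabulary reference are taken from the specification; the particular annotation values (names, confidence) are illustrative assumptions.

```python
import xml.etree.ElementTree as ET

# A minimal EmotionML 1.0 annotation (sketch). The namespace and the
# category-set URI come from the spec; the annotated values are made up.
DOC = """<emotionml xmlns="http://www.w3.org/2009/10/emotionml"
           version="1.0"
           category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
  <emotion>
    <category name="happiness" confidence="0.8"/>
  </emotion>
  <emotion>
    <category name="surprise"/>
  </emotion>
</emotionml>"""

NS = {"em": "http://www.w3.org/2009/10/emotionml"}

def category_names(xml_text):
    """Return the category name of each annotated emotion."""
    root = ET.fromstring(xml_text)
    return [cat.get("name")
            for cat in root.findall("em:emotion/em:category", NS)]

print(category_names(DOC))  # -> ['happiness', 'surprise']
```

Each `<emotion>` element carries one or more descriptors (here a `<category>`), and the document-level `category-set` attribute declares which vocabulary those category names are drawn from.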
Received on Thursday, 18 April 2013 20:31:50 UTC