- From: Alexandre Denis <alexandre.denis@loria.fr>
- Date: Fri, 19 Apr 2013 12:49:12 +0200
- To: www-multimodal@w3.org, ashimura@w3.org
Hi,

Thanks for this notice. I have already sent a few comments in a previous message (2013-04-18). While verifying all the assertions, I found another problem: the vocabularies document accessible at http://www.w3.org/TR/emotion-voc/xml does not comply with the EmotionML format, since its <emotionml> root element has no version attribute (assertions 110 and 111 fail). As a consequence, all the examples in the specification that refer to this document fail validation.

Best regards,
Alexandre

On 18/04/13 22:31, Kazuyuki Ashimura wrote:
> Hi www-multimodal,
>
> W3C is pleased to announce the advancement of Emotion Markup Language
> (EmotionML) 1.0 to Proposed Recommendation on April 16.
>
> As the web is becoming ubiquitous, interactive, and multimodal,
> technology needs to deal increasingly with human factors, including
> emotions. This specification aims to strike a balance between
> practical applicability and scientific well-foundedness. The language
> is conceived as a "plug-in" language suitable for use in three
> different areas: (1) manual annotation of data; (2) automatic
> recognition of emotion-related states from user behavior; and (3)
> generation of emotion-related system behavior.
>
> The specification is now available as follows.
>
> This version:
>    http://www.w3.org/TR/2013/PR-emotionml-20130416/
>
> Latest version:
>    http://www.w3.org/TR/emotionml/
>
> Previous version:
>    http://www.w3.org/TR/2012/CR-emotionml-20120510/
>
> Comments are welcome and to be sent to <www-multimodal@w3.org> through
> 14 May.
>
> Learn more about the Multimodal Interaction Activity by visiting the
> group's public page at:
>    http://www.w3.org/2002/mmi/
>
> Thank you,
>
> Kazuyuki Ashimura
> for the W3C Multimodal Interaction Working Group Chair
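P.S. The failing check can be reproduced with a few lines of Python. This is only a sketch of the version-attribute requirement (the EmotionML namespace URI is taken from the specification; the exact wording of assertions 110/111 should be checked against the spec itself):

```python
import xml.etree.ElementTree as ET

# Namespace defined by the EmotionML 1.0 specification.
EMOTIONML_NS = "http://www.w3.org/2009/10/emotionml"

def has_valid_root(xml_text):
    """Return True iff the root is <emotionml> in the EmotionML
    namespace and carries version="1.0" (cf. assertions 110/111)."""
    root = ET.fromstring(xml_text)
    return (root.tag == f"{{{EMOTIONML_NS}}}emotionml"
            and root.get("version") == "1.0")

# Root element without a version attribute, as in the published
# vocabularies file -- this is what makes the assertions fail:
non_compliant = f'<emotionml xmlns="{EMOTIONML_NS}"/>'
# The same root with the mandatory version attribute added:
compliant = f'<emotionml version="1.0" xmlns="{EMOTIONML_NS}"/>'

print(has_valid_root(non_compliant))  # False
print(has_valid_root(compliant))      # True
```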
Received on Friday, 19 April 2013 10:48:05 UTC