
Re: Emotion Markup Language (EmotionML) 1.0 is a Proposed Recommendation

From: <Felix.Burkhardt@telekom.de>
Date: Mon, 22 Apr 2013 16:22:10 +0200
To: <www-multimodal@w3.org>
CC: <alexandre.denis@loria.fr>
Message-ID: <05C81A773E48DD49B181B04BA21A342A2BACFE7247@HE113484.emea1.cds.t-internal.com>
Hi Alexandre,
Thanks for reviewing EmotionML and sending this and the previous mail [1].
We will review your comments in the group, and perhaps find ways to enhance the spec, the vocabularies, and/or the schema.
You're certainly right about the erroneous vocabulary document.
Meanwhile, can you give us more information about the implementation you're working on?
It might be a good idea to add you (and other implementers of EmotionML) to our Implementations page [2].
Kind regards,

[1] http://lists.w3.org/Archives/Public/www-multimodal/2013Apr/0002.html

[2] http://www.w3.org/2002/mmi/implementations.html

-----Original Message-----
From: Alexandre Denis [mailto:alexandre.denis@loria.fr]
Sent: Friday, 19 April 2013 12:49
To: www-multimodal@w3.org; ashimura@w3.org
Subject: Re: Emotion Markup Language (EmotionML) 1.0 is a Proposed Recommendation

Thanks for this notice. I have already sent a few comments in a previous message (2013-04-18). Moreover, I found another problem while verifying all assertions: the vocabularies document accessible at http://www.w3.org/TR/emotion-voc/xml does not comply with the EmotionML format, since its <emotionml> root element has no version attribute (assertions 110 and 111 fail). Hence all the examples in the specification that refer to this document fail validation.

Best regards,
Alexandre
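[The failure Alexandre reports can be reproduced with a minimal check. The sketch below uses Python's standard-library ElementTree; the check itself is an illustration under the assumption that assertions 110/111 require a version attribute with value "1.0" on the root <emotionml> element, as the mail states. It is not the validator he actually used.]

```python
from xml.etree import ElementTree as ET

# EmotionML namespace, as defined in the specification.
EMO_NS = "http://www.w3.org/2009/10/emotionml"

def check_version(xml_text: str) -> bool:
    """Return True if the root element is <emotionml> in the EmotionML
    namespace and carries the version="1.0" attribute (assertions 110/111)."""
    root = ET.fromstring(xml_text)
    return (root.tag == f"{{{EMO_NS}}}emotionml"
            and root.get("version") == "1.0")

# A root element without the version attribute, as in the vocabulary
# document served at /TR/emotion-voc/xml, fails the check:
as_served = f'<emotionml xmlns="{EMO_NS}"/>'
print(check_version(as_served))   # False

# Adding version="1.0" makes it pass:
fixed = f'<emotionml version="1.0" xmlns="{EMO_NS}"/>'
print(check_version(fixed))       # True
```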

Le 18/04/13 22:31, Kazuyuki Ashimura a écrit :
> Hi www-multimodal,
> W3C is pleased to announce the advancement of Emotion Markup Language
> (EmotionML) 1.0 to Proposed Recommendation on April 16.
> As the web is becoming ubiquitous, interactive, and multimodal, 
> technology needs to deal increasingly with human factors, including 
> emotions.  This specification aims to strike a balance between 
> practical applicability and scientific well-foundedness. The language 
> is conceived as a "plug-in" language suitable for use in three 
> different areas: (1) manual annotation of data; (2) automatic 
> recognition of emotion-related states from user behavior; and (3) 
> generation of emotion-related system behavior.
> The specification is now available as follows.
> This version:
>   http://www.w3.org/TR/2013/PR-emotionml-20130416/

> Latest version:
>   http://www.w3.org/TR/emotionml/

> Previous version:
>   http://www.w3.org/TR/2012/CR-emotionml-20120510/

> Comments are welcome and should be sent to <www-multimodal@w3.org>
> through 14 May.
> Learn more about the Multimodal Interaction Activity by visiting the 
> group's public page at:
>   http://www.w3.org/2002/mmi/

> Thank you,
> Kazuyuki Ashimura
> for the W3C Multimodal Interaction Working Group Chair

Received on Monday, 22 April 2013 14:22:41 UTC
