
Re: Emotion Markup Language (EmotionML) 1.0 is a Proposed Recommendation

From: Alexandre Denis <alexandre.denis@loria.fr>
Date: Mon, 22 Apr 2013 17:23:45 +0200
Message-ID: <CAPYqdFe9juj-v7_E6aDKsnYEubwjbSnvObRYBDWckkHQPnJEdA@mail.gmail.com>
To: Felix.Burkhardt@telekom.de
Cc: www-multimodal@w3.org
Hi Felix,
thanks for your answer. I realize I was not 100% correct in my comments: for
instance, the number of <info> elements in assertion 105 is indeed checked in
the schema, and so is <trace>; I read the schema too quickly. The comment
about the "required" status of assertion 105, that is, with respect to the
number of <info> elements, may still hold: perhaps it should be stated as
"The <emotionml> element MUST contain zero or one <info> element".

I will send an updated list of comments once I publish the implementation,
because I have found other issues (most of them are minor discrepancies
between the specification, the assertions and the schema). I will also add
the implementation description when it's done. A few words about it: it is a
pure Java implementation developed in the context of the ITEA Empathic
Products project. I'm working at the LORIA laboratory in Nancy, France. The
idea is to provide a convenient implementation of EmotionML whose
conformance to the standard is ensured by a two-step validation: schema
validation followed by assertion validation. Since the schema cannot test
all assertions, I've written a custom validator that checks every assertion
listed in the implementation report. To double-check the implementation,
I've manually designed a corpus of erroneous EmotionML files (one per
assertion) that must be rejected by the validator. I'm still in the process
of putting everything together: adding missing assertions, double-checking
the code, etc. Two features are still missing: MIME-type checking and id
uniqueness. I may be able to release the implementation, under an open
source license, by the end of the week.
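To give an idea of what an assertion-level check looks like beyond the schema, here is a minimal sketch in Java of a check in the spirit of assertions 110 and 111 (the root <emotionml> element must carry version="1.0"). This is only an illustration, not the actual LORIA validator; the namespace URI is the one from the EmotionML specification.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Element;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class AssertionCheck {
    // EmotionML namespace URI from the specification.
    static final String NS = "http://www.w3.org/2009/10/emotionml";

    // Assertion-style check that the schema alone would not enforce on an
    // arbitrary document: the root element must be <emotionml> in the
    // EmotionML namespace and must have version="1.0".
    static boolean checkVersion(String xml) throws Exception {
        DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
        f.setNamespaceAware(true);
        Element root = f.newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)))
                .getDocumentElement();
        return "emotionml".equals(root.getLocalName())
                && NS.equals(root.getNamespaceURI())
                && "1.0".equals(root.getAttribute("version"));
    }

    public static void main(String[] args) throws Exception {
        String ok = "<emotionml xmlns=\"" + NS + "\" version=\"1.0\"/>";
        String bad = "<emotionml xmlns=\"" + NS + "\"/>"; // no version attribute
        System.out.println(checkVersion(ok));   // prints true
        System.out.println(checkVersion(bad));  // prints false
    }
}
```

A document like the published vocabularies file, whose root lacks the version attribute, would fail this kind of check even if it otherwise matched the schema.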

best regards,

On Mon, Apr 22, 2013 at 4:22 PM, <Felix.Burkhardt@telekom.de> wrote:

> Hi Alexandre
> Thanks for reviewing EmotionML and sending this and the previous mail [1].
> We will review your comments in the group, and perhaps find ways to
> enhance the spec, the vocabularies and/or the schema.
> You're certainly right about the erroneous vocabulary document.
> Meanwhile, can you give us more information about the implementation
> you're doing?
> It might be a good idea to add you (and other implementers of EmotionML)
> to our  Implementations page [2].
> Kind regards,
> Felix
> [1] http://lists.w3.org/Archives/Public/www-multimodal/2013Apr/0002.html
> [2] http://www.w3.org/2002/mmi/implementations.html
> -----Original Message-----
> From: Alexandre Denis [mailto:alexandre.denis@loria.fr]
> Sent: Friday, 19 April 2013 12:49
> To: www-multimodal@w3.org; ashimura@w3.org
> Subject: Re: Emotion Markup Language (EmotionML) 1.0 is a Proposed
> Recommendation
> Hi,
> thanks for this notice. I have already sent a few comments in a previous
> message (2013-04-18). Moreover, I found another problem while verifying all
> the assertions. The vocabularies document that is accessible at
> http://www.w3.org/TR/emotion-voc/xml does not comply with the EmotionML
> format, since its <emotionml> element does not have a version attribute
> (assertions 110 and 111 fail); hence all the examples in the specification
> that refer to this document fail validation. Best regards, Alexandre
> On 18/04/13 22:31, Kazuyuki Ashimura wrote:
> > Hi www-multimodal,
> >
> > W3C is pleased to announce the advancement of Emotion Markup Language
> > (EmotionML) 1.0 to Proposed Recommendation on April 16.
> >
> > As the web is becoming ubiquitous, interactive, and multimodal,
> > technology needs to deal increasingly with human factors, including
> > emotions.  This specification aims to strike a balance between
> > practical applicability and scientific well-foundedness. The language
> > is conceived as a "plug-in" language suitable for use in three
> > different areas: (1) manual annotation of data; (2) automatic
> > recognition of emotion-related states from user behavior; and (3)
> > generation of emotion-related system behavior.
> >
> > The specification is now available as follows.
> >
> > This version:
> >   http://www.w3.org/TR/2013/PR-emotionml-20130416/
> >
> > Latest version:
> >   http://www.w3.org/TR/emotionml/
> >
> > Previous version:
> >   http://www.w3.org/TR/2012/CR-emotionml-20120510/
> >
> > Comments are welcome and should be sent to <www-multimodal@w3.org>
> > through 14 May.
> >
> > Learn more about the Multimodal Interaction Activity by visiting the
> > group's public page at:
> >   http://www.w3.org/2002/mmi/
> >
> > Thank you,
> >
> > Kazuyuki Ashimura
> > for the W3C Multimodal Interaction Working Group Chair
> >
Received on Monday, 22 April 2013 15:24:25 UTC
