Re: [EMOXG] xml spec subgroup: emotion-related phenomena / complex emotions

Hi,
are there any examples of how to implement something like this in a user interface?
I'm a team member at a location-based social network startup, and we
would like to add this as an optional, experimental information layer.
But at the moment I have no idea how.
The basic idea would be to allow users to rate their current emotion
at their current location.
Afterwards, users could then search for specific emotions near them...
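
Just to illustrate what I have in mind (the <location> element and its
attributes below are entirely made up by me, not part of any draft), a
single user rating could look roughly like this:

<emotion>
  <category set="everyday" name="pleasure" confidence="0.9" />
  <!-- hypothetical extension: where and when the rating was made -->
  <location lat="49.45" lon="11.08" />
  <timestamp value="2008-07-31T12:00:00Z" />
</emotion>

A search for "specific emotions near me" would then just filter such
entries by category name and by distance from the user's position.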

Best,
Florian

On Thu, Jul 31, 2008 at 10:37 AM, Christian Peter
<Christian.Peter@igd-r.fraunhofer.de> wrote:
> Hi again,
>
> here I have to adopt the role of the layperson wanting to make use of the
> language, having no deep knowledge of parsers or other XML-related technical
> background.
>
> Core 1:
> -------
> For me, Variant 3a, the level version
>
> <affect> <category level="0" set="Scherer" name="emotion" /> </affect>
>
> is the one I could best follow and interpret, so I personally would prefer
> this :-) The others might be technically better, but they are difficult to
> interpret for those not that familiar with XML notation and nesting.
>
>
> Core 6 - complex emotions:
> ---------------------------
> I see your point, but I don't see a big problem here. We do have the time
> information (if wanted) within each <emotion> entry, so time correlates can
> be found and exploited easily.
>
> So the <complex-affect> tag as a container might do the job:
>
> <complex-affect>
>  <affect>
>    <category level="0" set="Scherer" name="emotion" />
>    <category level="1" set="posneg" name="positive" />
>    <category level="2" set="everyday" name="pleasure" confidence="0.9" />
>    <modality name="voice" />
>  </affect>
>  <affect>
>    <category level="0" set="Scherer" name="emotion" />
>    <category level="1" set="posneg" name="negative" />
>    <category level="2" set="everyday" name="sad" confidence="0.2" />
>    <modality name="face" />
>  </affect>
> </complex-affect>
>
> Note that the modality information is kept within the individual emotions.
>
> This would do for a number of applications. You can add the time
> information, as suggested by the other group, to the individual component
> emotions to add complexity ;-)
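>
> For illustration only (the start/duration attributes below are just my
> guess at a notation, nothing agreed yet), the example above could then
> become:
>
> <complex-affect>
>  <affect start="0.0" duration="2.5">
>    <category level="2" set="everyday" name="pleasure" confidence="0.9" />
>    <modality name="voice" />
>  </affect>
>  <affect start="1.0" duration="1.5">
>    <category level="2" set="everyday" name="sad" confidence="0.2" />
>    <modality name="face" />
>  </affect>
> </complex-affect>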
>
> I might have overlooked something, though.
>
> Best,
>
> Christian
>
> --
> Schuller wrote:
>>
>> Dear all,
>>
>> here is my part for discussion in today's phone conference: core 1 and
>> core 6 examples (see below). As with Marc's mail today, this has not been
>> discussed yet.
>>
>> Best,
>>
>> Bjoern
>>
>> Core 1:
>> Type of Emotion-related phenomena:
>> ==================================
>>
>> [Variant 1]
>>
>> To address emotion-related phenomena, the most straightforward approach is
>> to extend the <emotion> tag by adding an <emotion-related-state> tag:
>>
>> What was before (example 3b for core 2, as agreed upon as the future model):
>>
>> <emotion>
>>  <category set="everyday" name="pleasure" confidence="0.9" />
>> </emotion>
>>
>> will then become:
>>
>> <emotion-related-state>
>>  <category set="Scherer" name="emotion" />
>>  <emotion>
>>    <category set="everyday" name="pleasure" confidence="0.9" />
>>  </emotion>
>> </emotion-related-state>
>>
>> or to e.g. specify a mood:
>>
>> <emotion-related-state>
>>  <category set="Scherer" name="mood" />
>>  <emotion>
>>    <category set="everyday" name="relaxed" confidence="0.8" />
>>  </emotion>
>> </emotion-related-state>
>>
>> Examples of sets as named in the requirements:
>>
>> Scherer set:
>> # Emotions
>> # Moods
>> # Interpersonal stances
>> # Preferences/Attitudes
>> # Affect dispositions
>>
>> HUMAINE set:
>>
>> # Attitudes
>> # Established emotion
>> # Emergent emotion (full-blown)
>> # Emergent emotion (suppressed)
>> # Moods
>> # Partial emotion (topic shifting)
>> # Partial emotion (simmering)
>> # Stance towards person
>> # Stance towards object/situation
>> # Interpersonal bonds
>> # Altered state of arousal
>> # Altered state of control
>> # Altered state of seriousness
>> # Emotionless
>>
>> The category specified directly below <emotion-related-state> indicates the
>> category of emotion-related state, i.e. whether the described event is an
>> emotion, mood, interpersonal stance, etc. The set refers to the set of
>> possible states, e.g. the HUMAINE set or the one by Scherer (see above, and
>> section 2.1 of http://www.w3.org/2005/Incubator/emotion/XGR-emotion/).
>>
>> [Variant 2]
>>
>> The name of the nested <emotion> tag in Variant 1 can be confusing, since
>> its contents may now describe a mood instead of an emotion.
>> A solution that also opens up the possibility of a tree-like hierarchical
>> categorization of emotions or emotion-related states is to use one tag for
>> all levels of the hierarchy. This can, however, only be done if the XML
>> parsers used for EmotionML allow nesting of tags with the same name. Such a
>> common tag could be called <affect>, for example:
>>
>> <affect>
>>  <category set="Scherer" name="mood" />
>>  <affect>
>>    <category set="everyday" name="relaxed" confidence="0.8" />
>>  </affect>
>> </affect>
>>
>> or for an emotion:
>>
>> <affect>
>>  <category set="Scherer" name="emotion" />
>>  <affect>
>>    <category set="everyday" name="pleasure" confidence="0.9" />
>>  </affect>
>> </affect>
>>
>> This approach does not require an extra "emotion-related-state" tag.
>> Further, it allows a tree-like categorization hierarchy to be specified at
>> arbitrary depth. For example, the first level is used to distinguish which
>> type of emotion-related phenomenon is specified (mood, emotion, etc.); the
>> second level, instead of specifying the category of emotion directly, can
>> specify on a general level whether the emotion is positive or negative;
>> further levels can then extend this. Example:
>>
>> <affect>
>>  <category set="Scherer" name="emotion" />
>>  <affect>
>>    <category set="posneg" name="positive" />
>>    <affect>
>>      <category set="everyday" name="pleasure" confidence="0.9" />
>>    </affect>
>>  </affect>
>> </affect>
>>
>>
>> An advantage of this extended hierarchy is that if an application is only
>> interested in a general idea of whether the person is conveying a positive
>> or negative emotion, the information is easily accessible. Another advantage
>> is that an application reading the <affect> tags does not need to know all
>> category sets, especially the more detailed ones, and can still interpret
>> parts of the information.
>>
>> [Variant 3a]
>>
>> A simpler variation of Variant 2 is to avoid the nesting, specify multiple
>> categories and add a level attribute:
>>
>> <affect>
>>  <category level="0" set="Scherer" name="emotion" />
>>  <category level="1" set="posneg" name="positive" />
>>  <category level="2" set="everyday" name="pleasure" confidence="0.9" />
>>  <modality name="voice" />
>>  ...
>> </affect>
>>
>> or introduce the tags <category0> to <category9>, which makes things
>> simpler but limits the number of levels to 10:
>>
>> <affect>
>>  <category0 set="Scherer" name="emotion" />
>>  <category1 set="posneg" name="positive" />
>>  <category2 set="everyday" name="pleasure" confidence="0.9" />
>>  <modality name="voice" />
>>  ...
>> </affect>
>>
>> [Variant 3b]
>>
>> A suggestion on how to specify the tree-like categories more compactly is to
>> use a certain style for the category names, i.e. where the levels are
>> separated by a colon (:). The three lines
>>
>>  <category0 set="Scherer" name="emotion" />
>>  <category1 set="posneg" name="positive" />
>>  <category2 set="everyday" name="pleasure" confidence="0.9" />
>>
>> would be combined into one line:
>>
>>  <category set="Scherer:posneg:everday" name="emotion:positive:pleasure"
>> />
>>
>> In order to avoid the overhead of having to specify the full set each
>> time an emotion is to be specified,
>> meta-sets (like variables) could be defined at the beginning:
>>  <meta-set name="spe" value="Scherer:posneg:everyday" />
>> The above category line would then become:
>>  <category meta-set="spe" name="emotion:positive:pleasure" />
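>>
>> Put together (assuming such a meta-set mechanism were adopted; this is only
>> a sketch), a fragment could look like:
>>
>>  <meta-set name="spe" value="Scherer:posneg:everyday" />
>>  <affect>
>>    <category meta-set="spe" name="emotion:positive:pleasure" confidence="0.9" />
>>    <modality name="voice" />
>>  </affect>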
>>
>> The disadvantage of Variant 3 is the lack of generality. Variant 2 would
>> theoretically allow the specification of multiple <affect> elements below a
>> <category> tag on the same level, indicating different sub-categories for
>> different modalities, for example (see the sketch below). Therefore Variant
>> 2 is the favored one if hierarchical emotion tagging is of interest. Other
>> than that, Variant 1 is the most "straightforward".
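>>
>> A sketch of what is meant (the placement of <modality> inside the
>> sub-elements is my own reading, nothing agreed):
>>
>> <affect>
>>  <category set="Scherer" name="emotion" />
>>  <affect>
>>    <category set="everyday" name="pleasure" confidence="0.9" />
>>    <modality name="voice" />
>>  </affect>
>>  <affect>
>>    <category set="everyday" name="anger" confidence="0.6" />
>>    <modality name="face" />
>>  </affect>
>> </affect>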
>>
>> However, at the same time, in a final standard both variants could
>> co-exist, and maybe they should. If the user requires a compact
>> representation and is fine with the restrictions, then she should be free to
>> use Variant 3a or even Variant 3b. Variant 2 could be supported alongside
>> them because it provides greater flexibility.
>>
>> Core 6:
>> Multiple and/or complex emotions:
>> =================================
>>
>> Due to the complexity of this subject it is too restrictive to use tags to
>> combine emotions (as is done in EARL with the <complex-emotion> tag) as the
>> only mechanism to deal with complex emotions.
>> For a general and very flexible specification to deal with complex
>> emotions, the simplest method from the language specification point of
>> view is to add a timestamp and duration attribute to every <affect> or
>> <emotion-related-state> tag (whatever it will be called, see the draft for
>> emotion-related phenomena). For a complex emotion, multiple <affect> tags
>> with the same time attributes can be specified. Of course, this places
>> higher demands on the parsing application. It must internally align all
>> parsed emotion/affect events on a timeline and then combine events that
>> occur simultaneously. However, the flexibility of this approach is enormous.
>> Complex emotions, where one part begins earlier than the other or one
>> emotion is suppressed only at certain times, can be annotated without
>> hassle.
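>>
>> A sketch of what such time attributes could look like (the attribute names
>> "start" and "duration" are placeholders, no syntax has been agreed yet):
>>
>>  <affect start="0.0" duration="3.0">
>>    <category set="everyday" name="pleasure" confidence="0.9" />
>>  </affect>
>>  <affect start="1.5" duration="1.5">
>>    <category set="everyday" name="anger" confidence="0.6" />
>>  </affect>
>>
>> A parsing application would align these on a timeline and treat the
>> overlapping interval (1.5 to 3.0) as a complex emotion.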
>>
>> To make parsing and processing of complex emotions easier, it might,
>> however, be necessary to add a <complex-emotion> or <complex-affect>
>> container that can group together multiple emotion-related phenomena
>> occurring in parallel. Example:
>>
>> [Variant 1]
>>
>> <complex-affect>
>>  <affect>
>>    <category set="Scherer" name="emotion" />
>>    <affect>
>>      <category set="everyday" name="pleasure" confidence="0.9" />
>>    </affect>
>>  </affect>
>>  <affect>
>>    <category set="Scherer" name="emotion" />
>>    <affect>
>>      <category set="everyday" name="anger" confidence="0.9" />
>>    </affect>
>>  </affect>
>> </complex-affect>
>>
>> [Variant 2]
>>
>> Another possibility is to link one affect tag to another without using a
>> container tag. Let us use a <related link="#id-to-link-to" /> tag here to
>> illustrate the concept, without proposing this as a good solution. Probably
>> the tag should be renamed to something more meaningful, or existing ways of
>> linking should be used here to ensure easier compatibility.
>>
>>  <affect id="1">
>>    <related link="#2" />
>>    <category set="Scherer" name="emotion" />
>>    <affect>
>>      <category set="everyday" name="pleasure" confidence="0.9" />
>>    </affect>
>>  </affect>
>>  <affect id="2">
>>    <related link="#1" />
>>    <category set="Scherer" name="emotion" />
>>    <affect>
>>      <category set="everyday" name="anger" confidence="0.9" />
>>    </affect>
>>  </affect>
>>
>> All of the above methods could be supported in parallel in the standard,
>> because they might all have advantages for specific applications/parsers.
>>
>>
>>
>> -------------------------------------------
>> Dr. Björn Schuller
>> Lecturer
>> Technische Universität München
>> Institute for Human-Machine Communication
>> Theresienstraße 90
>> Building N1, ground level
>> Room N0135
>> D-80333 München
>> Germany
>> Fax: ++49 (0)89 289-28535
>> Phone: ++49 (0)89 289-28548
>> schuller@tum.de
>> www.mmk.ei.tum.de/~sch
>> -------------------------------------------
>>
>
> --
> ------------------------------------------------------------------------
> Christian Peter
> Fraunhofer Institute for Computer Graphics Rostock
> Usability and Assistive Technologies
> Joachim-Jungius-Str. 11, 18059 Rostock, Germany
> Phone: +49 381 4024-122, Fax: +49 381 4024-199
> email: christian.peter@igd-r.fraunhofer.de
> ------------------------------------------------------------------------
>
>



-- 
ideas online - ideas, consulting & conception
www.id-o.de | bailey@id-o.de | 09131-9201018

Received on Thursday, 31 July 2008 12:28:05 UTC