W3C home > Mailing lists > Public > public-xg-emotion@w3.org > October 2006

Emotion AI / Attention AI use cases

From: Ian Wilson <ian@neon.ai>
Date: Mon, 09 Oct 2006 11:40:15 +0900
Message-ID: <4529B68F.3070100@neon.ai>
To: public-xg-emotion@w3.org
Dear All,

Firstly let me apologize for the timing of my submission.

I have three submissions, which are in the enclosed document and which
I have also added to the wiki.

I am also adding the text here:


      Emotion AI / Attention AI Use Cases for W3C Emotion XG


      Use case 3c: Event-based triggering of generated emotional
      behavior for real-time characters, for example web-based user avatars

   1.

      Ian has developed an engine that generates facial gestures, body
      gestures, and actions consistent with a given character's age,
      gender, and personality. One application is a web-based visual
      representation of a real person: we would like to allow users to
      add visual representations of their friends to their blog or web
      site, for example.

   2.

      For each character to represent its own user, it needs to update
      the visual representation. This can be achieved using "event"
      data received from the user: a locally installed emotion engine
      can use this data to drive, for example, a 3D character that
      represents the emotional state of a friend.

   3.

      Events would be generated remotely, for example by actions taken
      by the friend being represented. These events would be sent to
      the user's local emotion engine, which would process the events,
      update the model of the friend's emotional state (emotion
      dimensions), and then map those dimensional values to facial
      gesture and body gesture parameters and actions.

*Requirements:*

/character configuration / user's description:/

    * personality (emotion) dimensions
    * age
    * gender

/current event data:/

    * reward value
    * penalty value
    * confidence value
    * id
    * time stamp
    * category (context)
    * type

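The event data listed above could be carried as a simple structured
payload. A minimal sketch in Python, with hypothetical field names
mirroring the requirements (none of these names are defined by the use
case itself):

```python
from dataclasses import dataclass


@dataclass
class EmotionEvent:
    """One remotely generated event driving a friend's avatar."""
    id: str            # unique event identifier
    timestamp: float   # time stamp, e.g. seconds since the epoch
    category: str      # context, e.g. "blog-comment"
    type: str          # event type, e.g. "received-praise"
    reward: float      # positive appraisal strength
    penalty: float     # negative appraisal strength
    confidence: float  # certainty of the appraisal, 0.0 to 1.0
```

A receiving engine could then, for instance, fold the reward and
penalty values, weighted by confidence, into its model of the friend's
emotion dimensions.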

      Use case 3d: Directly driving the emotional behavior of real-time
      characters, for example web-based user avatars

   1.

      Ian has developed an engine that generates facial gestures, body
      gestures, and actions consistent with a given character's age,
      gender, and personality. One application is a web-based visual
      representation of a real person: we would like to allow users to
      add visual representations of their friends to their blog or web
      site, for example.

   2.

      For each character to represent its own user, it needs to update
      the visual representation. This can be achieved using received
      data that directly drives the facial gestures, body gestures,
      and actions: a locally installed emotion engine can use this
      data to drive, for example, a 3D character that represents the
      emotional state of a friend.

   3.

      A remote emotion engine (on the friend's system, for example)
      would generate emotion dimension parameters. The user would
      receive this data, and their local emotion engine would map
      those dimensions to gesture parameters and actions, using that
      data to update the visual representation of their friend.

*Requirements:*

/emotional behavior data:/

    * personality (emotion) dimensions

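The mapping described in step 3 could be sketched as a function from
emotion dimensions to gesture parameters. A hypothetical example
assuming a two-dimensional valence/arousal model (the use case does not
specify which dimensions or parameter names are used):

```python
def dimensions_to_gestures(valence: float, arousal: float) -> dict:
    """Map emotion dimensions (each in -1.0..1.0) to gesture parameters.

    Purely illustrative: a real engine would also factor in the
    character's age, gender, and personality configuration.
    """
    return {
        "smile": max(0.0, valence),            # positive valence smiles
        "frown": max(0.0, -valence),           # negative valence frowns
        "gesture_speed": 0.5 + 0.5 * arousal,  # arousal speeds up motion
        "posture_openness": 0.5 + 0.25 * valence + 0.25 * arousal,
    }
```

The local engine would apply such a mapping to the dimension values
received from the friend's remote engine, then render the resulting
gesture parameters on the 3D character.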

      Use case 3e: Event-based triggering of attention filtering to
      prioritize interesting stock movements

   1.

      Ian has developed an engine that uses a core functional property
      of emotional behavior, the prioritizing of and attending to
      important real-time events within a stream of complex events. He
      wishes to apply this system to the task of prioritizing real-time
      stock quotes and alerting users to data that they, personally,
      would find important, surprising, and interesting.

   2.

      A user would personalize the system to match their own
      personality (or a different one, should they so wish), so that
      the system's behavior would roughly match the user's own if they
      were physically monitoring the real-time stream of stock data.
      The system would present the user with only the information it
      determined to be interesting at any point in time. The
      presentation could range from a simple text alert to a more
      complex visual representation.

   3.

      A central server could receive the stream of real-time events,
      assign values to each, and then send those packaged events to
      each user, where their own, personally configured, system would
      determine the importance of that particular event to that
      particular user.

*Requirements:*

/system configuration / user's description:/

    * personality (emotion) dimensions
    * age
    * gender

/current event data:/

    * reward value
    * penalty value
    * confidence value
    * id
    * time stamp
    * category (context)
    * type

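The server/client split in use case 3e could be sketched as follows:
the central server assigns reward, penalty, and confidence values to
each event, and the user's personally configured system scores them
against an interest threshold. The function names, sensitivity
parameters, and scoring formula below are illustrative assumptions,
not part of the use case:

```python
def interest_score(event: dict, sensitivity_reward: float,
                   sensitivity_penalty: float) -> float:
    """Score an event for one user; higher means more attention-worthy.

    The reward and penalty values assigned by the central server are
    weighted by the user's personality-derived sensitivities and
    scaled by the server's confidence in its appraisal.
    """
    raw = (event["reward"] * sensitivity_reward
           + event["penalty"] * sensitivity_penalty)
    return raw * event["confidence"]


def filter_events(events, sensitivity_reward, sensitivity_penalty,
                  threshold):
    """Keep only the events interesting enough to alert this user about."""
    return [e for e in events
            if interest_score(e, sensitivity_reward,
                              sensitivity_penalty) >= threshold]
```

Two users receiving the same packaged events would thus see different
alerts, because each applies their own sensitivities and threshold
locally.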

Best Regards,

Ian Wilson
Received on Monday, 9 October 2006 16:42:49 GMT
