BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Movement expressivity analysis in affective computers: from recogn
 ition to expression of emotion - Dr. Ginevra Castellano\, Queen Mary Unive
 rsity of London
DTSTART:20081113T141500Z
DTEND:20081113T151500Z
UID:TALK15090@talks.cam.ac.uk
CONTACT:Laurel D. Riek
DESCRIPTION:One of the challenging issues in human-computer interacti
 on is to endow a machine with an emotional\, affective intelligence. E
 motionally sensitive systems must be able to create an affective inter
 action with users: they must be provided with the ability to recognis
 e\, express and regulate emotions. Computers endowed with these abilit
 ies are called affective computers. This talk investigates the role o
 f human movement expressivity as a source of affective information. I
 t focuses on video-based analysis algorithms\, techniques and approach
 es for using movement expressivity information for the development o
 f different paradigms of interaction in affective computers. First\, a comp
 utational approach for emotion recognition based on the expressivity o
 f human movement is presented. Expressive motion cues (e.g.\, movemen
 t qualities such as the quantity of motion\, fluidity\, etc.) are cons
 idered as a source of affective information. A model that defines feat
 ures retaining information about the temporal dynamics of expressive m
 otion cues is proposed. Second\, we present the design of an affectiv
 e interaction between a user and an Embodied Conversational Agent base
 d on movement expressivity. Movement expressivity is used to increas
 e the user's engagement by generating an affective loop: the user's e
 xpressive movement is analysed in real-time and used to generate an ex
 pressive behaviour in the agent. Finally\, an affective interactive sy
 stem enabling users to make use of the body as an expressive interfac
 e is presented. In this system\, movement expressivity is used to cont
 rol the real-time generation of audio-visual feedback.
LOCATION:SS03
END:VEVENT
END:VCALENDAR
