Synthesizing Expressions using Facial Feature Point Tracking: How Emotion is Conveyed
- Speaker: Tadas Baltrusaitis (University of Cambridge)
- Date & Time: Thursday 14 October 2010, 14:15-15:15
- Venue: Computer Laboratory, William Gates Building, Room SS03
Abstract
Many approaches to the analysis and synthesis of facial expressions rely on automatically tracking landmark points on human faces. However, these points are usually chosen for ease of tracking rather than for their ability to convey affect. We have conducted an experiment that evaluated the perceptual importance of 22 such automatically tracked feature points in a mental state recognition task. The experiment compared mental state recognition rates of participants who viewed videos of human actors and of synthetic characters (a physical android robot, a virtual avatar, and virtual stick figure drawings) enacting various facial expressions.
In this talk I will present the results of our experiment and the implications they have for facial feature analysis and synthesis.
Series: This talk is part of the Rainbow Interaction Seminars series.
Included in Lists
- Cambridge talks
- Computer Laboratory, William Gates Building, Room SS03
- Human-Computer Interaction
- Rainbow Interaction Seminars
- Trust & Technology Initiative - interesting events
- yk449