How do brains encode facial expression movements?
- Speaker: Nick Furl
- Date & Time: Thursday 03 November 2011, 14:15 - 15:15
- Venue: Rainbow Room (SS03), Computer Laboratory
Abstract
Nick Furl is a neuroscientist at the MRC Cognition and Brain Sciences Unit, where he uses human brain imaging to investigate face perception. Recognition of facial expressions from dynamic stimuli presents a difficult computational problem, and little is known about how the brain solves it. Unfortunately, prevailing theory is based almost entirely on studies using static photographs. Nick's approach is to use video stimuli to examine how brain areas respond to facial movement information and to test how their responses relate to representations of facial expression categories. He will discuss previous neuroscience research on movement perception, including his recent study showing that facial expression categories can be decoded from movement-sensitive areas in the monkey brain. This raises the hypothesis that these movement-sensitive areas recognise expression categories from motion cues. The next step is to quantify specific motion cues from video, a field in which computer vision has already made much progress. These quantifications can then be related to imaging data to discover how the brain assembles motion information into representations of facial expression categories.
Series: This talk is part of the Rainbow Group Seminars series.