BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Optimal integration of top-down and bottom-up uncertainty in human
 s\, monkeys\, and neural networks - Ahmad Qamar (University of Chicago)
DTSTART:20120620T140000Z
DTEND:20120620T150000Z
UID:TALK38682@talks.cam.ac.uk
CONTACT:Zoubin Ghahramani
DESCRIPTION:Hearing the sound of rustling leaves in the forest might indic
 ate the presence of a predator or just be the effect of the wind. There ar
 e two difficulties in determining the source of the sound. First\, both so
 urces can cause similar sounds\, and second\, the sound is corrupted by se
 nsory noise. In this common decision paradigm\, uncertainty associated wit
 h previous knowledge (top-down uncertainty) is combined with uncertainty a
 ssociated with sensory information from the environment (bottom-up uncerta
 inty). The capability of the brain to make decisions under both types of u
 ncertainty is key for survival. We studied this kind of decision-making in
  both humans and macaque monkeys using an orientation classification task.
 Stimuli were oriented patterns whose orientations were drawn from either 
 a narrow or a wide normal distribution\, with the same mean. Each distrib
 ution defined a class. Top-down uncertainty stemmed from the overlap of th
 e two distributions. Bottom-up uncertainty was manipulated through the con
 trast of the stimulus. The Bayes-optimal observer in this task chooses\, o
 n each trial\, the class with the highest posterior probability. Computing
  the posterior is nontrivial because marginalization over stimulus orienta
 tion is required. The optimal strategy amounts to using a decision criteri
 on that varies according to the trial-by-trial bottom-up uncertainty. Baye
 sian model comparison revealed that this model describes human and monkey 
 data better than models with non-optimal criteria. We proceeded to constru
 ct a neural network that behaves like the optimal observer under biologica
 lly plausible forms of neural variability. Within the framework of Poisson
 -like population codes\, we trained neural networks with different types o
 f operations to approximate the optimal posterior over class. A network wi
 th divisive normalization operations was sufficient to perform well in thi
 s non-trivial task. Our results demonstrate that humans\, monkeys\, and ne
 ural networks can optimally integrate top-down and bottom-up uncertainty.
LOCATION:Engineering Department\, CBL Room 438
END:VEVENT
END:VCALENDAR
