BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Estimating entropy rates with confidence intervals - Matt Kennel\,
  UC San Diego
DTSTART:20070329T140000Z
DTEND:20070329T150000Z
UID:TALK6975@talks.cam.ac.uk
CONTACT:Taylan Cemgil
DESCRIPTION:The entropy rate quantifies the amount of uncertainty or disor
 der produced by any dynamical system. In a spiking neuron\, this unce
 rtainty translates into the amount of information potentially encoded a
 nd thus the subject of intense theoretical and experimental investig
 ation. Estimating this quantity in observed\, experimental data is di
 fficult and requires a judicious selection of probabilistic models\, bal
 ancing between two opposing biases. We use a model weighting principl
 e originally developed for lossless data compression\, following the m
 inimum description length principle. This weighting yields a direct e
 stimator of the entropy rate\, which\, compared to existing methods\, ex
 hibits significantly less bias and converges faster in simulation. Wi
 th Monte Carlo techniques\, we estimate a Bayesian confidence interva
 l for the entropy rate. In related work\, we apply these ideas to est
 imate the information rates between sensory stimuli and neural respon
 ses in experimental data.
LOCATION:LR5\, Department of Engineering
END:VEVENT
END:VCALENDAR
