BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Differential Privacy and Probabilistic Inference - Oliver Williams
  (Microsoft Research)
DTSTART:20100621T100000Z
DTEND:20100621T110000Z
UID:TALK25096@talks.cam.ac.uk
CONTACT:Emli-Mari Nel
DESCRIPTION:Differential privacy is a recent privacy definition that permi
 ts only indirect observation of data held in a database through noisy meas
 urement. I will show that there is a strong connection between differentia
 l privacy and probabilistic inference. Previous research on learning from 
 data protected by differential privacy has been driven by researchers inve
 nting sophisticated learning algorithms which are applied directly to the 
 data and output model parameters which can be proven to respect the priva
 cy of the data set. Proving these privacy properties requires an intricate
  analysis of each algorithm on a case-by-case basis. While this does resul
 t in many valuable algorithms and results\, it is not a scalable solution 
 for two reasons: first\, to solve a new learning problem\, one must invent
  and analyze a new privacy-preserving algorithm\; second\, one must then c
 onvince the owner of the data to run this algorithm. Both of these steps a
 re challenging. In contrast\, I will consider the potential of applying pr
 obabilistic inference to the measurements and measurement process to deri
 ve posterior distributions over the data sets and model parameters thereof
 \, showing that for the models investigated\, probabilistic inference can 
 improve accuracy\, integrate multiple observations\, measure uncertainty\,
  and even provide posterior distributions over quantities that were not di
 rectly measured.
LOCATION:TCM Seminar Room\, Cavendish Laboratory\, Department of Physics
END:VEVENT
END:VCALENDAR
