BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Learning with nonparametric dependence and divergence estimation -
  Barnabas Poczos (Carnegie Mellon University)
DTSTART:20120508T100000Z
DTEND:20120508T110000Z
UID:TALK38025@talks.cam.ac.uk
CONTACT:Zoubin Ghahramani
DESCRIPTION:Estimation of dependencies and divergences is among the fundam
 ental\nproblems of statistics and machine learning. While information the
 ory\nprovides standard measures for them (e.g. Shannon mutual information\
 ,\nKullback-Leibler divergence)\, it is still unknown how to estimate\nthe
 se quantities in the most efficient way. We could use density\nestimators\
 , but in high-dimensional domains they are known to suffer\nfrom the curse
  of dimensionality. Therefore\, it is of great importance\nto know which f
 unctionals of densities can be estimated efficiently in\na direct way\, wi
 thout estimating the density. Using tools from\nEuclidean random graph opt
 imization\, copula transformation\, and\nreproducing kernel Hilbert spaces
 \, we will discuss consistent\ndependence and divergence estimators that a
 void density estimation.\nThese estimators allow us to generalize classifi
 cation\, regression\,\nanomaly detection\, low-dimensional embedding\, and
  other machine\nlearning algorithms to the space of sets and distributions
 . We\ndemonstrate the power of our methods by beating the best published\n
 results on several computer vision and independent component analysis\nben
 chmarks. We also show how our perspective on learning from\ndistributions 
 allows us to define new analyses in astronomy and fluid\ndynamics simulati
 ons.
LOCATION:Engineering Department\, CBL Room BE-438
END:VEVENT
END:VCALENDAR
