BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Neighbourhood Components Analysis - Sam Roweis\, University of Tor
 onto
DTSTART:20050607T150000Z
DTEND:20050607T160000Z
UID:TALK4380@talks.cam.ac.uk
CONTACT:Phil Cowans
DESCRIPTION:Say you want to do K-Nearest Neighbour classification. Besides
 \nselecting K\, you also have to choose a distance function\, in order\nt
 o define "nearest". I'll talk about two new methods for *learning*\n-- fr
 om the data itself -- a distance measure to be used in\nKNN classificatio
 n. One algorithm\, Neighbourhood Components\nAnalysis (NCA)\, directly ma
 ximizes a stochastic variant of the\nleave-one-out KNN score on the train
 ing set. The other (just\nsubmitted to NIPS!) tries to collapse all point
 s in the same\nclass as close together as possible. Both algorithms can a
 lso\nlearn a low-dimensional linear embedding of labeled data that can\nb
 e used for data visualization and very fast classification in high\ndimen
 sions. Of course\, the resulting classification model is non-parametric\,
 \nmaking no assumptions about the shape of the class distributions or\nth
 e boundaries between them.\n(Joint work with Jacob Goldberger and Amir Gl
 oberson)
LOCATION:Room 911\, Cavendish Laboratory
END:VEVENT
END:VCALENDAR
