BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Coconut: Optimizing computations for machine learning - Andrew Fit
 zgibbon (Microsoft Research Cambridge)
DTSTART:20090929T130000Z
DTEND:20090929T140000Z
UID:TALK19665@talks.cam.ac.uk
CONTACT:Zoubin Ghahramani
DESCRIPTION:Matrix-vector notation is the predominant idiom in which machi
 ne learning formulae\nare expressed\; some models\, like Gaussian processe
 s [5]\, would be extremely\ndifficult to describe without it. Turning a ma
 trix expression into a computer program\nis not always easy\, however. Alt
 hough good implementations of primitive\nmatrix operations are availabl
 e [2]\, as are packages like MATLAB [6]\, which provide\na high-level i
 nterface to these primitives\, two important tasks must still be\ncarri
 ed out manu
 ally: (i) computing derivatives of matrix functions and (ii) turning\na ma
 trix expression into an efficient computer program. Not having tools to do
 \nthis can and does harm research: even for the relatively simple example 
 of fitting a\nlinear regression model with gradient methods\, the number o
 f types and combinations\nof basis functions a researcher can experiment w
 ith is limited by the need to\nmanually differentiate the objective functi
 on and write code for each version. We\nhave addressed these issues by com
 bining a symbolic matrix algebra engine with\na superoptimizing compiler: 
 an interesting learning problem in itself. We call our\nsystem Coconut.\n
LOCATION:Engineering Department\, CBL Room 438
END:VEVENT
END:VCALENDAR
