BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Inference in Stochastic Processes - Javier Antoran (University of 
 Cambridge)\, Matthew Ashman (University of Cambridge)\, Stratis Markou (Un
 iversity of Cambridge)
DTSTART:20210224T110000Z
DTEND:20210224T123000Z
UID:TALK156730@talks.cam.ac.uk
CONTACT:Elre Oldewage
DESCRIPTION:In parametric models\, probabilistic inference is most often a
 pproached by computing a posterior distribution over model weights. These 
 weights are then marginalised to obtain a distribution over functions and 
 make predictions. If our goal is solely to make good predictions\, an appe
 aling alternative is to directly perform inference over the ‘function-sp
 ace’ or predictive posterior distribution of our models\, without consid
 ering the posterior distribution over the weights. Using Gaussian Processe
 s (GPs) as motivation\, this talk starts by introducing a method for const
 ructing more general stochastic processes based on combining basis functio
 ns with random weights. We discuss recent research on performing approxima
 te inference in the function space of neural networks. Finally\, we provid
 e a brief introduction to Stochastic Differential Equations (SDEs). We dis
 cuss the connection of linear SDEs to GPs and Kalman filtering and smoothi
 ng\, and present a recent method for performing inference and learning in 
 nonlinear SDEs.\n\nRecommended reading\n\n# Rasmussen & Williams\, “Gau
 ssian Processes for Machine Learning”\, Chapter 2.2: “Function-space Vi
 ew”\, pages 13-18\n# Burt et al.\, “Understanding Variational Inferenc
 e in Function-Space”\, 2020\n# Archambeau\, Cédric\, et al.\, “Variati
 onal inference for diffusion processes”\, 2008\n
LOCATION:https://eng-cam.zoom.us/j/86068703738?pwd=YnFleXFQOE1qR1h6Vmtwbno
 0LzFHdz09
END:VEVENT
END:VCALENDAR
