BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Pseudo-Points and State-Space Gaussian Processes - Will Tebbutt (U
 niversity of Cambridge)
DTSTART:20210609T100000Z
DTEND:20210609T113000Z
UID:TALK158839@talks.cam.ac.uk
CONTACT:Elre Oldewage
DESCRIPTION:There have been a couple of interesting papers [1\, 2] combini
 ng pseudo-point approximations and Markov structure in GPs to produce even
  more scalable approximate inference techniques for long time series. We
 ’ll review these\, in particular the tricks needed to make them work. A 
 basic familiarity with GPs and pseudo-point approximations [3\, 4] would b
 e helpful\, as would having skimmed [5].\n\n[1] - Adam\, Vincent\, et
  al. "Doubly sparse variational Gaussian processes." International Confere
 nce on Artificial Intelligence and Statistics. PMLR\, 2020.\n\n[2] - Wilki
 nson\, William\, Arno Solin\, and Vincent Adam. "Sparse Algorithms for Mar
 kovian Gaussian Processes." International Conference on Artificial Intelli
 gence and Statistics. PMLR\, 2021.\n\n[3] - Titsias\, Michalis. "Variation
 al learning of inducing variables in sparse Gaussian processes." Artificia
 l Intelligence and Statistics. PMLR\, 2009.\n\n[4] - Hensman\, James\, Nic
 olò Fusi\, and Neil D. Lawrence. "Gaussian Processes for Big Data." Proce
 edings of the Twenty-Ninth Conference on Uncertainty in Artificial Intelli
 gence. 2013.\n\n[5] - Opper\, Manfred\, and Cédric Archambeau. "The varia
 tional Gaussian approximation revisited." Neural computation 21.3 (2009): 
 786-792.
LOCATION:https://eng-cam.zoom.us/j/82019956685?pwd=WUNSVVcrdC9IZGxQOHFhSTh
 jUjd2dz09
END:VEVENT
END:VCALENDAR
