Approximating the Kullback-Leibler Divergence Between GMMs
- Speaker: Rogier van Dalen (University of Cambridge)
- Date & Time: Thursday 23 October 2008, 14:00 - 15:30
- Venue: Engineering Department, CBL Room 438
Abstract
The Kullback-Leibler (KL) divergence is a widely used tool. For two Gaussian mixture models, however, the KL divergence has no closed-form expression, so exact computation is intractable and it must be approximated. Hershey and Olsen (IEEE International Conference on Acoustics, Speech, and Signal Processing 2007, http://pederao.googlepages.com/KLdiv.pdf) discuss various ways of approximating it.
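One of the approximations the paper considers is plain Monte Carlo sampling: draw samples from one mixture and average the log-ratio of the two densities. The sketch below (not code from the talk or the paper) illustrates the idea for one-dimensional mixtures with hypothetical parameters, using NumPy/SciPy.

```python
# A minimal sketch of the Monte Carlo estimator of KL(f || g) between two GMMs:
# draw x_i ~ f and average log f(x_i) - log g(x_i).
# All mixture parameters below are illustrative, not from the talk.
import numpy as np
from scipy.stats import norm

def gmm_logpdf(x, weights, means, stds):
    """Log-density of a 1-D Gaussian mixture at points x."""
    # Stack per-component log-densities and combine with log-sum-exp.
    comp = np.stack([np.log(w) + norm.logpdf(x, m, s)
                     for w, m, s in zip(weights, means, stds)])
    return np.logaddexp.reduce(comp, axis=0)

def kl_monte_carlo(f, g, n=100_000, rng=None):
    """Estimate KL(f || g); f and g are (weights, means, stds) tuples."""
    rng = np.random.default_rng(rng)
    w, m, s = (np.asarray(a, dtype=float) for a in f)
    # Sample component indices from the mixture weights, then draw
    # from the chosen Gaussian components.
    idx = rng.choice(len(w), size=n, p=w / w.sum())
    x = rng.normal(m[idx], s[idx])
    return np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g))

# Two illustrative mixtures; the estimate converges at rate O(1/sqrt(n)),
# which is why cheaper deterministic approximations are of interest.
f = ([0.3, 0.7], [-1.0, 2.0], [0.5, 1.0])
g = ([0.5, 0.5], [0.0, 2.5], [1.0, 1.0])
print(kl_monte_carlo(f, g, n=200_000, rng=0))
```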
Series
This talk is part of the Machine Learning Reading Group @ CUED series.
Included in Lists
- All Talks (aka the CURE list)
- bld31
- Cambridge Centre for Data-Driven Discovery (C2D3)
- Cambridge Forum of Science and Humanities
- Cambridge Language Sciences
- Cambridge talks
- Cambridge University Engineering Department Talks
- Centre for Smart Infrastructure & Construction
- Chris Davis' list
- Computational Continuum Mechanics Group Seminars
- custom
- Engineering Department, CBL Room 438
- Featured lists
- Guy Emerson's list
- Hanchen DaDaDash
- Inference Group Journal Clubs
- Inference Group Summary
- Information Engineering Division seminar list
- Interested Talks
- Machine Learning Reading Group
- Machine Learning Reading Group @ CUED
- Machine Learning Summary
- ML
- ndk22's list
- ob366-ai4er
- Quantum Matter Journal Club
- Required lists for MLG
- rp587
- School of Technology
- Simon Baker's List
- TQS Journal Clubs
- Trust & Technology Initiative - interesting events
- yk373's list
- yk449