Some Applications of the Kullback-Leibler Divergence Rate in Hidden Markov Models
- Speaker: Dr. M. Vidyasagar (Executive Vice President, Tata Consultancy Services)
- Date & Time: Wednesday 09 April 2008, 11:30 - 12:30
- Venue: Cambridge University Engineering Department, Lecture Room 6
Abstract
The Kullback-Leibler (K-L) divergence rate between stochastic processes is a generalization of the familiar K-L divergence between probability vectors. In this talk, the K-L divergence rate is introduced, and an easy derivation is given of the K-L divergence rate between two Markov chains over a common alphabet. This formula is used to solve the problem of approximating a Markov chain with “long” memory by another with “short” memory in an optimal fashion. The difficulties in extending this result to HMMs are explained. Finally, a geometrically convergent estimate for the K-L divergence rate between two HMMs is provided.
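As background to the abstract: for two ergodic Markov chains over a common finite alphabet, with transition matrices P and Q, the K-L divergence rate has the well-known closed form R(P‖Q) = Σᵢ πᵢ Σⱼ Pᵢⱼ log(Pᵢⱼ/Qᵢⱼ), where π is the stationary distribution of P. The sketch below (not from the talk itself; it assumes both chains are irreducible and that Qᵢⱼ > 0 wherever Pᵢⱼ > 0) shows how this quantity can be computed numerically.

```python
import numpy as np

def stationary_distribution(P):
    """Stationary distribution pi of an ergodic chain, solving pi P = pi
    (the left eigenvector of P for eigenvalue 1, normalized to sum to 1)."""
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    return pi / pi.sum()

def kl_divergence_rate(P, Q):
    """K-L divergence rate R(P || Q) = sum_i pi_i sum_j P_ij log(P_ij / Q_ij),
    where pi is the stationary distribution of P.  Requires Q_ij > 0
    wherever P_ij > 0; otherwise the rate is infinite."""
    pi = stationary_distribution(P)
    mask = P > 0                       # zero entries of P contribute nothing
    terms = np.zeros_like(P, dtype=float)
    terms[mask] = P[mask] * np.log(P[mask] / Q[mask])
    return float(np.sum(pi[:, None] * terms))
```

For example, the rate is zero when the two chains coincide and strictly positive otherwise; the "long memory to short memory" approximation problem mentioned in the abstract amounts to minimizing this rate over the lower-order chain Q.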
Series: This talk is part of the CUED Control Group Seminars series.