Symmetrized KL Information: Channel Capacity and Learning Theory
- Speaker: Dr Gholamali Aminian, Alan Turing Institute
- Date & Time: Wednesday 06 November 2024, 14:00–15:00
- Venue: MR5, CMS Pavilion A
Abstract
This talk explores applications of symmetrized KL information across information theory and machine learning. We begin by introducing this information measure and its key properties. We then demonstrate its utility in deriving an upper bound on the capacity of the Poisson channel. Moving to learning theory, we show how symmetrized KL information provides novel insights into generalization error analysis. In particular, we present an exact characterization of the generalization error of the Gibbs algorithm (a.k.a. the Gibbs posterior) in supervised learning via this measure, followed by a novel upper bound on that error. The talk concludes with an application to one-hidden-layer neural networks, where we leverage the symmetrized KL divergence to establish generalization bounds in the mean-field regime.
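For orientation, here is a minimal sketch of the central objects, assuming the standard definitions of the symmetrized KL divergence and the Gibbs posterior; the notation (inverse temperature γ, prior π, empirical risk L_E) is ours, and the exact statements presented in the talk may differ.

```latex
% Symmetrized KL divergence: the KL divergence plus its reversal.
\[
  D_{\mathrm{SKL}}(P \,\|\, Q) = D(P \,\|\, Q) + D(Q \,\|\, P)
\]

% Symmetrized KL information: the symmetrized KL divergence between the
% joint law and the product of its marginals, i.e. mutual information
% I(X;Y) plus lautum information L(X;Y).
\[
  I_{\mathrm{SKL}}(X;Y) = D_{\mathrm{SKL}}(P_{X,Y} \,\|\, P_X \otimes P_Y)
                        = I(X;Y) + L(X;Y)
\]

% Gibbs algorithm (Gibbs posterior) with inverse temperature gamma > 0,
% prior pi(w), and empirical risk L_E(w, s) on training sample s:
\[
  P^{\gamma}_{W \mid S}(w \mid s)
    = \frac{\pi(w)\, e^{-\gamma L_E(w,s)}}
           {\int \pi(w')\, e^{-\gamma L_E(w',s)} \, \mathrm{d}w'}
\]

% The exact characterization referenced in the abstract (as stated in the
% speaker's earlier work on the Gibbs algorithm): the expected
% generalization error equals the symmetrized KL information between the
% hypothesis W and the sample S, scaled by the inverse temperature.
\[
  \overline{\mathrm{gen}}(P^{\gamma}_{W \mid S}, P_S)
    = \frac{I_{\mathrm{SKL}}(W;S)}{\gamma}
\]
```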
Series: This talk is part of the Information Theory Seminar series.
Included in Lists
- All CMS events
- All Talks (aka the CURE list)
- bld31
- CMS Events
- DPMMS info aggregator
- DPMMS lists
- Hanchen DaDaDash
- Information Theory Seminar
- Interested Talks
- MR5, CMS Pavilion A
- School of Physical Sciences
- Statistical Laboratory info aggregator