BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Symmetrized KL information: Channel Capacity and Learning Theory -
  Dr Gholamali Aminian\, Alan Turing Institute
DTSTART:20241106T140000Z
DTEND:20241106T150000Z
UID:TALK219907@talks.cam.ac.uk
CONTACT:Prof. Ramji Venkataramanan
DESCRIPTION:This talk explores applications of symmetrized KL information
  across information theory and machine learning. We begin by introducing i
 nformation theory and machine learning. We begin by introducing this in
 formation measure and its key properties. We then demonstrate its utili
 ty in deriving an upper bound on Poisson channel capacity. Moving to le
 arning theory\, we show how symmetrized KL information provides novel i
 nsights into generalization error analysis. In particular\, we present a
 n exact characterization of the Gibbs algorithm (a.k.a. the Gibbs poste
 rior) in supervised learning using this measure\, followed by a novel u
 pper bound on its generalization error. The talk concludes with an appl
 ication to one-hidden-layer neural networks\, where we leverage symmetr
 ized KL divergence to establish generalization bounds in the mean-field
  regime.
LOCATION:MR5\, CMS Pavilion A
END:VEVENT
END:VCALENDAR
