BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:The Role of Information Measures on the Regularization of Empirica
 l Risk Minimization - Dr Iñaki Esnaola\, University of Sheffield
DTSTART:20231129T140000Z
DTEND:20231129T150000Z
UID:TALK205339@talks.cam.ac.uk
CONTACT:Prof. Ramji Venkataramanan
DESCRIPTION:The empirical risk minimization problem with relative entrop
 y regularization (ERM-RER) is presented under the assumption that the r
 eference measure is a $\\sigma$-finite measure rather than a probabilit
 y measure. This generalization allows for greater flexibility in incorp
 orating prior knowledge over the set of models. We discuss the interplay of t
 he regularization parameter\, the reference measure\, the risk measure\, a
 nd the expected empirical risk induced by the solution of the ERM-RER prob
 lem\, which is proved to be unique. We show that the expectation of the se
 nsitivity is upper bounded\, up to a constant factor\, by the square root 
 of the lautum information between the models and the datasets. Using these
  tools\, we study dataset aggregation and introduce different figures o
 f merit to evaluate the generalization capabilities of ERM-RER. For ar
 bitrary datasets and parameters of the ERM-RER solution\, a connection bet
 ween Jeffrey’s divergence\, training\, and test error is established. We
  conclude by extending the results to $f$-divergence regularization by o
 btaining a closed-form expression for the solution under mild assumption
 s on the structure of the regularizer. This analytical solution is lever
 aged to characterize the sensitivity of the resulting supervised learnin
 g problem\, and we evaluate the solution for specific regularizers arisi
 ng in estimation\, high-dimensional statistics\, and hypothesis testing.
LOCATION:MR5\, CMS Pavilion A
END:VEVENT
END:VCALENDAR
