BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:An Entropy-Based Model for Hierarchical Learning - Dr Amir Asadi\,
  University of Cambridge
DTSTART:20230208T140000Z
DTEND:20230208T150000Z
UID:TALK192821@talks.cam.ac.uk
CONTACT:Prof. Ramji Venkataramanan
DESCRIPTION:Machine learning is the dominant approach to artificial inte
 lligence: computers learn from data and experience. In the supervised le
 arning framework\, for a computer to learn from data accurately and effi
 ciently\, it must be given auxiliary information about the data distribu
 tion and the target function through the learning model. This notion o
 f auxiliary information relates to the concept of regularization in st
 atistical learning theory. A common feature of real-world datasets is t
 hat data domains are multiscale and target functions are well-behaved a
 nd smooth. In this talk\, we propose an entropy-based learning model th
 at exploits this structure and discuss its statistical and computationa
 l benefits. The hierarchical learning model is inspired by humans' prog
 ressive\, easy-to-hard learning mechanism and has interpretable level
 s. The model apportions computational resources according to the compl
 exity of data instances and target functions. This property can yield m
 ultiple benefits\, including higher inference speed and computational sa
 vings when training a model for many users or when training is interrup
 ted. We provide a statistical analysis of the learning mechanism using m
 ultiscale entropies and show that it can yield significantly stronger gu
 arantees than uniform convergence bounds.
LOCATION:MR5\, CMS Pavilion A
END:VEVENT
END:VCALENDAR
