A mathematical theory of deep neural networks
- Speaker: Prof. Helmut Bölcskei, ETH Zurich
- Date & Time: Friday 22 May 2020, 12:00 - 13:00
- Venue: Department of Engineering - LT1
Abstract
During the past decade, deep neural networks have led to spectacular successes in a wide range of applications such as image classification and annotation, handwritten digit recognition, speech recognition, and game intelligence. In this talk, we describe efforts to develop a mathematical theory that can explain these impressive practical achievements and possibly guide future deep learning architectures and algorithms. Specifically, we develop the fundamental limits of learning in deep neural networks by characterizing what is possible in principle. We then attempt to explain the inner workings of deep generative networks and of scattering networks. A brief survey of recent results on deep networks as solution engines for PDEs is followed by a discussion of interesting open problems and philosophical remarks on the role of mathematics in AI research.
Series: This talk is part of the Information Engineering Distinguished Lecture Series.