BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Deep Learning in High Dimension: Neural Network Approximation of A
 nalytic Functions in $L^2(\\mathbb{R}^d)$ - Christoph Schwab (ETH Zürich)
DTSTART:20211116T120000Z
DTEND:20211116T123000Z
UID:TALK165403@talks.cam.ac.uk
DESCRIPTION:For artificial deep neural networks\, we prove expression rat
 es for analytic functions $f:\\mathbb{R}^d \\to \\mathbb{R}$ in $L^2(\\m
 athbb{R}^d)$\, where the dimension $d$ may be infinite and $L^2$ is take
 n with respect to a Gaussian measure.\nWe consider $\\mbox{ReLU}$ and $\
 \mbox{ReLU}^k$ activations for integer $k \\geq 2$.\nIn the infinite-dim
 ensional case\, under suitable smoothness and sparsity assumptions on $f
 :\\mathbb{R}^{\\mathbb{N}} \\to \\mathbb{R}$\, with $\\gamma_\\infty$ de
 noting an infinite (Gaussian) product measure on $(\\mathbb{R}^{\\mathbb
 {N}}\, {\\mathcal B}(\\mathbb{R}^{\\mathbb{N}}))$\, we prove dimension-i
 ndependent DNN expression rate bounds in the norm of $L^2(\\mathbb{R}^{\
 \mathbb{N}}\, \\gamma_\\infty)$. These DNN expression rates are not subj
 ect to the curse of dimensionality (CoD) and depend on the summability o
 f the Wiener-Hermite expansion coefficients of $f$. A sufficient conditi
 on is quantified holomorphy of (an analytic continuation of) the map $f$
  on a product of strips in the complex domain.\nAs an application\, we p
 rove DNN expression rate bounds for deep $\\mbox{ReLU}$-NNs approximatin
 g response surfaces of elliptic PDEs with log-Gaussian random field inpu
 ts.\n(Joint work with Jakob Zech\, University of Heidelberg\, Germany.)
LOCATION:Seminar Room 1\, Newton Institute
END:VEVENT
END:VCALENDAR
