BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Isotonic regression in general dimensions - Tengyao Wang (Universi
 ty of Cambridge)
DTSTART:20180308T110000Z
DTEND:20180308T120000Z
UID:TALK102274@talks.cam.ac.uk
CONTACT:INI IT
DESCRIPTION:We study the least squares regression function estimator over
  the class of real-valued functions on [0\,1]^d that are increasing in
  each coordinate. For uniformly bounded signals and with a fixed\, cubic
  lattice design\, we establish that the estimator achieves the minimax
  rate of order n^{-min(2/(d+2)\,1/d)} in the empirical L_2 loss\, up to
  poly-logarithmic factors. Further\, we prove a sharp oracle inequality\,
  which reveals in particular that when the true regression function is
  piecewise constant on k hyperrectangles\, the least squares estimator
  enjoys a faster\, adaptive rate of convergence of (k/n)^{min(1\,2/d)}\,
  again up to poly-logarithmic factors. Previous results are confined to
  the case d ≤ 2. Finally\, we establish corresponding bounds (which are
  new even in the case d=2) in the more challenging random design setting.
  There are two surprising features of these results: first\, they
  demonstrate that it is possible for a global empirical risk minimisation
  procedure to be rate optimal up to poly-logarithmic factors even when
  the corresponding entropy integral for the function class diverges
  rapidly\; second\, they indicate that the adaptation rate for
  shape-constrained estimators can be strictly worse than the parametric
  rate.
LOCATION:Seminar Room 2\, Newton Institute
END:VEVENT
END:VCALENDAR
