BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Differentially private M-estimation via noisy optimization - Po-
 Ling Loh
DTSTART:20240425T140000Z
DTEND:20240425T150000Z
UID:TALK213550@talks.cam.ac.uk
CONTACT:Catarina
DESCRIPTION:We present a noisy composite gradient descent algorithm for di
 fferentially private statistical estimation in high dimensions. We begin b
 y providing general rates of convergence for the parameter error of succes
 sive iterates under assumptions of local restricted strong convexity and l
 ocal restricted smoothness. Our analysis is local\, in that it ensures a l
 inear rate of convergence when the initial iterate lies within a constant-
 radius region of the true parameter. At each iteration\, multivariate Gau
 ssian noise is added to the gradient in order to guarantee that the outpu
 t satisfies Gaussian differential privacy. We then derive consequences o
 f our theory for linear regression and mean estimation. Motivated by M-es
 timators used in robust statistics\, we study loss functions that downwei
 ght the contribution of individual data points in such a way that the sen
 sitivity of the gradients is guaranteed to be bounded\, even without the u
 sual assumption that our data lie in a bounded domain. We prove that the o
 bjective functions thus obtained indeed satisfy the restricted strong con
 vexity and restricted smoothness conditions required by our general theor
 y. We then show how the private estimators obtained via noisy composite g
 radient descent may be used to construct differentially private confidenc
 e intervals for regression coefficients\, by leveraging Lasso debiasing te
 chniques from high-dimensional statistics. We complement our theoretical r
 esults with simulations illustrating the favorable finite-sample performa
 nce of our methods.
LOCATION:Venue to be confirmed
END:VEVENT
END:VCALENDAR
