BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:M-estimation\, noisy optimization and user-level local privacy - M
 arco Avella Medina (Columbia University)
DTSTART:20240202T140000Z
DTEND:20240202T150000Z
UID:TALK209545@talks.cam.ac.uk
CONTACT:Dr Sergio Bacallado
DESCRIPTION:We propose a general optimization-based framework for computin
 g differentially private M-estimators. We first show that robust statistic
 s can be used in conjunction with noisy gradient descent or noisy Newton m
 ethods in order to obtain optimal private estimators with global linear or
  quadratic convergence\, respectively. We establish local and global conve
 rgence guarantees\, under both local strong convexity and self-concordance
 \, showing that our private estimators converge with high probability to a
  small neighborhood of the nonprivate M-estimators. We then extend this op
 timization framework to the more restrictive setting of local differential
  privacy (LDP) where a group of users communicates with an untrusted centr
 al server. Contrary to most works that aim to protect a single data point\
 , here we assume each user possesses multiple data points and focus on use
 r-level privacy\, which aims to protect the entire set of data points belong
 ing to a user. Our main algorithm is a noisy gradient descent algorithm\, 
 combined with a user-level LDP mean estimation procedure to privately comp
 ute the average gradient across users at each step. We will highlight the 
 challenges associated with guaranteeing user-level LDP and present finite 
 sample global linear convergence guarantees for the iterates of our algori
 thm.
LOCATION:MR12\, Centre for Mathematical Sciences
END:VEVENT
END:VCALENDAR
