
M-estimation, noisy optimization and user-level local privacy


If you have a question about this talk, please contact Dr Sergio Bacallado.

We propose a general optimization-based framework for computing differentially private M-estimators. We first show that robust statistics can be combined with noisy gradient descent or noisy Newton methods to obtain optimal private estimators with global linear or quadratic convergence, respectively. We establish local and global convergence guarantees, under both local strong convexity and self-concordance, showing that our private estimators converge with high probability to a small neighborhood of their nonprivate counterparts.

We then extend this optimization framework to the more restrictive setting of local differential privacy (LDP), in which a group of users communicates with an untrusted central server. In contrast to most work, which aims to protect a single data point, we assume that each user holds multiple data points and focus on user-level privacy, which protects a user's entire set of data points. Our main algorithm is noisy gradient descent combined with a user-level LDP mean-estimation procedure that privately computes the average gradient across users at each step. We will highlight the challenges in guaranteeing user-level LDP and present finite-sample global linear convergence guarantees for the iterates of our algorithm.
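To make the user-level mechanism concrete, the following is a minimal sketch of noisy gradient descent under user-level local DP for a least-squares loss: each user averages the gradients over their own data points, clips the result to bound per-user sensitivity, and adds Gaussian noise locally before reporting to the server, which simply averages the noisy reports. All names, the loss, and the noise calibration here are illustrative assumptions, not the speakers' actual algorithm or privacy accounting.

```python
import numpy as np

def user_level_ldp_gd(users_X, users_y, sigma=0.05, clip=1.0,
                      lr=0.1, n_iter=300, seed=0):
    """Illustrative noisy GD with user-level local DP (least-squares loss).

    Each user reports a clipped, locally noised average gradient over
    their own data points; the untrusted server averages the reports
    and takes a gradient step. Hypothetical sketch, not the talk's method.
    """
    rng = np.random.default_rng(seed)
    d = users_X[0].shape[1]
    theta = np.zeros(d)
    for _ in range(n_iter):
        reports = []
        for X, y in zip(users_X, users_y):
            g = X.T @ (X @ theta - y) / len(y)   # user's average gradient
            norm = np.linalg.norm(g)
            if norm > clip:                      # clip: bound per-user sensitivity
                g = g * (clip / norm)
            # local Gaussian noise, added before anything leaves the user
            g = g + rng.normal(0.0, sigma * clip, size=d)
            reports.append(g)
        theta -= lr * np.mean(reports, axis=0)   # server-side aggregation
    return theta
```

Averaging over many users shrinks the effective noise in the aggregated gradient, which is what makes a linear convergence rate to a neighborhood of the nonprivate M-estimator plausible in this regime.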

This talk is part of the Statistics series.

