
Analog vs. Digital Epsilons: Implementation Considerations for Differential Privacy


If you have a question about this talk, please contact Kieron Ivy Turk.

Differential privacy (DP) provides a rigorous framework for releasing data statistics while bounding information leakage. It is currently the de facto privacy framework: it has received significant interest from the research community and has been deployed by the U.S. Census Bureau, Apple, Google, Microsoft, and others. However, DP analyses often assume a perfect computing environment and ideal building blocks such as random noise samplers. Unfortunately, a naive implementation of DP mechanisms can invalidate their theoretical guarantees.
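As a brief illustration of the kind of mechanism at issue (a sketch of the textbook Laplace mechanism, not code from the talk; all names are my own), noise scaled to sensitivity/epsilon is added to a query answer before release:

```python
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value with Laplace(sensitivity/epsilon) noise (epsilon-DP)."""
    scale = sensitivity / epsilon
    # A Laplace sample is an exponential sample with a random sign;
    # expovariate(1/scale) has mean `scale`.
    sign = 1 if random.random() < 0.5 else -1
    return true_value + sign * random.expovariate(1.0 / scale)

# e.g. a counting query (sensitivity 1) released with epsilon = 0.5
noisy_count = laplace_mechanism(42, sensitivity=1, epsilon=0.5)
```

It is exactly samplers like this, implemented in finite-precision floating point, whose artifacts the talk's attacks exploit.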

In this talk, I will highlight two attacks based on implementation flaws in the noise generation commonly used in DP systems: a floating-point representation attack against continuous distributions and timing attacks against discrete distributions. I will then show that several state-of-the-art implementations of DP are susceptible to these attacks, as they allow an adversary to learn the values being protected by DP. Our evaluation demonstrates success rates of 92.56% for floating-point attacks in a machine learning setting and 99.65% for end-to-end timing attacks on a private sum. I will conclude with suggested mitigations, emphasising that a careful implementation of DP systems may be as important as it is for cryptographic libraries.
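One known mitigation in this space is Mironov's "snapping" idea: clamp the noisy output and round it to a coarse grid of power-of-two multiples, so the released doubles no longer carry sampler artifacts that depend on the secret value. The sketch below is my own simplified rendering of that idea, not necessarily the mitigation proposed in the talk:

```python
import math
import random

def naive_laplace(value, scale):
    """Textbook Laplace sampler; its raw float outputs can leak `value`."""
    sign = 1 if random.random() < 0.5 else -1
    return value + sign * random.expovariate(1.0 / scale)

def snapped_release(value, scale, clamp=1e6):
    """Clamp the noisy result, then snap it to multiples of Lambda,
    a power of two >= scale, so the released doubles lie on a grid
    that is independent of the secret value."""
    lam = 2.0 ** math.ceil(math.log2(scale))
    y = max(-clamp, min(clamp, naive_laplace(value, scale)))
    # Rounding to a power-of-two multiple is exact in binary floating point.
    return round(y / lam) * lam
```

The coarser grid costs a small amount of accuracy in exchange for removing the floating-point side channel.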

The talk is based on joint work with Jiankai Jin (The University of Melbourne), Eleanor McMurtry (ETH Zurich) and Benjamin Rubinstein (The University of Melbourne), which appeared at the IEEE Symposium on Security and Privacy 2022.

Bio: Olya Ohrimenko is an Associate Professor at The University of Melbourne, which she joined in 2020. Prior to that she was a Principal Researcher at Microsoft Research in Cambridge, UK, where she started as a Postdoctoral Researcher in 2014. Her research interests include privacy and integrity of machine learning algorithms, data analysis tools and cloud computing, covering topics such as differential privacy, verifiable and data-oblivious computation, trusted execution environments, and side-channel attacks and mitigations. Recently Olya has worked with the Australian Bureau of Statistics and National Australia Bank. She has received solo and joint research grants from Facebook and Oracle and is currently a PI on an AUSMURI grant.

This talk is part of the Computer Laboratory Security Seminar series.


