Differential Privacy: Recent Developments and Future Challenges
Guy Rothblum, Princeton University
Consider a database of sensitive information about a set of participants. Statistical analysis of the data may yield valuable results, but it also poses serious threats to the participants' privacy. In the last few years, a successful research program has attempted to address these conflicting concerns. This line of work formulated the rigorous privacy guarantee of differential privacy [Dwork, McSherry, Nissim, and Smith '06] and showed that in some cases data analyses can provide accurate answers while protecting participants' privacy.
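A standard formulation of the ε-differential privacy guarantee is sketched below; the notation (a mechanism M, neighboring databases D and D', and an output set S) is introduced here for illustration and is not taken from the announcement.

% \epsilon-differential privacy (standard formulation): a randomized
% mechanism M satisfies the guarantee if, for every pair of databases
% D, D' differing in a single participant's record and every set S of
% possible outputs,
\[
  \Pr[M(D) \in S] \;\le\; e^{\epsilon} \cdot \Pr[M(D') \in S].
\]
% Intuitively, no individual's data changes the probability of any
% outcome by more than a factor of e^{\epsilon}.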
After reviewing some of this past work, I will introduce two new general-purpose tools for privacy-preserving data analysis:
1. A new "boosting" framework for improving the accuracy guarantees of weak differentially private algorithms.
2. Robust privacy guarantees for differentially private algorithms under composition (see the sketch following this list).
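The announcement does not spell out the composition bounds themselves; purely for illustration, the following sketch records the standard "basic" and "advanced" composition guarantees for k mechanisms that are each ε-differentially private (the parameters k, ε', and δ' are notation introduced here, not taken from the abstract).

% Basic composition: running k \epsilon-differentially private mechanisms
% M_1, ..., M_k on the same database is (k\epsilon)-differentially private:
\[
  \Pr[(M_1,\dots,M_k)(D) \in S] \;\le\; e^{k\epsilon}\cdot\Pr[(M_1,\dots,M_k)(D') \in S].
\]
% Advanced composition: for any \delta' > 0, the same combination is
% (\epsilon', \delta')-differentially private with
\[
  \epsilon' \;=\; \sqrt{2k\ln(1/\delta')}\,\epsilon \;+\; k\epsilon\,(e^{\epsilon}-1),
\]
% i.e. the cumulative privacy loss grows roughly like \sqrt{k}\,\epsilon
% rather than k\epsilon when \epsilon is small, at the cost of a small
% failure probability \delta'.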
Using these tools we will show that, computational complexity aside, differential privacy permits surprisingly rich and accurate data analyses. I will then highlight some of the intriguing challenges that remain open for future work in this field.
No prior knowledge will be assumed.
Thursday, March 17, 2011, 4:15pm
B17 Upson Hall
Refreshments at 3:45pm in the Upson 4th Floor Atrium
Computer Science Colloquium, Spring 2011
www.cs.cornell.edu/events/colloquium |