Verifiable Differential Privacy

Ariel Feldman (University of Chicago)

Abstract: Working with sensitive data is often a balancing act between privacy and integrity concerns. Consider, for instance, a medical researcher who has analyzed a patient database to judge the effectiveness of a new treatment and would now like to publish her findings. On the one hand, the patients may be concerned that the researcher's results contain too much information and accidentally leak some private fact about them; on the other hand, the readers of the published study may be concerned that the results contain too little information, limiting their ability to detect errors in the calculations or flaws in the methodology.
This talk presents VerDP, a system for private data analysis that provides both strong integrity and strong differential privacy guarantees. VerDP accepts queries written in a special query language, and it processes them only if (a) it can certify them as differentially private and (b) it can prove the integrity of the result in zero knowledge. Our experimental evaluation shows that VerDP can successfully process several queries from the differential privacy literature, and that the cost of generating and verifying the proofs is practical.
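
As background on the guarantee being certified: a randomized mechanism M is ε-differentially private if, for any two databases D and D' that differ in a single record and any set S of possible outputs, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D') ∈ S]. The sketch below illustrates the standard Laplace mechanism, the textbook way to meet this definition for numeric queries; the abstract does not say which mechanism VerDP uses internally, so the function name and the example values here are purely illustrative.

    import numpy as np

    def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
        # Perturb the true query answer with Laplace noise of scale
        # sensitivity/epsilon; this calibration is what yields
        # epsilon-differential privacy for a numeric query.
        # (Illustrative background only, not VerDP's actual code.)
        rng = rng if rng is not None else np.random.default_rng()
        return true_answer + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

    # A counting query ("how many patients improved under the treatment?")
    # has sensitivity 1: adding or removing one patient changes the count
    # by at most 1, so only a small amount of noise is needed.
    noisy_count = laplace_mechanism(true_answer=1234, sensitivity=1.0, epsilon=0.5)

Calibrating the noise scale to sensitivity/ε is the key design point: the less any single record can influence the answer, the less noise is required, which is why low-sensitivity aggregates such as counts are the typical examples in the differential privacy literature.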

Bio: Ariel Feldman is an Assistant Professor of Computer Science at the University of Chicago. His research lies at the intersection of computer security and distributed systems. He is presently focused on finding new ways to protect the security and privacy of users of “cloud-hosted” services. His interests also include software and network security, data privacy, anonymity, and electronic voting, as well as the interaction between computer security, law, and public policy. Previously, he was a postdoctoral researcher in the Department of Computer and Information Science (CIS) at the University of Pennsylvania, and he received his Ph.D. in Computer Science from Princeton University in 2012.