Statistically Valid Inferences from Privacy Protected Data (Deloitte)

Presentation Date: Thursday, July 14, 2022

Location: Deloitte

Venerable procedures used for privacy protection in sharing data within individual companies and governments, within academia, and between sectors have recently been proven massively inadequate (e.g., respondents in de-identified surveys can usually be re-identified). Furthermore, the benefits of getting our data sharing act together go far beyond the university, since unprecedented quantities of data that could help social scientists understand and ameliorate the challenges of human society are presently locked away inside companies, governments, and other organizations, in part because of worries about privacy violations. We address these problems with a general-purpose data access and analysis system with mathematical guarantees of privacy for individuals who may be represented in the data, statistical guarantees for researchers seeking insights from it, and protection for society from some fallacious scientific conclusions. We build on the standard of "differential privacy" but, unlike most such approaches, we also correct for the serious statistical biases induced by privacy-preserving procedures, provide a proper accounting for statistical uncertainty, and impose minimal constraints on the choice of data analytic methods and types of quantities estimated. Our algorithm is easy to implement, simple to use, and computationally efficient; we also offer open source software to illustrate all our methods.
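To make the two ingredients the abstract highlights concrete, here is a minimal Python sketch of a differentially private release (via the classic Laplace mechanism) paired with a confidence interval that accounts for the uncertainty the privacy noise injects, rather than ignoring it. This is an illustrative assumption, not the authors' actual algorithm (which is developed in the papers at GaryKing.org/privacy); the function name private_mean_ci and all parameters are hypothetical.

    import numpy as np

    def private_mean_ci(x, lower, upper, epsilon):
        """Release an epsilon-differentially-private mean of x, with a 95%
        confidence interval widened to reflect the injected privacy noise.

        Illustrative sketch only: the Laplace mechanism on a clamped mean,
        with the interval inflated by the known variance of the added noise
        so the reported uncertainty is honest about the perturbation.
        """
        x = np.clip(np.asarray(x, dtype=float), lower, upper)
        n = x.size
        sensitivity = (upper - lower) / n      # sensitivity of the clamped mean
        scale = sensitivity / epsilon          # Laplace scale for epsilon-DP
        noisy_mean = x.mean() + np.random.laplace(0.0, scale)

        # Total variance = ordinary sampling variance + Laplace noise variance.
        # (A full implementation would privatize the variance estimate too;
        # it is computed from the raw data here only to keep the sketch short.)
        sampling_var = x.var(ddof=1) / n
        noise_var = 2.0 * scale ** 2           # Var[Laplace(0, b)] = 2 * b^2
        se = np.sqrt(sampling_var + noise_var)
        z = 1.96                               # normal approximation, 95% level
        return noisy_mean, (noisy_mean - z * se, noisy_mean + z * se)

    # Example: a privatized mean whose interval reflects both sources of error.
    x = np.random.normal(50.0, 10.0, size=1000)
    est, (lo, hi) = private_mean_ci(x, lower=0.0, upper=100.0, epsilon=1.0)
    print(f"DP mean = {est:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")

Without the noise_var term, the interval would be too narrow whenever epsilon is small, which is exactly the failure of uncertainty accounting the abstract warns against.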

Based on papers (each joint with subsets of Georgie Evans, Meg Schwenzfeier, Adam Smith, and Abhradeep Thakurta) available at GaryKing.org/privacy.