What is it about?
Differential privacy (1) is an increasingly popular tool for preserving individuals’ privacy by adding statistical uncertainty when sharing sensitive data. Its introduction into US Census Bureau operations (2), however, has been controversial. Scholars, politicians, and activists have raised concerns about the integrity of census-guided democratic processes, from redistricting to voting rights. The debate raises important issues, yet most analyses of the trade-offs around differential privacy overlook deeper uncertainties in census data (3). To illustrate, we examine how education policies that leverage census data misallocate funding because of statistical uncertainty, comparing the impacts of quantified data error and of a possible differentially private mechanism. We find that misallocations due to our differentially private mechanism occur on the margin of much larger misallocations due to existing data error, which particularly disadvantage marginalized groups. But we also find that policy reforms can reduce the disparate impacts of both data error and privacy mechanisms.
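To make "adding statistical uncertainty" concrete, here is a minimal sketch of the Laplace mechanism, a standard differential-privacy building block for releasing noisy counts. This is an illustration only, not the paper's actual mechanism; the function name and parameters are our own, and the paper's analysis concerns a possible census-style mechanism, not this toy.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def noisy_count(true_count, sensitivity=1.0, epsilon=1.0):
    """Release a count with Laplace noise scaled to sensitivity / epsilon.

    Smaller epsilon = stronger privacy = more noise, so downstream
    decisions (e.g., whether a district crosses a funding threshold)
    become more uncertain.
    """
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise
```

A policy that funds districts whose child-poverty count exceeds a fixed threshold would occasionally misallocate under this noise, which is the kind of trade-off the paper quantifies against pre-existing data error.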
Read the Original
This page is a summary of: Policy impacts of statistical uncertainty and privacy, Science, August 2022, American Association for the Advancement of Science, DOI: 10.1126/science.abq4481.