The brief was filed in support of the Department of Commerce’s use of differential privacy to protect the confidentiality of 2020 census data.
Twenty leading experts in data privacy and cryptography, including Deirdre Mulligan, Professor and Co-Director of the Algorithmic Fairness & Opacity Group at the School of Information, and I School Ph.D. student Nitin Kohli, filed an amicus brief in support of the Census Bureau’s use of “differential privacy” — a mathematically rigorous method of publishing statistics with provable, future-proof privacy guarantees — to protect the privacy of census respondents.
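For readers unfamiliar with the term, the standard mathematical definition (due to the researchers who invented the concept) states that a randomized algorithm \(M\) is \(\varepsilon\)-differentially private if, for every pair of datasets \(D\) and \(D'\) differing in the record of a single person, and every set of possible outputs \(S\):

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S]
```

Informally: no single person’s data can change the distribution of published statistics by more than a factor of \(e^{\varepsilon}\), so nothing an attacker learns from the release can be attributed with confidence to any one respondent.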
Through the lawsuit, the state of Alabama seeks to force the Census Bureau to use outdated methods for protecting confidentiality, methods that a team of Census Bureau researchers found could easily be broken by applying modern data reconstruction techniques to published Census statistics. The experts, who include inventors of differential privacy, cryptographers, statisticians, and legal experts focused on technology and society, document the increased risk of attacks given the large data sets and computing power now available to adversaries, and explain that differential privacy is the only known method capable of preventing such attacks while still enabling the publication of useful statistics.
“This case is about the capacity of the Census Bureau to honor its confidentiality commitment in light of new and evolving threats,” said Professor Mulligan. “The Census Bureau’s protections must be robust against today’s and tomorrow’s threats. Doing so requires protections that reflect scientific advances.”
The risk of re-identifying census respondents from the release of basic statistical tables, like those the Census Bureau publishes, is real, growing, and can affect tens of millions of Americans. The Census Bureau’s research, along with extensive academic research, shows that data reconstruction and re-identification attacks, in which an attacker first reconstructs person-level data records from aggregate data products and then re-identifies the reconstructed records, are increasingly easy to mount. Data releases protected by traditional statistical disclosure limitation techniques, like those the state of Alabama wants the Census Bureau to use, are vulnerable to these attacks.
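A toy sketch can make the reconstruction step concrete. The example below is purely illustrative (the block, the residents, and the published statistics are all hypothetical, and real attacks on Census tables are far larger in scale): when enough exact aggregate statistics are published about a small population, brute-force search over all candidate records can recover the individuals’ data uniquely.

```python
# Toy database reconstruction attack: published aggregate statistics
# for a tiny hypothetical block uniquely determine the person-level
# records. (Illustrative only; not the Census Bureau's data or tables.)
from itertools import combinations_with_replacement

# Hypothetical published statistics for a block of 3 residents.
PUBLISHED = {"count": 3, "mean_age": 30, "median_age": 30, "min_age": 18}

def consistent(ages):
    """Check whether a candidate multiset of ages matches every table."""
    ages = sorted(ages)
    return (
        len(ages) == PUBLISHED["count"]
        and sum(ages) / len(ages) == PUBLISHED["mean_age"]
        and ages[1] == PUBLISHED["median_age"]
        and ages[0] == PUBLISHED["min_age"]
    )

# Brute-force every nondecreasing triple of ages 0..115; with enough
# published tables, only the true records survive the constraints.
solutions = [
    ages
    for ages in combinations_with_replacement(range(116), 3)
    if consistent(ages)
]
print(solutions)  # a single surviving candidate: the true records
```

Once the records are reconstructed, the second step of the attack links them against outside data sources (voter files, commercial databases) to attach names, which is why the availability of external data makes the threat grow over time.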
“Reconstruction attacks on the census data pose a clear threat to public participation, and the only known way to thwart these attacks while still publishing information is to leverage algorithms that satisfy differential privacy,” said Nitin Kohli, whose doctoral research focuses on differential privacy as a means to attend to a broad class of social and political commitments. “Assisting in this brief is an example of the public interest technology work that I've been able to engage in at the School of Information.”
“To deliver robust privacy protections requires differential privacy implementation choices that reflect the real risks,” said Professor Mulligan. “Privacy isn’t free, and protecting it will, as it always has, reduce the accuracy of fine-grained statistics. However, that reduction is the price we pay for ongoing robust public participation in the census.”
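The trade-off Professor Mulligan describes can be sketched with the Laplace mechanism, a textbook way to satisfy differential privacy for numeric queries. This is a minimal illustration of the general technique, not the Census Bureau’s actual disclosure avoidance system, and the function name and parameter values here are invented for the example.

```python
# Minimal sketch of the Laplace mechanism for a numeric query.
# Smaller epsilon = stronger privacy = more noise = less accuracy.
import random

def laplace_mechanism(true_value, epsilon, sensitivity=1.0):
    """Release true_value plus Laplace noise of scale sensitivity/epsilon."""
    scale = sensitivity / epsilon
    # The difference of two iid Exp(1) draws, rescaled, is Laplace-distributed.
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_value + noise

# A hypothetical block count released under strong vs. weak privacy:
print(laplace_mechanism(1200, 0.1))   # strong privacy, noisier count
print(laplace_mechanism(1200, 10.0))  # weak privacy, near-exact count
```

Because the noise scale is calibrated to how much any one person can change the statistic, the published number stays useful for large aggregates while masking each individual’s contribution, which is the sense in which accuracy is traded for participation-preserving privacy.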
A consensus study report published by the National Academies of Sciences, Engineering, and Medicine in 2017 concluded that traditional statistical disclosure methods “are increasingly susceptible to privacy breaches given the proliferation of external data sources and the availability of high-powered computing that could enable inferences about people or entities in a dataset, re-identification of specific people or entities, and even reconstruction of the original data.”
“The Census Bureau — like other statistical agencies — must adopt protections to fit changing threats,” said Cynthia Dwork, Gordon McKay Professor of Computer Science, Harvard University. “Thanks to fifteen years of research on differential privacy, the Census Bureau has the tools to meet its obligations to both provide useful statistical data and provide future-proof protection of privacy.”
According to Dwork: “The ‘privacy vs accuracy’ argument does not hold: Poor privacy now guarantees poor participation in the future.”
“The data privacy experts filing today’s brief should be lauded for sharing their insights on differential privacy with the Court and the public,” said Michael B. Jones of Bondurant Mixson & Elmore LLP, the attorney for the experts. “The Census Bureau’s disclosure avoidance system is essential to protecting the privacy of the millions of people who responded to the 2020 Census.”
Deirdre K. Mulligan is a Professor in the School of Information at UC Berkeley, a faculty Director of the Berkeley Center for Law & Technology, a co-organizer of the Algorithmic Fairness & Opacity Working Group, an affiliated faculty on the Hewlett funded Berkeley Center for Long-Term Cybersecurity, and a faculty advisor to the Center for Technology, Society & Policy. Mulligan’s research explores legal and technical means of protecting values such as privacy, freedom of expression, and fairness in emerging technical systems.
Nitin Kohli is a Ph.D. student in the School of Information at UC Berkeley and an alumnus of the school (MIDS 2015). Nitin researches topics that span privacy and fairness. Drawing on technical, legal, and social science scholarship, he develops theory, tools, and frameworks that safeguard individuals while attending to the social and political context of their use.