Differential privacy and social science: An urgent puzzle
Oberski, Daniel L.; Kreuter, Frauke
DOI: https://doi.org/10.1162/99608f92.63a22079
URL: https://madoc.bib.uni-mannheim.de/58666
Additional URL: https://hdsr.mitpress.mit.edu/pub/g9o4z8au/release...
URN: urn:nbn:de:bsz:180-madoc-586666
Document Type: Article
Year of publication: 2020
Journal or publication series: Harvard Data Science Review : HDSR
Volume: 2
Issue number: 1
Page range: 1-21
Place of publication: Cambridge, MA
Publisher: MIT Press
ISSN: 2644-2353
Publication language: English
Institution: Außerfakultäre Einrichtungen > Mannheim Centre for European Social Research - Research Department A School of Social Sciences > Statistik u. Sozialwissenschaftliche Methodenlehre (Kreuter 2014-2020)
Pre-existing license: Creative Commons Attribution 4.0 International (CC BY 4.0)
Subject: 300 Social sciences, sociology, anthropology
Abstract: Accessing and combining large amounts of data is important for quantitative social scientists, but increasing amounts of data also increase privacy risks. To mitigate these risks, important players in official statistics, academia, and business see a solution in the concept of differential privacy. In this opinion piece, we ask how differential privacy can benefit from social-scientific insights, and, conversely, how differential privacy is likely to transform social science. First, we put differential privacy in the larger context of social science. We argue that the discussion on implementing differential privacy has been clouded by incompatible subjective beliefs about risk, each perspective having merit for different data types. Moreover, we point out existing social-scientific insights that suggest limitations to the premises of differential privacy as a data protection approach. Second, we examine the likely consequences for social science if differential privacy is widely implemented. Clearly, workflows must change, and common social science data collection will become more costly. However, in addition to data protection, differential privacy may bring other positive side effects. These could solve some issues social scientists currently struggle with, such as p-hacking, data peeking, or overfitting; after all, differential privacy is basically a robust method to analyze data. We conclude that, in the discussion around privacy risks and data protection, a large number of disciplines must band together to solve this urgent puzzle of our time, including social science, computer science, ethics, law, and statistics, as well as public and private policy.
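As a concrete illustration of the kind of mechanism the abstract alludes to, the following is a minimal sketch of the Laplace mechanism applied to a counting query — one standard way to achieve epsilon-differential privacy. The data, function name, and parameters here are hypothetical illustrations, not taken from the article:

```python
import numpy as np

def dp_count(values, predicate, epsilon, rng=None):
    """Epsilon-differentially-private count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon is sufficient for epsilon-DP.
    """
    rng = rng or np.random.default_rng()
    true_count = sum(1 for v in values if predicate(v))
    # Noisy answer: the analyst sees only this perturbed count.
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Hypothetical survey data: respondent ages.
ages = [23, 35, 41, 52, 29, 63, 47, 38]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=1.0)
```

Smaller epsilon means more noise and stronger protection; this accuracy–privacy trade-off is what makes wide deployment costly for small samples, and the added noise is also why differentially private analyses behave like robust, overfitting-resistant estimators.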
Additional information: Online resource
This entry is part of the university bibliography.
The document is provided by the publication server of the University of Mannheim Library.