Differential Privacy: An Economic Method for Choosing Epsilon
Top 1% of 2014 papers by citations.
Abstract
Differential privacy is becoming a gold standard notion of privacy; it offers a guaranteed bound on loss of privacy due to release of query results, even under worst-case assumptions. The theory of differential privacy is an active research area, and there are now differentially private algorithms for a wide range of problems. However, the question of when differential privacy works in practice has received relatively little attention. In particular, there is still no rigorous method for choosing the key parameter ε, which controls the crucial tradeoff between the strength of the privacy guarantee and the accuracy of the published results. In this paper, we examine the role of this parameter in concrete applications, identifying the key considerations that must be addressed when choosing specific values. This choice requires balancing the interests of two parties with conflicting objectives: the data analyst, who wishes to learn something about the data, and the prospective participant, who must decide whether to allow their data to be included in the analysis. We propose a simple model that expresses this balance as formulas over a handful of parameters, and we use our model to choose ε on a series of simple statistical studies. We also explore a surprising insight: in some circumstances, a differentially private study can be more accurate than a non-private study for the same cost, under our model. Finally, we discuss the simplifying assumptions in our model and outline a research agenda for possible refinements.
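The tradeoff the abstract describes can be illustrated with the standard Laplace mechanism (not the paper's economic model, just the generic mechanism it builds on): a query of sensitivity 1 is released with Laplace noise of scale 1/ε, so smaller ε means a stronger privacy guarantee but noisier answers. A minimal sketch, where `laplace_noise` and `private_count` are illustrative helpers rather than functions from the paper:

```python
import math
import random

def laplace_noise(scale: float, rng: random.Random) -> float:
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float, rng: random.Random) -> float:
    """Release a counting query under epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one person
    changes the result by at most 1), so noise of scale 1/epsilon
    suffices for the epsilon-DP guarantee.
    """
    return true_count + laplace_noise(1.0 / epsilon, rng)

rng = random.Random(0)
# Expected absolute error is 1/epsilon: ~1 for eps=1, ~10 for eps=0.1.
print(private_count(100, 1.0, rng))
print(private_count(100, 0.1, rng))
```

The expected absolute error of the released count is exactly 1/ε, which is the accuracy side of the tradeoff the paper's model weighs against the participant's privacy risk.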