Marginal Release Under Local Differential Privacy
Top 10% of 2018 papers by citations over time
Abstract
Many analysis and machine learning tasks require the availability of marginal statistics on multidimensional datasets while providing strong privacy guarantees for the data subjects. Applications for these statistics range from finding correlations in the data to fitting sophisticated prediction models. In this paper, we provide a set of algorithms for materializing marginal statistics under the strong model of local differential privacy. We prove the first tight theoretical bounds on the accuracy of marginals compiled under each approach, perform empirical evaluation to confirm these bounds, and evaluate the algorithms on tasks such as model fitting and correlation testing. Our results show that releasing information based on (local) Fourier transformations of the input is preferable to alternatives based directly on (local) marginals.
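To make the Fourier-based approach concrete, here is a minimal sketch, not the paper's exact mechanism: each user holds a vector of binary attributes, locally perturbs the Fourier (parity) coefficients of their record with Laplace noise (splitting the budget `eps` across coefficients by sequential composition), and the aggregator averages the reports and inverts the Fourier expansion to recover a marginal. All function names and parameters below are illustrative assumptions.

```python
import itertools
import math
import random

def chi(x, S):
    """Fourier character chi_S(x) = (-1)^{sum of the bits of x indexed by S}."""
    return -1.0 if sum(x[i] for i in S) % 2 else 1.0

def laplace(scale):
    """Sample Laplace(0, scale) via the inverse CDF (stdlib only)."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def local_report(x, subsets, eps):
    """One user's eps-LDP report: each coefficient chi_S(x) lies in {-1, +1}
    (sensitivity 2), and eps is split evenly across len(subsets) coefficients."""
    scale = 2.0 * len(subsets) / eps
    return [chi(x, S) + laplace(scale) for S in subsets]

def estimate_marginal(reports, subsets, T):
    """Average the noisy coefficients, then invert the Fourier expansion:
    Pr[x_T = t] = 2^{-|T|} * sum over S subset of T of f_hat(S) * chi_S(t)."""
    n = len(reports)
    f_hat = {S: sum(r[j] for r in reports) / n for j, S in enumerate(subsets)}
    marginal = {}
    for t_bits in itertools.product([0, 1], repeat=len(T)):
        t = dict(zip(T, t_bits))
        p = sum(f_hat[S] * (-1.0 if sum(t[i] for i in S) % 2 else 1.0)
                for S in subsets)
        marginal[t_bits] = p / (2 ** len(T))
    return marginal

# Example: estimate the 2-way marginal over attributes (0, 1).
random.seed(0)
T = (0, 1)
subsets = [c for r in range(len(T) + 1) for c in itertools.combinations(T, r)]
data = [(random.randint(0, 1), random.randint(0, 1)) for _ in range(5000)]
reports = [local_report(x, subsets, eps=8.0) for x in data]
marginal = estimate_marginal(reports, subsets, T)
```

With no noise the inversion is exact; under LDP the averaged coefficients concentrate around their true values at roughly a 1/sqrt(n) rate, which is the regime the paper's accuracy bounds characterize.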
Related Papers
- → Differential Private Noise Adding Mechanism and Its Application on Consensus Algorithm (2020), 93 cited
- → Programming language techniques for differential privacy (2016), 37 cited
- → Differential private noise adding mechanism: Basic conditions and its application (2017), 35 cited
- → Differential Private Noise Adding Mechanism and Its Application on Consensus (2016), 4 cited
- → Differential Private Noise Adding Mechanism: Fundamental Theory and its Application (2016)