Journal article
Differential privacy for Bayesian inference through posterior sampling
C Dimitrakakis, B Nelson, Z Zhang, A Mitrokotsa, BIP Rubinstein
Journal of Machine Learning Research | Microtome Publishing | Published: 2017
Abstract
Differential privacy formalises privacy-preserving mechanisms that provide access to a database. Can Bayesian inference be used directly to provide private access to data? The answer is yes: under certain conditions on the prior, sampling from the posterior distribution can lead to a desired level of privacy and utility. For a uniform treatment, we define differential privacy over arbitrary data set metrics, outcome spaces, and distribution families. This allows us to also deal with non-i.i.d. or non-tabular data sets. We then prove bounds on the sensitivity of the posterior to the data, which delivers a measure of robustness. We also show how to use posterior sampling to provide differentially…
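The core idea of the abstract, releasing a single draw from the posterior rather than the posterior itself, can be illustrated with a toy conjugate model. The sketch below uses a Beta-Bernoulli posterior; it is a hypothetical minimal example, not the paper's general construction over arbitrary data set metrics, and the claim that a posterior draw is differentially private holds only under conditions on the prior (e.g. bounded likelihood ratios) established in the paper.

```python
import random

def posterior_sample_bernoulli(data, alpha=1.0, beta=1.0, rng=None):
    """Release one draw from the Beta posterior of a Bernoulli parameter.

    Sampling from the posterior is a randomised mechanism: the output
    depends on the data only through the posterior, so under suitable
    conditions on the prior the draw is differentially private.
    """
    rng = rng or random.Random()
    s = sum(data)   # number of successes in the binary data set
    n = len(data)   # number of records
    # Conjugate update: Beta(alpha, beta) -> Beta(alpha + s, beta + n - s)
    return rng.betavariate(alpha + s, beta + n - s)

# A single private response to a query about the underlying parameter:
draw = posterior_sample_bernoulli([1, 0, 1, 1, 0], rng=random.Random(0))
print(0.0 <= draw <= 1.0)
```

Note that a flatter prior (larger `alpha`, `beta`) makes the posterior less sensitive to any one record, which is the privacy-utility trade-off the abstract refers to.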
Grants
Awarded by Marie Curie Project "Efficient Sequential Decision Making Under Uncertainty"
Awarded by People Programme (Marie Curie Actions) of the European Union's Seventh Framework Programme under REA
Awarded by Australian Research Council
Funding Acknowledgements
We gratefully thank Aaron Roth, Kamalika Chaudhuri, and Matthias Bussas for their discussion and insights, as well as the anonymous reviewers for their comments on the paper, which helped to improve it significantly. This work was partially supported by the Marie Curie Project "Efficient Sequential Decision Making Under Uncertainty", Grant Number 237816; the People Programme (Marie Curie Actions) of the European Union's Seventh Framework Programme (FP7/2007-2013) under REA grant agreement no. 608743; the SNSF Project "SwissSenseSynergia"; and the Australian Research Council (DE160100584).