## Inference and Privacy
Differential privacy is a dominant standard for privacy-preserving data analysis. It requires that an algorithm be randomized “just enough” to mask the contribution of any one individual in the input data set when viewing the algorithm’s output. This leads to new inference problems in which we wish to reason about the truth given measurements with excess noise added for privacy. I will discuss recent work from my research group on inference and differential privacy, including private Bayesian inference, private bootstrap methods, and probabilistic inference for reconstructing models of a data set from noisy measurements, a key step in successful approaches for answering database queries and generating synthetic data.
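To make the "just enough" randomization concrete, here is a minimal sketch of the Laplace mechanism, a standard way to achieve differential privacy for numeric queries (this illustrative example is not drawn from the talk itself; the function name and parameters are my own). Noise with scale `sensitivity / epsilon` is added to the true answer, where the sensitivity bounds how much one individual can change the query result:

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value with Laplace noise of scale sensitivity/epsilon.

    Smaller epsilon means stronger privacy and therefore more noise.
    """
    rng = np.random.default_rng() if rng is None else rng
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Counting query: adding or removing one person changes the count
# by at most 1, so the sensitivity is 1.
data = [1, 0, 1, 1, 0, 1]
true_count = sum(data)
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
```

The inference problems in the abstract arise exactly because an analyst sees only `private_count`, not `true_count`, and must reason backward through the known noise distribution.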

## Department of Mathematics and Statistics