Iterative Methods for Private Synthetic Data: Unifying Framework and New Methods [conference paper]

Conference

Thirty-fifth Conference on Neural Information Processing Systems (NeurIPS) - December 7-10, 2021

Authors

Terrance Liu, Giuseppe Vietri (Ph.D. student), Steven Wu (adjunct assistant professor)

Abstract

We study private synthetic data generation for query release, where the goal is to construct a sanitized version of a sensitive dataset, subject to differential privacy, that approximately preserves the answers to a large collection of statistical queries. We first present an algorithmic framework that unifies a long line of iterative algorithms in the literature. Under this framework, we propose two new methods. The first method, private entropy projection (PEP), can be viewed as an advanced variant of MWEM that adaptively reuses past query measurements to boost accuracy. Our second method, generative networks with the exponential mechanism (GEM), circumvents computational bottlenecks in algorithms such as MWEM and PEP by optimizing over generative models parameterized by neural networks, which capture a rich family of distributions while enabling fast gradient-based optimization. We demonstrate that PEP and GEM empirically outperform existing algorithms. Furthermore, we show that GEM nicely incorporates prior information from public data while overcoming limitations of PMW^Pub, the existing state-of-the-art method that also leverages public data.
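The iterative template that the abstract describes (and that MWEM, PEP, and GEM all instantiate) alternates three steps: privately select a poorly-approximated query, measure it with noise, and update the synthetic distribution to better match the measurements. The sketch below is an illustrative MWEM-style loop over a small discrete domain, not the paper's implementation; the function name `mwem`, the per-round budget split, and the sensitivity constants are simplifying assumptions for normalized linear queries.

```python
import numpy as np

def mwem(true_answers, queries, domain_size, epsilon, rounds, rng):
    """Illustrative MWEM-style loop: exponential-mechanism selection,
    Laplace measurement, multiplicative-weights update.
    `queries` is a (num_queries, domain_size) 0/1 matrix of counting
    queries; `true_answers` are the normalized true answers."""
    eps_round = epsilon / rounds  # naive budget split across rounds (assumption)
    A = np.full(domain_size, 1.0 / domain_size)  # start from the uniform distribution
    measurements = []
    for _ in range(rounds):
        # Select a badly-approximated query via the exponential mechanism,
        # scoring each query by its current absolute error.
        errors = np.abs(queries @ A - true_answers)
        scores = (eps_round / 4.0) * errors  # illustrative scaling constant
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()
        q = rng.choice(len(queries), p=probs)
        # Measure the selected query with Laplace noise.
        m = true_answers[q] + rng.laplace(scale=2.0 / eps_round)
        measurements.append((q, m))
        # Multiplicative-weights update toward every noisy measurement so far
        # (PEP's entropy projection plays an analogous correction role).
        for qi, mi in measurements:
            A *= np.exp(queries[qi] * (mi - queries[qi] @ A) / 2.0)
            A /= A.sum()
    return A
```

GEM replaces the explicit distribution `A` (whose size grows with the product of attribute domains) with a neural generative model trained by gradient descent, which is what lets it scale past the computational bottleneck noted in the abstract.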

Link to full paper

Iterative Methods for Private Synthetic Data: Unifying Framework and New Methods

Keywords

machine learning, deep learning, privacy
