Privacy Analysis of Online Learning Algorithms via Contraction Coefficients

Citation:

Shahab Asoodeh, Mario Diaz, and Flavio Calmon. Submitted. “Privacy Analysis of Online Learning Algorithms via Contraction Coefficients”.

Abstract:

We propose an information-theoretic technique for analyzing the privacy guarantees of general online algorithms. Specifically, we demonstrate that the differential privacy guarantees of iterative algorithms can be determined by a direct application of contraction coefficients derived from strong data processing inequalities for f-divergences. Our technique relies on generalizing Dobrushin's contraction coefficient for total variation distance to an f-divergence known as E_\gamma-divergence, which underlies differential privacy. As an example, we apply our technique to derive the differential privacy parameters of the online gradient descent algorithm. Moreover, we show that this framework can be tailored to batch learning algorithms that can be implemented with one pass over the training dataset.
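For intuition about the E_\gamma-divergence the abstract refers to (this sketch is an illustration of the standard definition, not code from the paper): for discrete distributions P and Q, E_\gamma(P||Q) = \sum_x \max(P(x) - \gamma Q(x), 0), and a mechanism is (\epsilon, \delta)-differentially private exactly when E_{e^\epsilon}(P||Q) \le \delta for the output distributions P, Q induced by any pair of neighboring inputs. The binary randomized-response mechanism below is our own choice of example.

```python
import numpy as np

def e_gamma(p, q, gamma):
    """E_gamma (hockey-stick) divergence between discrete distributions
    p and q given as probability vectors over the same finite alphabet."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.maximum(p - gamma * q, 0.0).sum())

# Example mechanism: binary randomized response with privacy parameter eps.
# Each bit is reported truthfully with probability e^eps / (1 + e^eps).
eps = 1.0
p_keep = np.exp(eps) / (1.0 + np.exp(eps))
P = np.array([p_keep, 1.0 - p_keep])  # output distribution when the input bit is 0
Q = np.array([1.0 - p_keep, p_keep])  # output distribution when the input bit is 1

# (eps, delta)-DP  <=>  E_{e^eps}(P || Q) <= delta over all neighboring inputs.
delta = e_gamma(P, Q, np.exp(eps))
print(delta)  # essentially 0: randomized response at this rate is (eps, 0)-DP

# Sanity check: at gamma = 1, E_gamma reduces to total variation distance.
tv = 0.5 * np.abs(P - Q).sum()
print(e_gamma(P, Q, 1.0), tv)
```

Note that E_1 recovers total variation distance, which is why generalizing Dobrushin's TV contraction coefficient to E_\gamma is the natural route to differential privacy bounds.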