Talks

  1. "Information Theory for Responsbile Machine Learning", The Mathematical Institute, Leiden University, 2021.
  2. "Local Differential Pricacy is Equivalet to the Contraction of Hockey-Stick Divergence", Privacy Tools Meeting, Harvard University, January 2021. [Invited talk] [slides]
  3. "Privacy Analysis of Iterative Algorithms via f-divergences", at Google Research, January 2021. [Invited talk] [slides]
  4. "Three Variants of Differential Privacy: Lossless Conversion and Applications" at Theory and Practice of Differential Privacy (TPDP), 2020.
  5. "Differentially private federated learning: An information-theoretic perspective", at International Workshop on Federated Learning for User Privacy and Data Confidentiality (FL-ICML), 2020.
  6. "Contraction coefficients of Markov kernels with applications in privacy amplification", at Information Theory and Application (ITA), 2020. [Invited talk]
  7. "A better bound gives a hundred rounds: Enhanced privacy guarantees via f-divergences": at Privacy Tools Meeting, Harvard University, 2019.