
Not All Noise Is Accounted Equally: How Differentially Private Learning Benefits From Large Sampling Rates

Research output: Contribution to book/anthology/report/proceeding › Article in proceedings › Research › peer-review

Learning often involves sensitive data, and as such, privacy-preserving extensions to Stochastic Gradient Descent (SGD) and other machine learning algorithms have been developed using the definitions of Differential Privacy (DP). In differentially private SGD, the gradients computed at each training iteration are subject to two different types of noise: first, inherent sampling noise arising from the use of minibatches; second, additive Gaussian noise from the underlying mechanisms that introduce privacy. In this study, we show that these two types of noise are equivalent in their effect on the utility of private neural networks; however, they are not accounted for equally in the privacy budget. Given this observation, we propose a training paradigm that shifts the proportions of noise towards less inherent and more additive noise, such that more of the overall noise can be accounted for in the privacy budget. With this paradigm, we are able to improve on the state-of-the-art in the privacy/utility tradeoff of private end-to-end CNNs.
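To make the two noise sources concrete, the following is a minimal sketch of a standard DP-SGD update (per-example gradient clipping plus calibrated Gaussian noise), not the paper's implementation; the linear-regression loss, function name, and all parameter values are illustrative assumptions. The sampling noise comes from the minibatch itself; the additive noise is the Gaussian term whose scale `noise_multiplier * clip_norm` is what enters the privacy accounting.

```python
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One DP-SGD step for linear regression with squared loss.

    Per-example gradients are clipped to L2 norm <= clip_norm, summed,
    perturbed with Gaussian noise of std noise_multiplier * clip_norm
    (the additive, privacy-accounted noise), then averaged over the batch.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n = X.shape[0]
    # Per-example gradient of 0.5 * (x.w - y)^2 is (x.w - y) * x
    residuals = X @ w - y                      # shape (n,)
    grads = residuals[:, None] * X             # shape (n, d)
    # Clip each example's gradient to L2 norm <= clip_norm
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    grads = grads * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    # Sum, add calibrated Gaussian noise, then average over the batch
    noisy_sum = grads.sum(axis=0) + rng.normal(
        scale=noise_multiplier * clip_norm, size=w.shape)
    return w - lr * noisy_sum / n
```

Under this update, enlarging the batch (the paper's large sampling rates) shrinks the inherent minibatch noise while the per-step additive noise remains the accounted-for component.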

Original language: English
Title of host publication: 2021 IEEE 31st International Workshop on Machine Learning for Signal Processing, MLSP 2021
Number of pages: 6
Publisher: IEEE
Publication year: 2021
ISBN (print): 978-1-6654-1184-4
ISBN (electronic): 978-1-7281-6338-3
Publication status: Published - 2021
Event: IEEE International Workshop on Machine Learning for Signal Processing - Gold Coast, Australia
Duration: 25 Oct 2021 - 28 Oct 2021
https://2021.ieeemlsp.org/

Conference

Conference: IEEE International Workshop on Machine Learning for Signal Processing
Country: Australia
City: Gold Coast
Period: 25/10/2021 - 28/10/2021
Internet address: https://2021.ieeemlsp.org/
Series: IEEE Workshop on Machine Learning for Signal Processing
Volume: 2021
ISSN: 1551-2541

Research areas

• Deep Learning, Differential Privacy, Gradient Noise, Privacy, Stochastic Gradient Descent


ID: 221309076