Differentially Private Coordinate Descent for Composite Empirical Risk Minimization
Document type :
Conference paper with proceedings
Title :
Differentially Private Coordinate Descent for Composite Empirical Risk Minimization
Author(s) :
Mangold, Paul [Auteur]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Machine Learning in Information Networks [MAGNET]
Bellet, Aurelien [Auteur]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Machine Learning in Information Networks [MAGNET]
Salmon, Joseph [Auteur]
Scientific Data Management [ZENITH]
Institut universitaire de France [IUF]
Institut Montpelliérain Alexander Grothendieck [IMAG]
Tommasi, Marc [Auteur]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Machine Learning in Information Networks [MAGNET]
Scientific editor(s) :
Kamalika Chaudhuri
Stefanie Jegelka
Le Song
Csaba Szepesvari
Gang Niu
Sivan Sabato
Conference title :
ICML 2022 - 39th International Conference on Machine Learning
City :
Baltimore
Country :
United States of America
Start date of the conference :
2022-07-17
Book title :
Proceedings of the 39th International Conference on Machine Learning
Publisher :
PMLR
Publication date :
2022
HAL domain(s) :
Computer Science [cs]/Machine Learning [cs.LG]
Statistics [stat]/Machine Learning [stat.ML]
English abstract : [en]
Machine learning models can leak information about the data used to train them. To mitigate this issue, Differentially Private (DP) variants of optimization algorithms like Stochastic Gradient Descent (DP-SGD) have been designed to trade off utility for privacy in Empirical Risk Minimization (ERM) problems. In this paper, we propose Differentially Private proximal Coordinate Descent (DP-CD), a new method to solve composite DP-ERM problems. We derive utility guarantees through a novel theoretical analysis of inexact coordinate descent. Our results show that, thanks to larger step sizes, DP-CD can exploit imbalance in gradient coordinates to outperform DP-SGD. We also prove new lower bounds for composite DP-ERM under coordinate-wise regularity assumptions, which are nearly matched by DP-CD. For practical implementations, we propose to clip gradients using coordinate-wise thresholds that emerge from our theory, avoiding costly hyperparameter tuning. Experiments on real and synthetic data support our results, and show that DP-CD compares favorably with DP-SGD.
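The abstract describes a method that combines proximal coordinate descent with coordinate-wise gradient clipping and Gaussian noise. The following is a minimal illustrative sketch of that general idea for L1-regularized least squares, not the paper's actual algorithm: all names, the choice of clipping thresholds proportional to the coordinate-wise smoothness constants, and the noise calibration are assumptions made here for the example.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * |x| (L1 regularizer).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def dp_cd_sketch(X, y, n_iters=200, lam=0.01, sigma=1.0, seed=0):
    """Hypothetical sketch of DP proximal coordinate descent for
    L1-regularized least squares. Parameter choices are illustrative,
    not the ones derived in the paper."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # Coordinate-wise smoothness constants for the least-squares loss.
    M = (X ** 2).mean(axis=0) + 1e-12
    # Assumed coordinate-wise clipping thresholds (proportional to sqrt(M_j)).
    c = np.sqrt(M)
    w = np.zeros(p)
    for _ in range(n_iters):
        j = rng.integers(p)
        # Per-example gradient for coordinate j, clipped coordinate-wise.
        g_i = (X @ w - y) * X[:, j]
        g_i = np.clip(g_i, -c[j], c[j])
        # Gaussian noise scaled to the sensitivity 2*c_j / n of the clipped mean.
        g = g_i.mean() + rng.normal(0.0, 2.0 * c[j] * sigma / n)
        # Coordinate-wise step size 1/M_j, then proximal (soft-threshold) step.
        step = 1.0 / M[j]
        w[j] = soft_threshold(w[j] - step * g, step * lam)
    return w
```

Note the coordinate-wise step sizes 1/M_j: this is the mechanism the abstract alludes to when it says DP-CD can take larger steps along coordinates where the gradient is smoother.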
Language :
English
Peer reviewed article :
Yes
Audience :
International
Popular science :
No
ANR Project :
Collections :
Source :
Files :
- https://hal.inria.fr/hal-03424974v3/document (Open access)
- http://arxiv.org/pdf/2110.11688 (Open access)