A PAC-Bayes bound for deterministic classifiers
Document type:
Preprint or working paper
Title:
A PAC-Bayes bound for deterministic classifiers
Author(s):
Clerico, Eugenio [Auteur]
University of Oxford
Deligiannidis, George [Auteur]
University of Oxford
Guedj, Benjamin [Auteur]
University College London [UCL]
Department of Computer Science [University College London] [UCL-CS]
The Inria London Programme [Inria-London]
MOdel for Data Analysis and Learning [MODAL]
Doucet, Arnaud [Auteur]
University of Oxford
HAL discipline(s):
Statistics [stat]/Machine Learning [stat.ML]
Computer Science [cs]/Machine Learning [cs.LG]
Statistics [stat]/Theory [stat.TH]
Abstract (in English): [en]
We establish a disintegrated PAC-Bayesian bound for classifiers that are trained via continuous-time (non-stochastic) gradient descent. Contrary to what is standard in the PAC-Bayesian setting, our result applies to a training algorithm that is deterministic, conditioned on a random initialisation, without requiring any $\textit{de-randomisation}$ step. We provide a broad discussion of the main features of the bound that we propose, and we study analytically and empirically its behaviour on linear models, finding promising results.
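The training setup the abstract describes can be illustrated with a small sketch: continuous-time (non-stochastic) gradient descent, i.e. gradient flow, approximated by small Euler steps, applied to a linear classifier whose only source of randomness is a Gaussian initialisation. This is an illustrative assumption-laden toy, not the paper's construction; the data, the logistic surrogate loss, and all names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data with labels y in {-1, +1} (illustrative only).
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
y = np.sign(X @ w_true)

def loss_grad(w):
    # Gradient of the mean logistic loss (1/n) sum_i log(1 + exp(-y_i x_i.w)).
    margins = y * (X @ w)
    return -(X.T @ (y / (1.0 + np.exp(margins)))) / len(y)

# Random initialisation: the only randomness. Given w(0), the dynamics
# w'(t) = -grad L(w(t)) are deterministic; Euler steps approximate them.
w = rng.normal(size=5)
dt, n_steps = 0.01, 2000  # step size and number of Euler steps
for _ in range(n_steps):
    w = w - dt * loss_grad(w)

train_error = np.mean(np.sign(X @ w) != y)
```

Because the map from initialisation to trained classifier is deterministic, the distribution of the trained predictor is simply the pushforward of the initialisation distribution through the (approximated) gradient flow, which is the situation the disintegrated bound addresses without a de-randomisation step.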
Language:
English
Files:
- document: 2209.02525.pdf (open access)