Document type:
Journal article: Original article
Title:
Still no free lunches: the price to pay for tighter PAC-Bayes bounds
Author(s):
Guedj, Benjamin [Author]
The Inria London Programme [Inria-London]
MOdel for Data Analysis and Learning [MODAL]
Inria-CWI [Inria-CWI]
Department of Computer Science [University College London] [UCL-CS]
University College London [London] [UCL]
Pujol, Louis [Author]
Université Paris-Saclay

Journal:
Entropy
Publisher:
MDPI
Publication date:
2021
ISSN:
1099-4300
HAL discipline(s):
Statistics [stat]/Machine Learning [stat.ML]
Computer Science [cs]/Machine Learning [cs.LG]
Statistics [stat]/Theory [stat.TH]
Abstract [en]:
"No free lunch" results state the impossibility of obtaining meaningful bounds on the error of a learning algorithm without prior assumptions and modelling. Some models are expensive (strong assumptions, such as subgaussian tails), others are cheap (simply finite variance). As is well known, the more you pay, the more you get: in other words, the most expensive models yield the most interesting bounds. Recent advances in robust statistics have investigated procedures to obtain tight bounds while keeping the cost minimal. The present paper explores and exhibits what the limits are for obtaining tight PAC-Bayes bounds in a robust setting for cheap models, addressing the question: is PAC-Bayes good value for money?
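For background (stated here for context, not taken from the paper itself), a classical PAC-Bayes bound of the McAllester type illustrates the kind of guarantee at stake. Under the "expensive" assumption of a loss bounded in [0, 1], with probability at least 1 − δ over an i.i.d. sample of size n, simultaneously for all posteriors ρ over hypotheses (with π a fixed prior):

```latex
\mathbb{E}_{h\sim\rho}\big[R(h)\big]
\;\le\;
\mathbb{E}_{h\sim\rho}\big[\hat{R}_n(h)\big]
+ \sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln\frac{2\sqrt{n}}{\delta}}{2n}}
```

Here R is the population risk and \hat{R}_n the empirical risk. Relaxing the boundedness assumption to, say, mere finite variance (the "cheap" model of the abstract) is precisely the robust setting in which the paper asks whether comparably tight PAC-Bayes bounds remain attainable.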
Language:
English
Peer-reviewed:
Yes
Audience:
International
Popular science:
No
Files:
- 1910.04460.pdf (open access)