Still no free lunches: the price to pay ...
Document type :
Journal article: Original article
DOI :
Title :
Still no free lunches: the price to pay for tighter PAC-Bayes bounds
Author(s) :
Guedj, Benjamin [Author]
The Inria London Programme [Inria-London]
MOdel for Data Analysis and Learning [MODAL]
Inria-CWI [Inria-CWI]
Department of Computer science [University College of London] [UCL-CS]
University College of London [London] [UCL]
Pujol, Louis [Author]
Université Paris-Saclay
Journal title :
Entropy
Publisher :
MDPI
Publication date :
2021
ISSN :
1099-4300
HAL domain(s) :
Statistiques [stat]/Machine Learning [stat.ML]
Informatique [cs]/Apprentissage [cs.LG]
Statistiques [stat]/Théorie [stat.TH]
English abstract : [en]
"No free lunch" results state the impossibility of obtaining meaningful bounds on the error of a learning algorithm without prior assumptions and modelling. Some models are expensive (strong assumptions, such as as subgaussian ...
Show more >"No free lunch" results state the impossibility of obtaining meaningful bounds on the error of a learning algorithm without prior assumptions and modelling. Some models are expensive (strong assumptions, such as as subgaussian tails), others are cheap (simply finite variance). As it is well known, the more you pay, the more you get: in other words, the most expensive models yield the more interesting bounds. Recent advances in robust statistics have investigated procedures to obtain tight bounds while keeping the cost minimal. The present paper explores and exhibits what the limits are for obtaining tight PAC-Bayes bounds in a robust setting for cheap models, addressing the question: is PAC-Bayes good value for money?Show less >
Show more >"No free lunch" results state the impossibility of obtaining meaningful bounds on the error of a learning algorithm without prior assumptions and modelling. Some models are expensive (strong assumptions, such as as subgaussian tails), others are cheap (simply finite variance). As it is well known, the more you pay, the more you get: in other words, the most expensive models yield the more interesting bounds. Recent advances in robust statistics have investigated procedures to obtain tight bounds while keeping the cost minimal. The present paper explores and exhibits what the limits are for obtaining tight PAC-Bayes bounds in a robust setting for cheap models, addressing the question: is PAC-Bayes good value for money?Show less >
Language :
English
Peer reviewed article :
Yes
Audience :
International
Popular science :
No
ANR Project :
Collections :
Source :
Files :
- 1910.04460.pdf (Open access)