Simpler PAC-Bayesian Bounds for Hostile Data
Document type :
Journal article
Permalink :
Title :
Simpler PAC-Bayesian Bounds for Hostile Data
Author(s) :
Journal title :
Machine Learning
Volume number :
107
Pages :
887–902
Publisher :
Springer Verlag
Publication date :
2018
ISSN :
0885-6125
Keyword(s) :
F-divergence
Oracle inequalities
Dependent and unbounded data
PAC-Bayesian theory
HAL domain(s) :
Statistics [stat]/Machine Learning [stat.ML]
English abstract : [en]
PAC-Bayesian learning bounds are of the utmost interest to the learning community. Their role is to connect the generalization ability of an aggregation distribution $\rho$ to its empirical risk and to its Kullback-Leibler divergence with respect to some prior distribution $\pi$. Unfortunately, most of the available bounds typically rely on heavy assumptions such as boundedness and independence of the observations. This paper aims at relaxing these constraints and provides PAC-Bayesian learning bounds that hold for dependent, heavy-tailed observations (hereafter referred to as \emph{hostile data}). In these bounds the Kullback-Leibler divergence is replaced with a general version of Csiszár's $f$-divergence. We prove a general PAC-Bayesian bound, and show how to use it in various hostile settings.
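For context on the divergence mentioned in the abstract: the standard definition of Csiszár's $f$-divergence (the paper works with a general version of it) is, for a convex function $f$ with $f(1)=0$,

```latex
D_f(\rho \,\|\, \pi) \;=\; \int f\!\left(\frac{\mathrm{d}\rho}{\mathrm{d}\pi}\right)\mathrm{d}\pi ,
\qquad f \text{ convex},\; f(1)=0 .
```

The Kullback-Leibler divergence is the special case $f(x) = x\log x$, which is why replacing it by a general $f$-divergence strictly generalizes the classical PAC-Bayesian bounds.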
Language :
English
Audience :
International
Popular science :
No
Administrative institution(s) :
CNRS
Université de Lille
Submission date :
2020-06-08T14:11:19Z
2020-06-09T09:17:15Z
Files :
- Open access