Document type:
Preprint or working paper
Title:
Monte Carlo with kernel-based Gibbs measures: Guarantees for probabilistic herding
Author(s):
Rouault, Martin [Corresponding author]
Bardenet, Remi [Author]
Maïda, Mylène [Author]
Laboratoire Paul Painlevé - UMR 8524 [LPP]
Publication date:
2024-02-20
English keyword(s):
Monte Carlo integration
Interacting particle systems
Concentration inequality
HAL discipline(s):
Computer Science [cs]/Machine Learning [cs.LG]
Mathematics [math]/Probability [math.PR]
Statistics [stat]/Machine Learning [stat.ML]
Abstract (in English):
Kernel herding belongs to a family of deterministic quadratures that seek to minimize the worst-case integration error over a reproducing kernel Hilbert space (RKHS). In spite of strong experimental support, it has proved difficult to show that this worst-case error decreases at a faster rate than the standard square root of the number of quadrature nodes, at least in the usual case where the RKHS is infinite-dimensional. In this theoretical paper, we study a joint probability distribution over quadrature nodes, whose support tends to minimize the same worst-case error as kernel herding. We prove that it does outperform i.i.d. Monte Carlo, in the sense of coming with a tighter concentration inequality on the worst-case integration error. While not yet improving the rate, this demonstrates that the mathematical tools used in the study of Gibbs measures can help understand to what extent kernel herding and its variants improve on computationally cheaper methods. Moreover, we provide early experimental evidence that a faster rate of convergence, though not worst-case, is likely.
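The kernel herding scheme the abstract refers to can be illustrated by its standard greedy form: nodes are picked one at a time to reduce the maximum mean discrepancy to the target measure. The following is a minimal sketch, not the paper's method; the Gaussian kernel, the finite candidate set standing in for the target measure, and all function names are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    """Illustrative Gaussian (RBF) kernel between two point sets."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth ** 2))

def kernel_herding(candidates, kernel, n_nodes):
    """Greedy kernel herding sketch over a finite candidate set.

    At step t+1, pick x maximizing
        mu(x) - (1 / (t + 1)) * sum_{i <= t} k(x, x_i),
    where mu is the kernel mean embedding of the target measure,
    here approximated by the empirical measure on `candidates`.
    """
    K = kernel(candidates, candidates)   # Gram matrix on the candidate set
    mean_embedding = K.mean(axis=1)      # empirical estimate of mu(x)
    selected = []
    for t in range(n_nodes):
        if selected:
            # repulsion from nodes already chosen
            penalty = K[:, selected].sum(axis=1) / (t + 1)
        else:
            penalty = 0.0
        scores = mean_embedding - penalty
        selected.append(int(np.argmax(scores)))
    return candidates[selected]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
nodes = kernel_herding(X, gaussian_kernel, 10)  # 10 quadrature nodes in 2D
```

The paper's object of study is instead a joint (Gibbs-type) probability distribution over the nodes whose support tends to minimize the same worst-case error, rather than this deterministic greedy selection.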
Language:
English
ANR project:
Collections:
Source:
Files:
- 2402.11736.pdf (open access)