Monte Carlo with kernel-based Gibbs measures: Guarantees for probabilistic herding
Document type :
Preprint or working paper
Title :
Monte Carlo with kernel-based Gibbs measures: Guarantees for probabilistic herding
Author(s) :
Rouault, Martin [Auteur correspondant]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Bardenet, Remi [Auteur]
Maïda, Mylène [Auteur]
Laboratoire Paul Painlevé - UMR 8524 [LPP]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Publication date :
2024-02-20
English keyword(s) :
Monte Carlo integration
Interacting particle systems
Concentration inequality
HAL domain(s) :
Computer Science [cs]/Machine Learning [cs.LG]
Mathematics [math]/Probability [math.PR]
Statistics [stat]/Machine Learning [stat.ML]
English abstract : [en]
Kernel herding belongs to a family of deterministic quadratures that seek to minimize the worst-case integration error over a reproducing kernel Hilbert space (RKHS). In spite of strong experimental support, it has proven difficult to show that this worst-case error decreases at a faster rate than the standard square root of the number of quadrature nodes, at least in the usual case where the RKHS is infinite-dimensional. In this theoretical paper, we study a joint probability distribution over quadrature nodes, whose support tends to minimize the same worst-case error as kernel herding. We prove that it does outperform i.i.d. Monte Carlo, in the sense of coming with a tighter concentration inequality on the worst-case integration error. While not yet improving the rate, this demonstrates that the mathematical tools from the study of Gibbs measures can help understand to what extent kernel herding and its variants improve on computationally cheaper methods. Moreover, we provide early experimental evidence that a faster rate of convergence, though not worst-case, is likely.
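For context, the kernel herding baseline that the abstract refers to can be sketched as a greedy quadrature rule: each new node maximizes the target's kernel mean embedding minus the average kernel similarity to nodes already chosen. The sketch below is a minimal illustration, not the paper's probabilistic Gibbs-measure variant; the Gaussian kernel, the standard normal target (for which the mean embedding has a closed form), the bandwidth `sigma`, the candidate grid, and the `1/t` normalization are all illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel k(x, y)."""
    return np.exp(-(x - y) ** 2 / (2 * sigma ** 2))

def mean_embedding(x, sigma=1.0):
    """Closed-form E_{y ~ N(0,1)}[k(x, y)] for the Gaussian kernel
    and a standard normal target (Gaussian convolution identity)."""
    return np.sqrt(sigma ** 2 / (sigma ** 2 + 1)) * np.exp(-x ** 2 / (2 * (sigma ** 2 + 1)))

def kernel_herding(n_nodes, sigma=1.0, grid=None):
    """Greedy herding: each node maximizes the mean embedding minus
    the average similarity to previously selected nodes.  The argmax
    is taken over a finite candidate grid for simplicity."""
    if grid is None:
        grid = np.linspace(-4.0, 4.0, 2001)
    nodes = []
    for t in range(n_nodes):
        score = mean_embedding(grid, sigma)
        if nodes:
            # Other normalizations (e.g. 1/(t+1)) appear in the
            # literature; the mean over chosen nodes is one variant.
            score = score - gaussian_kernel(
                grid[:, None], np.array(nodes)[None, :], sigma
            ).mean(axis=1)
        nodes.append(grid[np.argmax(score)])
    return np.array(nodes)

nodes = kernel_herding(50)
# By symmetry of the target, the empirical mean of the herded
# nodes should be close to 0.
print(nodes.mean())
```

The resulting nodes form an equal-weight quadrature rule whose empirical measure approximates the target in maximum mean discrepancy; the paper studies a randomized counterpart in which the nodes are drawn jointly from a Gibbs measure concentrating on such low-discrepancy configurations.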
Language :
English
Files :
- 2402.11736.pdf (open access)