Learning via Surrogate PAC-Bayes
Document type :
Conference paper (with proceedings)
Title :
Learning via Surrogate PAC-Bayes
Author(s) :
Picard-Weibel, Antoine [Corresponding author]
MOdel for Data Analysis and Learning [MODAL]
Laboratoire Paul Painlevé - UMR 8524 [LPP]
Centre International de Recherche Sur l'Eau et l'Environnement [Suez] [CIRSEE]
Moscoviz, Roman [Author]
Centre International de Recherche Sur l'Eau et l'Environnement [Suez] [CIRSEE]
Guedj, Benjamin [Author]
MOdel for Data Analysis and Learning [MODAL]
The Inria London Programme [Inria-London]
The Alan Turing Institute
Inria Lille - Nord Europe
Institut National de Recherche en Informatique et en Automatique [Inria]
Department of Computer Science [University College of London] [UCL-CS]
University College of London [London] [UCL]
Conference title :
NeurIPS 2024
City :
Vancouver
Country :
Canada
Start date of the conference :
2024-12-09
Publication date :
2024-10-11
English keyword(s) :
Meta Learning
Computational cost reduction
Physical model
PAC-Bayes
Optimisation
HAL domain(s) :
Statistics [stat]/Machine Learning [stat.ML]
Computer Science [cs]/Machine Learning [cs.LG]
English abstract : [en]
PAC-Bayes learning is a comprehensive setting for (i) studying the generalisation ability of learning algorithms and (ii) deriving new learning algorithms by optimising a generalisation bound. However, optimising generalisation bounds might not always be viable for tractable or computational reasons, or both. For example, iteratively querying the empirical risk might prove computationally expensive. In response, we introduce a novel principled strategy for building an iterative learning algorithm via the optimisation of a sequence of surrogate training objectives, inherited from PAC-Bayes generalisation bounds. The key argument is to replace the empirical risk (seen as a function of hypotheses) in the generalisation bound by its projection onto a constructible low dimensional functional space: these projections can be queried much more efficiently than the initial risk. On top of providing that generic recipe for learning via surrogate PAC-Bayes bounds, we (i) contribute theoretical results establishing that iteratively optimising our surrogates implies the optimisation of the original generalisation bounds, (ii) instantiate this strategy to the framework of meta-learning, introducing a meta-objective offering a closed form expression for meta-gradient, (iii) illustrate our approach with numerical experiments inspired by an industrial biochemical problem.
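To make the abstract's key idea concrete, here is a minimal toy sketch, not the paper's actual algorithm: an "expensive" empirical risk is queried at only a few anchor hypotheses, projected by least squares onto a low-dimensional polynomial basis, and the resulting cheap surrogate is then optimised together with a complexity penalty standing in for a PAC-Bayes bound term. All names (`empirical_risk`, `surrogate_risk`, the anchor grid, the penalty weight) are illustrative assumptions.

```python
import numpy as np

# Toy stand-in for an expensive empirical risk over scalar hypotheses h.
def empirical_risk(h):
    return (h - 1.5) ** 2 + 0.1 * np.sin(5 * h)

# Query the risk at a handful of anchors only, then project it onto the
# low-dimensional functional space span{1, h, h^2} by least squares.
anchors = np.linspace(-2.0, 4.0, 7)
Phi = np.vander(anchors, 3, increasing=True)   # columns: 1, h, h^2
coeffs, *_ = np.linalg.lstsq(Phi, empirical_risk(anchors), rcond=None)

def surrogate_risk(h):
    # Cheap polynomial surrogate replacing further risk queries.
    return coeffs @ np.array([1.0, h, h * h])

# PAC-Bayes-flavoured training objective: surrogate risk plus a toy
# complexity penalty (here a quadratic pull towards the origin).
def objective(h, lam=0.05):
    return surrogate_risk(h) + lam * h * h

grid = np.linspace(-2.0, 4.0, 401)
h_star = grid[np.argmin([objective(h) for h in grid])]
print(round(float(h_star), 2))
```

In the paper's setting the projection is what makes iterative optimisation affordable: each surrogate can be minimised with many cheap queries, whereas the original risk is only evaluated to refresh the projection.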
Language :
English
Peer reviewed article :
Yes
Audience :
International
Popular science :
No
Files :
- surrogate_pac_bayes_arxiv.pdf (Open access)
- 2410.10230 (Open access)