Document type:
Report and critical book review
Title:
Analysis of an early stopping rule based on the discrepancy principle for spectral filtering algorithms
Author(s):
Celisse, Alain [Author]
MOdel for Data Analysis and Learning [MODAL]
Statistique, Analyse et Modélisation Multidisciplinaire (SAmos-Marin Mersenne) [SAMM]
Wahl, Martin [Author]
Humboldt-Universität zu Berlin [HU Berlin]
Journal title:
Journal of Machine Learning Research
Publisher:
Microtome Publishing
Publication date:
2021-01-01
ISSN:
1532-4435
English keyword(s):
early stopping
discrepancy principle
non-parametric regression
spectral regularization
reproducing kernel Hilbert space
oracle inequality
effective dimension
HAL discipline(s):
Mathematics [math]/Statistics [math.ST]
Statistics [stat]
Statistics [stat]/Statistics Theory [stat.TH]
Statistics [stat]/Machine Learning [stat.ML]
English abstract:
We investigate the construction of early stopping rules in the non-parametric regression problem where iterative learning algorithms are used and the optimal iteration number is unknown. More precisely, we study the discrepancy principle, as well as modifications based on smoothed residuals, for kernelized spectral filter learning algorithms including gradient descent. Our main theoretical bounds are oracle inequalities established for the empirical estimation error (fixed design), and for the prediction error (random design). From these finite-sample bounds it follows that the classical discrepancy principle is statistically adaptive for slow rates occurring in the hard learning scenario, while the smoothed discrepancy principles are adaptive over ranges of faster rates (resp. higher smoothness parameters). Our approach relies on deviation inequalities for the stopping rules in the fixed design setting, combined with change-of-norm arguments to deal with the random design setting.
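The classical discrepancy principle discussed in the abstract can be sketched in a few lines: run gradient descent on a kernel least-squares problem and stop at the first iteration whose mean squared residual falls to a noise-level threshold. The sketch below is a minimal illustration, not the authors' procedure: the Gaussian kernel, its bandwidth, the threshold constant κ, and the toy data are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fixed-design regression data (assumption): noisy sine observations.
n = 100
x = np.linspace(0.0, 1.0, n)
sigma = 0.1
y = np.sin(2.0 * np.pi * x) + sigma * rng.normal(size=n)

# Gaussian kernel matrix, normalized by n (bandwidth 0.1 chosen ad hoc).
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2.0 * 0.1**2)) / n

# Gradient descent on the least-squares objective: f_t = f_{t-1} + eta*K(y - f_{t-1}).
# A step size below 1/lambda_max(K) keeps the residual norm non-increasing.
eta = 0.5 / np.linalg.eigvalsh(K).max()
f = np.zeros(n)

# Discrepancy principle: stop once the mean squared residual reaches
# tau = kappa * sigma^2 (kappa = 1.5 is a conservative choice for this sketch;
# in practice sigma^2 must be known or estimated).
kappa = 1.5
tau = kappa * sigma**2

t_stop = None
for t in range(1, 5001):
    residual = y - f
    if np.mean(residual**2) <= tau:
        t_stop = t
        break
    f = f + eta * (K @ residual)
```

Running longer would keep shrinking the residual toward zero (overfitting the noise); the discrepancy rule halts as soon as the fit is as close to the data as the noise level warrants, which is what makes it a data-driven substitute for the unknown optimal iteration number.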
Language:
English
Popularization:
No
Files:
- document (open access)
- 20-358.pdf (open access)
- 2004.08436 (open access)