Document type :
Book review and critical review
Title :
Streaming kernel regression with provably adaptive mean, variance, and regularization
Author(s) :
Durand, Audrey [Author]
Université Laval [Québec] [ULaval]
Maillard, Odalric Ambrym [Author]
Sequential Learning [SEQUEL]
Pineau, Joelle [Author]
McGill University = Université McGill [Montréal, Canada]
Journal title :
Journal of Machine Learning Research
Pages :
1 - 48
Publisher :
Microtome Publishing
Publication date :
2018
ISSN :
1532-4435
English keyword(s) :
kernel
regression
online learning
adaptive tuning
bandits
HAL domain(s) :
Mathematics [math]/Statistics [math.ST]
Statistics [stat]/Machine Learning [stat.ML]
English abstract : [en]
We consider the problem of streaming kernel regression, where the observations arrive sequentially and the goal is to recover the underlying mean function, assumed to belong to an RKHS. The variance of the noise is not assumed to be known. In this context, we tackle the problem of tuning the regularization parameter adaptively at each time step, while maintaining tight confidence bound estimates on the value of the mean function at each point. To this end, we first generalize existing results for finite-dimensional linear regression with fixed regularization and known variance to the kernel setup, with a regularization parameter allowed to be a measurable function of past observations. Then, using appropriate self-normalized inequalities, we build upper and lower bound estimates for the variance, leading to Bernstein-like concentration bounds. The latter are then used to define the adaptive regularization. The bounds resulting from our technique are valid uniformly over all observation points and all time steps, and are compared against the literature with numerical experiments. Finally, the potential of these tools is illustrated by an application to kernelized bandits, where we revisit the Kernel UCB and Kernel Thompson Sampling procedures and show the benefits of the novel adaptive kernel tuning strategy.
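To make the setting described in the abstract concrete, below is a minimal Python sketch (not the authors' algorithm) of streaming kernel ridge regression with upper-confidence-bound scores, in the spirit of kernelized UCB. The RBF kernel, the fixed regularization lam, and the fixed confidence width beta are illustrative assumptions only; the paper's contribution is precisely to tune the regularization and the confidence bounds adaptively, with unknown noise variance.

# Illustrative sketch: streaming kernel ridge regression with UCB scores.
# Assumptions (not from the paper): RBF kernel, fixed lam and beta.
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0):
    """Gaussian (RBF) kernel matrix between point sets a (n, d) and b (m, d)."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

class StreamingKernelRegressor:
    def __init__(self, lam=1.0, beta=2.0, lengthscale=1.0):
        self.lam, self.beta, self.ls = lam, beta, lengthscale
        self.X, self.y = None, None

    def update(self, x, y):
        """Append one streaming observation (x: shape (d,), y: float)."""
        x = x[None, :]
        self.X = x if self.X is None else np.vstack([self.X, x])
        self.y = np.array([y]) if self.y is None else np.append(self.y, y)

    def predict(self, Xq):
        """Regularized kernel mean estimate and confidence width at Xq (m, d)."""
        if self.X is None:
            return np.zeros(len(Xq)), np.ones(len(Xq))
        K = rbf_kernel(self.X, self.X, self.ls)
        Kq = rbf_kernel(Xq, self.X, self.ls)
        A = K + self.lam * np.eye(len(self.X))
        mean = Kq @ np.linalg.solve(A, self.y)
        # Width from the posterior-style variance k(x, x) - k_q A^{-1} k_q^T.
        var = rbf_kernel(Xq, Xq, self.ls).diagonal() - np.einsum(
            "ij,ji->i", Kq, np.linalg.solve(A, Kq.T))
        return mean, np.sqrt(np.maximum(var, 0.0))

    def ucb(self, Xq):
        """Upper confidence bound used to pick the next point in a bandit loop."""
        mean, width = self.predict(Xq)
        return mean + self.beta * width

In a kernelized bandit loop, one would select the next point by maximizing ucb() over candidates and call update() with the observed reward; a Thompson Sampling variant would instead sample from the implied posterior rather than taking an optimistic bound.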
Language :
English
Popular science :
No
Collections :
Source :
Files
- https://hal.archives-ouvertes.fr/hal-01927007/document (Open access)
- http://arxiv.org/pdf/1708.00768 (Open access)