Small variance asymptotics and Bayesian nonparametrics for dictionary learning
Document type:
Conference paper with proceedings
Title:
Small variance asymptotics and Bayesian nonparametrics for dictionary learning
Author(s):
Elvira, Clément [Author]
Parcimonie et Nouveaux Algorithmes pour le Signal et la Modélisation Audio [PANAMA]
Dang, Hong-Phuong [Corresponding author]
Centre de Recherche en Économie et Statistique [CREST]
Ecole Nationale de la Statistique et de l'Analyse de l'Information [Bruz] [ENSAI]
Chainais, Pierre [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Parcimonie et Nouveaux Algorithmes pour le Signal et la Modélisation Audio [PANAMA]
Conference title:
EUSIPCO 2018 - 26th European Signal Processing Conference
City:
Rome
Country:
Italy
Conference start date:
2018-09-03
Keywords (in English):
Indian Buffet Process
sparse representations
Bayesian nonparametrics
small variance asymptotic
inverse problems
dictionary learning
HAL discipline(s):
Computer Science [cs]/Signal and Image Processing [eess.SP]
Computer Science [cs]/Image Processing [eess.IV]
Computer Science [cs]/Machine Learning [cs.LG]
Statistics [stat]/Machine Learning [stat.ML]
English abstract: [en]
Bayesian nonparametrics (BNP) is an appealing framework for inferring the complexity of a model along with its parameters. To this aim, sampling or variational methods are often used for inference. However, these methods come with numerical disadvantages for large-scale data. An alternative approach is to relax the probabilistic model into a non-probabilistic formulation, which yields a scalable algorithm. One limitation of BNP approaches can be the cost of Monte Carlo sampling for inference. Small-variance asymptotic (SVA) approaches pave the way to much cheaper, though approximate, inference methods by benefiting from a fruitful interaction between Bayesian models and optimization algorithms. In brief, SVA lets the variance of the noise (or residual error) distribution tend to zero in the optimization problem corresponding, for instance, to a MAP estimator with finite noise variance. We propose such an SVA analysis of a BNP dictionary learning (DL) approach that automatically adapts the size of the dictionary or the subspace dimension in an efficient way. Numerical experiments illustrate the efficiency of the proposed method.
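To illustrate the SVA mechanism described in the abstract, the following is a rough sketch and not the paper's own derivation; the notation (data Y, dictionary D, codes W, number of active atoms K, penalty weight λ) is assumed for illustration. With Gaussian noise of variance σ², the negative log-posterior is a residual term weighted by 1/(2σ²) plus prior terms; letting σ² tend to zero with suitably rescaled prior hyperparameters turns MAP estimation into a deterministic penalized least-squares problem, in the spirit of DP-means/BP-means style objectives:

\[
  -\log p(\mathbf{D}, \mathbf{W} \mid \mathbf{Y})
  \;=\; \frac{1}{2\sigma^{2}} \,\lVert \mathbf{Y} - \mathbf{D}\mathbf{W} \rVert_F^{2}
  \;-\; \log p(\mathbf{D}, \mathbf{W}) \;+\; \mathrm{const},
\]
\[
  \lim_{\sigma^{2} \to 0} \; 2\sigma^{2} \bigl( -\log p(\mathbf{D}, \mathbf{W} \mid \mathbf{Y}) \bigr)
  \;=\; \lVert \mathbf{Y} - \mathbf{D}\mathbf{W} \rVert_F^{2} \;+\; \lambda K,
\]

assuming the prior hyperparameters are rescaled so that their contribution behaves like λK/(2σ²) in the limit. Minimizing such a limiting objective by alternating updates of D, W, and K then replaces Monte Carlo sampling over the BNP model, which is the source of the computational savings mentioned above.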
Language:
English
Peer-reviewed:
Yes
Audience:
International
Popular science:
No
Collections:
Source:
Files
- https://hal.archives-ouvertes.fr/hal-01961852/document
- 2018_EUSIPCO.pdf
- Open access