Small variance asymptotics and Bayesian nonparametrics for dictionary learning
Document type :
Other scientific communication (conference without proceedings, poster, seminar, etc.): Communication in a conference with proceedings
Title :
Small variance asymptotics and Bayesian nonparametrics for dictionary learning
Author(s) :
Elvira, Clément [Author]
Parcimonie et Nouveaux Algorithmes pour le Signal et la Modélisation Audio [PANAMA]
Dang, Hong-Phuong [Corresponding author]
Ecole Nationale de la Statistique et de l'Analyse de l'Information [Bruz] [ENSAI]
Centre de Recherche en Économie et Statistique [CREST]
Chainais, Pierre [Author]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Parcimonie et Nouveaux Algorithmes pour le Signal et la Modélisation Audio [PANAMA]
Conference title :
EUSIPCO 2018 - 26th European Signal Processing Conference
City :
Rome
Country :
Italy
Start date of the conference :
2018-09-03
English keyword(s) :
Indian Buffet Process
sparse representations
Bayesian nonparametrics
small variance asymptotic
inverse problems
dictionary learning
HAL domain(s) :
Computer Science [cs]/Signal and Image Processing [eess.SP]
Computer Science [cs]/Image Processing [eess.IV]
Computer Science [cs]/Machine Learning [cs.LG]
Statistics [stat]/Machine Learning [stat.ML]
English abstract : [en]
Bayesian nonparametrics (BNP) is an appealing framework for inferring the complexity of a model along with its parameters. To this aim, sampling or variational methods are often used for inference. However, these methods come with numerical disadvantages for large-scale data. An alternative approach is to relax the probabilistic model into a non-probabilistic formulation that yields a scalable algorithm. One limitation of BNP approaches can be the cost of Monte Carlo sampling for inference. Small-variance asymptotic (SVA) approaches pave the way to much cheaper, though approximate, inference methods by benefiting from a fruitful interaction between Bayesian models and optimization algorithms. In brief, SVA lets the variance of the noise (or residual error) distribution tend to zero in the optimization problem corresponding to a MAP estimator with finite noise variance, for instance. We propose such an SVA analysis of a BNP dictionary learning (DL) approach that automatically adapts the size of the dictionary, or the subspace dimension, in an efficient way. Numerical experiments illustrate the efficiency of the proposed method.
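To make the SVA mechanism sketched in the abstract concrete, here is a minimal, generic illustration (it assumes a Gaussian noise model and a separable sparsity prior, and is not reproduced from the paper). With observations y = D x + \varepsilon, noise \varepsilon \sim \mathcal{N}(0, \sigma^2 I) and a prior p(x) \propto \exp(-\lambda_0\, g(x)), the MAP estimate of the code x solves

\hat{x} = \arg\min_{x} \; \frac{1}{2\sigma^2}\,\| y - D x \|_2^2 + \lambda_0\, g(x).

Rescaling the prior parameter as \lambda_0 = \lambda / (2\sigma^2) and letting the noise variance \sigma^2 \to 0 keeps the penalty in the limit and yields the non-probabilistic objective

\arg\min_{x} \; \| y - D x \|_2^2 + \lambda\, g(x),

which can be handled with standard scalable optimization. In BNP settings based on the Indian Buffet Process, the analogous limit typically adds a penalty on the number of active atoms, which is consistent with the automatic adaptation of the dictionary size mentioned in the abstract.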
Language :
English
Peer reviewed article :
Yes
Audience :
International
Popular science :
No
Collections :
Source :
Files
- https://hal.archives-ouvertes.fr/hal-01961852/document (Open access)
- 2018_EUSIPCO.pdf (Open access)