Document type:
Report and critical book review
Title:
Model based Bayesian compressive sensing via Local Beta Process
Author(s):
Yu, Lei [Author]
Sun, Hong [Author]
Zheng, Gang [Author]
Non-Asymptotic estimation for online systems [NON-A]
Laboratoire d'Automatique, Génie Informatique et Signal [LAGIS]
Barbot, Jean-Pierre [Author]
Électronique et Commande des Systèmes Laboratoire [ECS-Lab]
Non-Asymptotic estimation for online systems [NON-A]
Journal title:
Signal Processing
Pages:
259–271
Publisher:
Elsevier
Publication date:
2015-03-30
ISSN :
0165-1684
HAL discipline(s):
Computer Science [cs] / Signal and Image Processing [eess.SP]
Abstract (English): [en]
In the framework of Compressive Sensing (CS), the inherent structures underlying sparsity patterns can be exploited to improve reconstruction accuracy and robustness. This consideration leads to a new extension of CS, called model based CS. In this paper, we propose a general statistical framework for model based CS in which sparsity and structure priors are considered simultaneously. By exploiting Latent Variable Analysis (LVA), a sparse signal is split into weight variables representing the values of its elements and latent variables indicating their labels. The Gamma-Gaussian model is then used to describe the weight variables and induce sparsity, while a beta process is assumed on each local cluster to describe the inherent structures. Since the complete model is an extension of Bayesian CS and the process captures local properties, it is called Model based Bayesian CS via Local Beta Process (MBCS-LBP). Moreover, the beta process is the Bayesian conjugate prior of the Bernoulli process, just as the Gamma distribution is of the Gaussian; this allows an analytical posterior inference through variational Bayes and hence leads to a deterministic VB-EM iterative algorithm.
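The tractability claim in the abstract rests on two conjugate prior pairs: beta/Bernoulli for the support labels and Gamma/Gaussian for the weight precisions. The sketch below is not the authors' MBCS-LBP algorithm; it is a minimal, self-contained illustration of the closed-form posterior updates that such conjugacy provides, which is what makes a deterministic VB-EM scheme possible.

```python
# Hedged sketch: the two conjugate updates the abstract relies on.
# Illustration only; the paper's full VB-EM inference is more involved.

def beta_bernoulli_update(a, b, z):
    """Beta(a, b) prior on a Bernoulli rate, observations z in {0, 1}.
    Conjugacy gives a closed-form Beta posterior."""
    ones = sum(z)
    return a + ones, b + len(z) - ones

def gamma_gaussian_update(c, d, x):
    """Gamma(c, d) prior on the precision of a zero-mean Gaussian.
    Conjugacy gives a closed-form Gamma posterior."""
    return c + len(x) / 2.0, d + sum(v * v for v in x) / 2.0

# Support labels of one local cluster (mostly active elements):
print(beta_bernoulli_update(1.0, 1.0, [1, 1, 0, 1]))  # (4.0, 2.0)
# Weight samples updating the precision hyperparameters:
print(gamma_gaussian_update(1e-6, 1e-6, [2.0, -2.0]))
```

Because each posterior stays in the prior's family, every VB-EM iteration reduces to recomputing such hyperparameters, with no sampling required.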
Language:
English
Popular science:
No