Model based Bayesian compressive sensing via Local Beta Process
Document type :
Report and critical review of a work
Title :
Model based Bayesian compressive sensing via Local Beta Process
Author(s) :
Yu, Lei [Author]
Sun, Hong [Author]
Zheng, Gang [Author]
Laboratoire d'Automatique, Génie Informatique et Signal [LAGIS]
Non-Asymptotic estimation for online systems [NON-A]
Barbot, Jean-Pierre [Author]
Non-Asymptotic estimation for online systems [NON-A]
Électronique et Commande des Systèmes Laboratoire [ECS-Lab]
Journal title :
Signal Processing
Pages :
259–271
Publisher :
Elsevier
Publication date :
2015-03-30
ISSN :
0165-1684
HAL domain(s) :
Computer Science [cs]/Signal and Image Processing [eess.SP]
English abstract : [en]
In the framework of Compressive Sensing (CS), the inherent structures underlying sparsity patterns can be exploited to improve reconstruction accuracy and robustness. This consideration leads to an extension of CS called model based CS. In this paper, we propose a general statistical framework for model based CS in which both sparsity and structure priors are considered simultaneously. By exploiting Latent Variable Analysis (LVA), a sparse signal is split into weight variables representing the values of elements and latent variables indicating the labels of elements. The Gamma-Gaussian model is then used to describe the weight variables and induce sparsity, while a beta process is assumed on each of the local clusters to describe the inherent structures. Since the complete model is an extension of Bayesian CS and the beta process captures local properties, the method is called Model based Bayesian CS via Local Beta Process (MBCS-LBP). Moreover, the beta process is the Bayesian conjugate prior of the Bernoulli process, just as the Gamma distribution is conjugate to the Gaussian; this allows analytical posterior inference through a variational Bayes algorithm and hence leads to a deterministic VB-EM iterative algorithm.
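The hierarchical prior the abstract describes can be illustrated with a minimal NumPy sketch of the generative side of the model: per-cluster beta-Bernoulli latent labels capture local structure, Gamma-Gaussian weights induce sparsity, and the sparse signal is their elementwise product. The cluster sizes and hyperparameter values below are illustrative assumptions, not values from the paper, and this is a sampling sketch only, not the authors' VB-EM inference algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): 4 local clusters of 8 elements
n_clusters, cluster_size = 4, 8
a0, b0 = 1.0, 1.0   # beta hyperparameters for each local cluster
c0, d0 = 1.0, 1.0   # Gamma hyperparameters on the Gaussian precision

# Local beta process (finite sketch): one activation probability per
# cluster, shared by all elements of that cluster (the structure prior)
pi = rng.beta(a0, b0, size=n_clusters)
z = rng.binomial(1, np.repeat(pi, cluster_size))  # latent labels (0/1)

# Gamma-Gaussian weights: precisions from a Gamma prior, then Gaussian
alpha = rng.gamma(c0, 1.0 / d0, size=n_clusters * cluster_size)
w = rng.normal(0.0, 1.0 / np.sqrt(alpha))

# Sparse signal: labels gate the weights elementwise
x = z * w
```

Because both conjugate pairs (beta/Bernoulli and Gamma/Gaussian) appear explicitly here, the posterior updates for `pi`, `z`, `alpha`, and `w` all have closed forms, which is what makes the deterministic VB-EM iteration mentioned in the abstract possible.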
Language :
English
Popular science :
No