Efficient Sampling of Bernoulli-Gaussian-Mixtures for Sparse Signal Restoration
Published in: IEEE Transactions on Signal Processing, Vol. 70, pp. 5578-5591
Main Authors:
Format: Journal Article
Language: English
Published: New York: IEEE, 01-01-2022
Summary: This paper introduces a new family of prior models called Bernoulli-Gaussian-Mixtures (BGM), with a view to efficiently addressing sparse linear inverse problems and sparse linear regression in the Bayesian framework. The BGM family is based on continuous Location and Scale Mixtures of Gaussians (LSMG), a class that includes a wide range of symmetric and asymmetric heavy-tailed probability distributions. Particular attention is paid to the decomposition of probability laws as Gaussian mixtures, from which we systematically derive a Partially Collapsed Gibbs Sampler (PCGS) for BGM models. The PCGS is shown to be more efficient than the standard Gibbs sampler, both in the number of iterations and in CPU time. Moreover, special attention is paid to BGM priors whose density is defined over a real half-line. An asymptotically exact LSMG approximation is introduced, which extends the applicability of the PCGS to cases such as BGM models with non-negative support.
ISSN: 1053-587X, 1941-0476
DOI: 10.1109/TSP.2022.3223775
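
As background for the summary above, here is a minimal sketch of the standard single-site Gibbs sampler for a plain Bernoulli-Gaussian prior, the baseline that the paper's PCGS is reported to outperform. This is not the paper's algorithm: the model setup (y = Hx + n with x_i = q_i * a_i), the function name `bg_gibbs`, and all parameter values are illustrative assumptions chosen for this sketch.

```python
# Minimal sketch (assumed baseline, NOT the paper's PCGS): single-site
# Gibbs sampler for a Bernoulli-Gaussian prior in the model
#   y = H x + n,  x_i = q_i * a_i,  q_i ~ Bernoulli(lam),
#   a_i ~ N(0, sigma2_x),  n ~ N(0, sigma2 * I).
import numpy as np

rng = np.random.default_rng(0)

def bg_gibbs(y, H, lam=0.1, sigma2=0.01, sigma2_x=1.0, n_iter=500):
    n, p = H.shape
    x = np.zeros(p)
    r = y - H @ x                        # current residual y - H x
    col_norm2 = np.sum(H**2, axis=0)     # ||h_i||^2 for each column
    samples = np.zeros((n_iter, p))
    for t in range(n_iter):
        for i in range(p):
            h = H[:, i]
            r_i = r + h * x[i]           # residual with x_i excluded
            v = 1.0 / (col_norm2[i] / sigma2 + 1.0 / sigma2_x)  # posterior variance
            mu = v * (h @ r_i) / sigma2                         # posterior mean
            # Log posterior odds of the support indicator q_i = 1 vs q_i = 0,
            # obtained by integrating the amplitude a_i out analytically.
            log_odds = (np.log(lam / (1.0 - lam))
                        + 0.5 * np.log(v / sigma2_x)
                        + 0.5 * mu**2 / v)
            p1 = np.exp(-np.logaddexp(0.0, -log_odds))          # stable sigmoid
            if rng.random() < p1:
                x[i] = mu + np.sqrt(v) * rng.standard_normal()  # q_i = 1
            else:
                x[i] = 0.0                                      # q_i = 0
            r = r_i - h * x[i]           # restore full residual
        samples[t] = x
    return samples

# Illustrative use on synthetic data (all sizes are arbitrary choices).
n, p, k = 50, 100, 5
H = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[rng.choice(p, size=k, replace=False)] = rng.standard_normal(k)
y = H @ x_true + 0.1 * rng.standard_normal(n)
xs = bg_gibbs(y, H)
print("error of posterior-mean estimate:", np.linalg.norm(xs[250:].mean(axis=0) - x_true))
```

Note that sampling (q_i, x_i) jointly, with the amplitude integrated out of the indicator update, is itself a small instance of partial collapsing; the paper's PCGS develops this idea systematically for the whole BGM/LSMG family, including the non-negative-support case.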