2011 Conference Record of the Forty Fifth Asilomar Conference on Signals, Systems and Computers (ASILOMAR)
DOI: 10.1109/acssc.2011.6190117

Expectation-maximization Bernoulli-Gaussian approximate message passing

Abstract: The approximate message passing (AMP) algorithm originally proposed by Donoho, Maleki, and Montanari yields a computationally attractive solution to the usual ℓ1-regularized least-squares problem faced in compressed sensing, whose solution is known to be robust to the signal distribution. When the signal is drawn i.i.d. from a marginal distribution that is not least-favorable, better performance can be attained using a Bayesian variation of AMP. The latter, however, assumes that the distribution is per…
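To make the ℓ1 setting concrete, here is a minimal NumPy sketch of the AMP iteration for ℓ1-regularized least squares, assuming an i.i.d.-Gaussian sensing matrix; the threshold rule and the tuning constant alpha are illustrative choices, not the exact schedule used in the paper.

```python
import numpy as np

def soft_threshold(r, tau):
    # Componentwise soft thresholding: the scalar denoiser induced by an l1 penalty.
    return np.sign(r) * np.maximum(np.abs(r) - tau, 0.0)

def amp_l1(y, A, n_iters=50, alpha=1.0):
    # y: (m,) measurements; A: (m, n) sensing matrix with roughly N(0, 1/m) entries.
    # alpha is an illustrative threshold-tuning constant.
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iters):
        tau = alpha * np.linalg.norm(z) / np.sqrt(m)  # estimated effective noise level
        x_new = soft_threshold(x + A.T @ z, tau)      # denoise the pseudo-data
        # Onsager correction: keeps the effective noise in the pseudo-data Gaussian.
        z = y - A @ x_new + (z / m) * np.count_nonzero(x_new)
        x = x_new
    return x
```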

Cited by 121 publications (135 citation statements); references 14 publications (21 reference statements).
“…We modify adaptive GAMP to calibrate the measurement system by learning S directly from L measurement vectors. Our work thus extends the scenario considered in [7]-[9], where S is assumed to be known.…”
Section: Relation To Prior Work
confidence: 82%
“…The algorithm relies on the Central Limit Theorem to simplify loopy belief propagation by replacing continuous-domain convolutions with matrix-vector multiplications followed by pointwise nonlinearities [6,13]. Although the basic GAMP algorithm requires perfect knowledge of S, λ, and v_w for reconstruction, it was recently extended to incorporate parameter learning [7]-[9]. In particular, a recently introduced adaptive GAMP method combines maximum-likelihood (ML) estimation with the standard GAMP updates [7].…”
Section: Calibration With Adaptive GAMP
confidence: 99%
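As a concrete illustration of such a pointwise nonlinearity, below is a small NumPy sketch of the scalar MMSE denoiser under a Bernoulli-Gaussian prior, together with the kind of EM-style parameter refit alluded to above. The variable names (lam, phi, theta, tau) are ours, and the update shown is a simplified illustration rather than the exact EM-BG-AMP recursion.

```python
import numpy as np

def bg_denoiser(r, tau, lam, phi, theta=0.0):
    # Prior: x ~ lam * N(theta, phi) + (1 - lam) * delta_0.
    # Pseudo-measurement model, as inside AMP/GAMP: r = x + N(0, tau).
    # Log-likelihoods of the "active" and "zero" mixture components.
    var_on = phi + tau
    log_on = -0.5 * (r - theta) ** 2 / var_on - 0.5 * np.log(2 * np.pi * var_on)
    log_off = -0.5 * r ** 2 / tau - 0.5 * np.log(2 * np.pi * tau)
    # Posterior probability that each coefficient is nonzero.
    pi = 1.0 / (1.0 + (1.0 - lam) / lam * np.exp(log_off - log_on))
    # Gaussian posterior moments of the active component.
    v = 1.0 / (1.0 / tau + 1.0 / phi)
    m = v * (r / tau + theta / phi)
    x_hat = pi * m                          # posterior mean (the nonlinearity output)
    x_var = pi * (v + m ** 2) - x_hat ** 2  # posterior variance
    return x_hat, x_var, pi

# A single EM-style refit of the sparsity rate from the posterior support
# probabilities, in the spirit of EM-BG-AMP (illustrative only):
#   lam_new = pi.mean()
```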
“…2) BG model: As mentioned in the introduction, the BG model (6)-(7) has already been considered in several contributions ([28], [29], [32], [33]), and under the marginal formulation (3) in [35], [36]. However, all these contributions differ from the proposed approach in the estimation problem considered and in the practical procedure introduced to solve it.…”
Section: Boltzmann Machine
confidence: 92%
“…While σ₀² can be tuned to any positive real value in the first BG model presented above, it is set to 0 in the second one. This marginal formulation is used directly in many contributions, such as [35], [36].…”
Section: A. Standard Sparse Representation Algorithms
confidence: 99%
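For completeness, a short sketch (with hypothetical parameter names) of drawing samples from the two Bernoulli-Gaussian variants discussed above: one with a small positive variance σ₀² on the inactive component, and the marginal formulation obtained by setting σ₀² = 0.

```python
import numpy as np

def sample_bg(n, lam, sigma1_sq, sigma0_sq=0.0, seed=0):
    # Bernoulli-Gaussian draw: each coefficient is "active" with probability lam.
    rng = np.random.default_rng(seed)
    active = rng.random(n) < lam
    big = rng.normal(0.0, np.sqrt(sigma1_sq), n)     # active component
    small = (rng.normal(0.0, np.sqrt(sigma0_sq), n)  # inactive component
             if sigma0_sq > 0 else np.zeros(n))      # sigma0_sq = 0: exactly sparse
    return np.where(active, big, small)

# First variant: inactive coefficients have small but positive variance.
x_soft = sample_bg(1000, lam=0.1, sigma1_sq=1.0, sigma0_sq=1e-4)
# Marginal formulation: sigma0_sq = 0 gives exactly sparse signals.
x_sparse = sample_bg(1000, lam=0.1, sigma1_sq=1.0, sigma0_sq=0.0)
```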