2012 46th Annual Conference on Information Sciences and Systems (CISS) 2012
DOI: 10.1109/ciss.2012.6310932
Expectation-maximization Gaussian-mixture approximate message passing

Cited by 130 publications (304 citation statements) · References 15 publications
“…2. Using simple message passing rules accelerated by generalized approximate message passing (GAMP) [10], [9], we can perform fast approximate inference to obtain a minimum mean squared error (MMSE) estimate of the complex reflectivity of the scene. Movers then can be directly inferred from the posterior probabilities on x i and c i produced by the iterative message passing.…”
Section: Knowledge-Aided GMTI in a Bayesian Framework (mentioning)
Confidence: 99%
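The statement above describes fast approximate MMSE inference via (G)AMP message passing. As a rough illustration of the idea, here is a minimal sketch of scalar AMP with an MMSE denoiser for a Bernoulli-Gaussian (two-component Gaussian-mixture) prior; all problem dimensions, hyperparameter values, and variable names are illustrative assumptions, not taken from the cited papers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy problem: y = A x + w, with x sparse (Bernoulli-Gaussian prior).
n, m = 400, 200                     # signal length, number of measurements
lam, sig_x2, sig_w2 = 0.1, 1.0, 1e-4  # sparsity rate, slab variance, noise variance
A = rng.normal(0.0, 1.0 / np.sqrt(m), (m, n))
x_true = (rng.random(n) < lam) * rng.normal(0.0, np.sqrt(sig_x2), n)
y = A @ x_true + rng.normal(0.0, np.sqrt(sig_w2), m)

def denoise(r, tau2):
    """MMSE denoiser for a Bernoulli-Gaussian prior given r = x + N(0, tau2).

    Returns the posterior mean and posterior variance of each coefficient.
    """
    s2 = sig_x2 + tau2
    # Posterior probability that each coefficient is nonzero (spike vs. slab).
    num = lam * np.exp(-r**2 / (2 * s2)) / np.sqrt(s2)
    den = num + (1 - lam) * np.exp(-r**2 / (2 * tau2)) / np.sqrt(tau2)
    pi = num / den
    gain = sig_x2 / s2
    mean = pi * gain * r
    var = pi * (gain * tau2 + (gain * r)**2) - mean**2
    return mean, var

# AMP iterations with the Onsager correction term.
x, z = np.zeros(n), y.copy()
tau2 = sig_w2 + lam * sig_x2 * n / m   # initial effective noise variance
for _ in range(30):
    r = x + A.T @ z
    x, v = denoise(r, tau2)
    # Onsager term: for an MMSE denoiser, eta'(r) = Var(x|r) / tau2.
    z = y - A @ x + z * np.sum(v) / (m * tau2)
    tau2 = sig_w2 + np.sum(v) / m

print(np.mean((x - x_true)**2))  # reconstruction MSE, far below the prior variance
```

The `pi` values computed inside the denoiser are the per-coefficient posterior support probabilities, which is how posterior probabilities on the signal coefficients can be read off directly from the message passing, as the quoted statement notes.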
“…The p(c i ) incorporates the prior information derived from the DEM. Finally, the model parameters are automatically tuned in the algorithm using an expectation-maximization procedure [9].…”
Section: Knowledge-Aided GMTI in a Bayesian Framework (mentioning)
Confidence: 99%
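The statement above refers to automatic tuning of the model parameters by expectation-maximization. A hedged sketch of one such M-step, assuming a Bernoulli-Gaussian prior: the sparsity rate is re-estimated as the mean posterior support probability, and the slab variance as a posterior-weighted second moment (the arrays below are hypothetical values standing in for one AMP pass's outputs):

```python
import numpy as np

# Hypothetical per-coefficient posterior quantities from one AMP pass:
pi = np.array([0.95, 0.02, 0.01, 0.88, 0.03])    # posterior support probabilities
mean = np.array([1.1, 0.0, 0.0, -0.9, 0.0])      # posterior means E[x_i | r_i]
var = np.array([0.05, 0.01, 0.01, 0.06, 0.01])   # posterior variances Var(x_i | r_i)

# EM M-step for the activity rate: mean posterior support probability.
lam_new = pi.mean()

# EM M-step for the slab variance: posterior-weighted second moment of the
# active coefficients (a common update for spike-and-slab mixture priors).
sig_x2_new = np.sum(pi * (mean**2 + var)) / np.sum(pi)

print(lam_new)    # ≈ 0.378
print(sig_x2_new)
```

Iterating such M-steps between AMP passes lets the algorithm learn the prior parameters from the data rather than requiring them in advance.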
“…While being highly effective in general, both [20] and [21] have limitations. For example, the mixture using normal components in [20] is known to be sensitive to outliers, and the performance degrades with smaller sample size [23].…”
Section: Introduction (mentioning)
Confidence: 97%
“…For example, the mixture using normal components in [20] is known to be sensitive to outliers, and the performance degrades with smaller sample size [23]. Meanwhile, the work [21] is designed exclusively for nonnegative signals, and is not capable in handling signals with both positive and negative significant elements.…”
Section: Introduction (mentioning)
Confidence: 98%