2017
DOI: 10.1109/tsp.2017.2713759

Hybrid Approximate Message Passing

Abstract: Gaussian and quadratic approximations of message passing algorithms on graphs have attracted considerable recent attention due to their computational simplicity, analytic tractability, and wide applicability in optimization and statistical inference problems. This paper presents a systematic framework for incorporating such approximate message passing (AMP) methods in general graphical models. The key concept is a partition of dependencies of a general graphical model into strong and weak edges, with …

Cited by 43 publications (40 citation statements). References 53 publications.
“…estimand c and separable likelihood p(y|z) = ∏_{m=1}^{M} p(y_m | z_m). Thus, Hybrid GAMP (HyGAMP) [18] was developed to tackle problems with a structured prior and/or likelihood. HyGAMP could be applied to the compressive learning problem described in Section II-A, but it would require computing and inverting O(N+M) covariance matrices of dimension K at each iteration.…”
Section: B. Approximate Message Passing
Mentioning confidence: 99%
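The cost objection raised in this citation statement can be made concrete with a minimal sketch: a single HyGAMP-style iteration must invert on the order of N+M covariance matrices, each of dimension K, giving O((N+M)K³) flops per iteration. The sizes, names, and random covariances below are purely illustrative assumptions, not quantities from the paper.

```python
import numpy as np

# Illustrative sketch (hypothetical sizes): the per-iteration burden of
# inverting O(N + M) covariance matrices, each K x K.
rng = np.random.default_rng(0)
N, M, K = 50, 30, 4

# One K x K symmetric positive-definite covariance per node (N + M total).
covs = []
for _ in range(N + M):
    A = rng.standard_normal((K, K))
    covs.append(A @ A.T + np.eye(K))  # SPD by construction

# A single iteration inverts every covariance: O((N + M) * K^3) flops.
invs = [np.linalg.inv(C) for C in covs]
```

Each inversion is cheap for small K, but the N+M factor is what the citing authors flag as the bottleneck for large problems.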
“…Fortunately, it is known that a basis is short and nearly orthogonal after lattice reduction, which means its column-wise dependency is small. Moreover, a reduced basis in VP often has "small" entries (in the sense of [35]) such that the approximations in AMP are valid. We further justified the two arguments above in Appendix A.…”
Section: B. Prerequisites for AMP
Mentioning confidence: 99%
“…estimand c and separable likelihood p(y|z) = ∏_{m=1}^{M} p(y_m | z_m). Thus, Hybrid GAMP (HyGAMP) [13] was developed to tackle problems with a structured prior and/or likelihood. HyGAMP could be applied to (10)-(11), but it requires computing and inverting O(N+M) covariance matrices of dimension K at each iteration.…”
Section: B. Approximate Message Passing
Mentioning confidence: 99%
“…Require: measurements y, matrix W with ‖W‖_F² = M, pdfs p_{c|r} and p_{z|y,p} from (12)-(13), initializations Ĉ = E{C}, q_n^c = diag(cov{c_n}). Ensure: S ← 0.…”
Section: Algorithm 1, SHyGAMP
Mentioning confidence: 99%
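The quoted SHyGAMP initialization can be sketched in a few lines: normalize W so that ‖W‖_F² = M, set the mean estimate to E{C} and the per-row variances to diag(cov{c_n}), and zero the S matrix. This is a hedged illustration only; the prior-sampling approach and every variable name below are assumptions standing in for the paper's actual quantities and pdfs.

```python
import numpy as np

# Hypothetical sketch of the quoted Algorithm 1 (SHyGAMP) initialization.
rng = np.random.default_rng(1)
N, K, M = 20, 3, 15

# Enforce the stated normalization ||W||_F^2 = M.
W = rng.standard_normal((M, N))
W *= np.sqrt(M) / np.linalg.norm(W, "fro")

# Stand-in for the prior on C: estimate E{C} and diag(cov{c_n}) by
# Monte Carlo (the paper would use the prior's moments directly).
prior_samples = rng.standard_normal((10000, N, K))
C_hat = prior_samples.mean(axis=0)   # C_hat = E{C}, shape (N, K)
q_c = prior_samples.var(axis=0)      # q_n^c = diag(cov{c_n}), shape (N, K)

S = np.zeros((M, K))                 # "Ensure: S <- 0"
```

The Frobenius normalization keeps the measurement operator's average column energy fixed, which is the scaling AMP-style state evolution typically assumes.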