2014
DOI: 10.1109/tsp.2014.2357773
Bilinear Generalized Approximate Message Passing—Part II: Applications

Abstract: In this paper, we extend the generalized approximate message passing (G-AMP) approach, originally proposed for high-dimensional generalized-linear regression in the context of compressive sensing, to the generalized-bilinear case. In Part I of this two-part paper, we derived our Bilinear G-AMP (BiG-AMP) algorithm as an approximation of the sum-product belief propagation algorithm in the high-dimensional limit, and proposed an adaptive damping mechanism that aids convergence under finite problem sizes, …
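The generalized-bilinear setting described in the abstract can be pictured with a small toy model: two unknown factor matrices whose product is observed entrywise through a noisy, partial observation channel (the matrix-completion special case). This is a minimal sketch, not the authors' code; all dimensions and variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy generalized-bilinear model: Z = A @ X with A (M x R), X (R x N),
# observed entrywise through a "generalized-linear" channel.
M, N, R = 50, 60, 3
A_true = rng.standard_normal((M, R))
X_true = rng.standard_normal((R, N))
Z = A_true @ X_true

# Example channel: AWGN on a random 40% subset of entries
# (the matrix-completion observation model).
mask = rng.random((M, N)) < 0.4
Y = np.where(mask, Z + 0.01 * rng.standard_normal((M, N)), 0.0)
```

The inference task is then to recover `A_true` and `X_true` (up to the usual scaling/permutation ambiguity) from `Y` and `mask` alone.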

Cited by 70 publications (58 citation statements)
References 19 publications
“…In addition, we proposed an adaptive damping mechanism to aid convergence under realistic problem sizes, an expectation-maximization (EM)-based method to automatically tune the parameters of the assumed priors, and two rank-selection strategies. In Part II [1] of this two-part work, we detail the application of BiG-AMP to matrix completion, robust PCA, and dictionary learning, and we present the results of an extensive numerical investigation into the performance of BiG-AMP on both synthetic and real-world datasets. The results in [1] demonstrate that BiG-AMP yields excellent reconstruction…” (Footnote 10: “In some cases the singular values of … could be used instead.”)
Section: Discussion
Confidence: 99%
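Of the applications the excerpt lists, matrix completion is the simplest to illustrate. The sketch below uses plain alternating least squares (ALS) as a stand-in baseline — it is emphatically not BiG-AMP, which instead propagates approximate posteriors on both factors — but it shows the same bilinear recovery problem being solved. Dimensions, the rank, and the ridge term `lam` are illustrative choices of ours.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground truth: rank-2 matrix observed on ~50% of entries (noiseless).
M, N, R = 40, 50, 2
Z = rng.standard_normal((M, R)) @ rng.standard_normal((R, N))
mask = rng.random((M, N)) < 0.5
Y = np.where(mask, Z, 0.0)

# Alternating least squares: fix X, solve for A row-by-row on observed
# entries; then fix A, solve for X column-by-column. lam regularizes.
A = rng.standard_normal((M, R))
X = rng.standard_normal((R, N))
lam = 1e-6
for _ in range(50):
    for i in range(M):
        obs = mask[i]
        Xi = X[:, obs]
        A[i] = np.linalg.solve(Xi @ Xi.T + lam * np.eye(R), Xi @ Y[i, obs])
    for j in range(N):
        obs = mask[:, j]
        Aj = A[obs]
        X[:, j] = np.linalg.solve(Aj.T @ Aj + lam * np.eye(R), Aj.T @ Y[obs, j])

# Relative error on the observed entries.
err = np.linalg.norm(mask * (A @ X - Z)) / np.linalg.norm(mask * Z)
```

ALS converges monotonically on this objective; BiG-AMP's advantage reported in [1] is that it additionally handles generalized (e.g., quantized or sparse-outlier) observation channels and uncertain rank.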
“…In high-dimensional inference problems, exact implementation of the SPA is impractical, motivating approximations of the SPA … [33] to solve the generalized CS problem, which exploits the “blessings of dimensionality” that arise when … is sufficiently large and dense, and which was rigorously analyzed in [35].” (Footnote 1: “Another worthwhile objective could be to compute the joint MAP estimate …; we leave this to future work.”)
Section: B. Loopy Belief Propagation
Confidence: 99%
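The excerpt refers to AMP-style approximations of the sum-product algorithm for the (non-bilinear) compressed-sensing problem. A minimal flavor of that family is classic AMP with a soft-thresholding denoiser and an Onsager correction term — shown below as an illustrative sketch under assumed problem sizes, not the G-AMP algorithm of [33] or anything from this paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sparse recovery setup: y = A @ x_true, x_true is K-sparse.
M, N, K = 250, 500, 10
A = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = A @ x_true

def soft(v, t):
    """Soft-thresholding denoiser eta(v; t)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(N)
z = y.copy()
for _ in range(30):
    tau = np.sqrt(np.mean(z ** 2))        # empirical effective-noise level
    x_new = soft(x + A.T @ z, tau)        # scalar denoising step
    onsager = (z / M) * np.count_nonzero(x_new)  # Onsager correction
    z = y - A @ x_new + onsager           # corrected residual
    x = x_new
```

The Onsager term is what distinguishes AMP from plain iterative thresholding: it keeps the effective noise in `x + A.T @ z` approximately Gaussian across iterations, which is what makes the simple scalar denoiser accurate in high dimensions.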