2021
DOI: 10.48550/arxiv.2109.11905
Preprint
Graph-based Approximate Message Passing Iterations

Abstract: Approximate message passing (AMP) algorithms have become an important element of high-dimensional statistical inference, mostly due to their adaptability and concentration properties, the state evolution (SE) equations. This is demonstrated by the growing number of new iterations proposed for increasingly complex problems, ranging from multi-layer inference to low-rank matrix estimation with elaborate priors. In this paper, we address the following questions: is there a structure underlying all AMP iterations t…

Cited by 11 publications (33 citation statements)
References 20 publications
“…While the state evolution of this joint optimisation program could be directly proven as well (it is a generic consequence of the recent extension of state evolution theorems for AMP in [20]), we follow a slightly different route for the proof, leveraging instead the results of [26,47,9,30].…”
Section: Main Technical Results
confidence: 99%
“…We investigate the relations between the respective uncertainties of the oracle, Bayes and regularized logistic regression. We see this as a first grounding step for a line of future work that will leverage recent extensions of the GAMP algorithm and its associated analysis to multi-layer neural networks [4,20], learning with random features and kernels [34,18,15], estimation under generative priors [5,2], classification on more realistic models of data [21,22,43], etc. In future work we will also study and evaluate more advanced uncertainty estimators, some of those listed in the introduction, in settings where the present methodology gives us asymptotically exact access to the true Bayesian uncertainty.…”
Section: Discussion
confidence: 99%
“…The behavior of AMP in the high-dimensional limit is tracked by the state evolution equations. The convergence of AMP iterates to the state evolution predictions has been proved under various assumptions (see [20][21][22][23][24][25][26]). Furthermore, AMP has been successful as a near-optimal decoder for sparse superposition codes [25,[27][28][29][30].…”
Section: Approximate Message Passing (AMP)
confidence: 99%
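The citing passage above summarizes the standard picture: in the high-dimensional limit, the per-coordinate error of an AMP iteration is tracked by a scalar state evolution recursion. A minimal Monte Carlo sketch of such a recursion, for AMP with a soft-thresholding denoiser on a Bernoulli-Gaussian signal (this toy model and all parameter values are illustrative assumptions, not taken from the paper under discussion):

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding denoiser eta(x; t) = sign(x) * max(|x| - t, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def state_evolution(sigma2=0.01, delta=0.5, eps=0.1, alpha=1.5,
                    n_iter=20, n_mc=100_000, seed=0):
    """Monte Carlo state-evolution recursion (hypothetical toy setup):
        tau2_{t+1} = sigma2 + (1/delta) * E[(eta(X + tau_t Z; alpha*tau_t) - X)^2]
    with X a Bernoulli-Gaussian signal (nonzero w.p. eps), Z standard
    Gaussian, delta the measurement ratio, sigma2 the noise variance.
    Returns the trajectory of effective-noise variances tau2_t.
    """
    rng = np.random.default_rng(seed)
    # Signal and effective-noise samples, reused across iterations so the
    # recursion is a deterministic map on the empirical distribution.
    x = rng.standard_normal(n_mc) * (rng.random(n_mc) < eps)
    z = rng.standard_normal(n_mc)
    tau2 = sigma2 + np.mean(x ** 2) / delta  # standard initialisation
    history = [float(tau2)]
    for _ in range(n_iter):
        tau = np.sqrt(tau2)
        mse = np.mean((soft_threshold(x + tau * z, alpha * tau) - x) ** 2)
        tau2 = sigma2 + mse / delta
        history.append(float(tau2))
    return history
```

At a fixed point, `tau2` predicts the asymptotic per-coordinate error of the corresponding AMP iteration; sweeping `delta` or `eps` in this sketch reproduces the familiar phase-transition picture for sparse estimation.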