2021
DOI: 10.48550/arxiv.2110.06069
Preprint
Generalized Memory Approximate Message Passing

Abstract: Generalized approximate message passing (GAMP) is a promising technique for unknown signal reconstruction in generalized linear models (GLMs). However, it requires the transformation matrix to have independent and identically distributed (IID) entries. In this context, generalized vector AMP (GVAMP) was proposed for general unitarily-invariant transformation matrices, but it involves a high-complexity matrix inverse. To this end, we propose a universal generalized memory AMP (GMAMP) framework including the existing o…

Cited by 3 publications (7 citation statements)
References 25 publications
“…In addition, the CSI acquisition can be further jointly considered in the receiver. Last but not least, to achieve lower implementation complexity, CAMP [28] and MAMP [37], [38] can be redesigned for the coded GMU-MIMO systems instead of MU-OAMP/VAMP. APPENDIX A PROOF OF (13) IN LEMMA 2 In this appendix, we first review OAMP/VAMP in the uncoded linear vector system.…”
Section: Discussion
confidence: 99%
“…2) Other related low-complexity AMP-type algorithms: Recently, to avoid the high-complexity LMMSE in OAMP/VAMP/EP, low-complexity Bayes-optimal convolutional AMP (CAMP) [36], memory AMP (MAMP) [37], and generalized MAMP (GMAMP) [38] were proposed for unitarily-invariant matrices with arbitrary input distributions. Therefore, CAMP, MAMP, and GMAMP may be good lower-complexity candidates for the proposed framework in this paper.…”
Section: E Connection To Existing Work
confidence: 99%
“…Similar to OAMP/VAMP, GVAMP has a high computational complexity. To this end, a generalized MAMP (GMAMP) was proposed for GLMs [27]. The complexity of GMAMP is comparable to that of GAMP.…”
Section: A Background
confidence: 99%
“…First, the CAMP [19], long-memory AMP [20], [21] and RI-AMP [28] consist of NLEs and an MF with Onsager correction terms, a structure similar to that of AMP [3] or GAMP [24]. Second, the MAMP [22], [23] and GMAMP [27] consist of orthogonal NLEs and an orthogonal long-memory MF, a structure similar to that of OAMP/VAMP [12], [13] or GVAMP [25].…”
Section: A Background
confidence: 99%
“…Compared to GAMP, GLM-VAMP can be applied to more general random matrices, but at the cost of higher computational complexity. Similar to GLM-VAMP, a generalized version of MAMP was proposed in [42]. Beyond the single-layer model, some extensions of AMP to multi-layer models can be found in [43], [44], [45], [46].…”
Section: Introduction
confidence: 99%