2022
DOI: 10.1109/tsp.2022.3160004
Low-Complexity ADMM-Based Algorithm for Robust Multi-Group Multicast Beamforming in Large-Scale Systems

Cited by 9 publications (4 citation statements) · References 39 publications
“…Theoretical and experimental data indicate that the matrix obtained in Equation (10) still requires diagonal processing to achieve normal noise levels, with the loading amount being the average power of the noise. The specific expression is given in Equation (11), where the loading term is the estimated noise power, which can be equivalently replaced by 10 times the minimum eigenvalue of the data covariance matrix.…”
Section: The Proposed Beamforming Methods (mentioning, confidence: 99%)
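The diagonal-loading rule described in that excerpt — adding a loading term to the sample covariance matrix, with the noise power approximated as 10 times its smallest eigenvalue — can be sketched as follows. This is an illustrative NumPy example, not the cited paper's implementation; the array size, snapshot count, and steering vector `a` are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: N complex snapshots from an M-sensor array.
M, N = 8, 200
snapshots = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
R = snapshots @ snapshots.conj().T / N  # sample covariance matrix

# Diagonal loading: estimate the noise power as 10x the smallest
# eigenvalue of the sample covariance, as the quoted passage suggests.
eigvals = np.linalg.eigvalsh(R)
sigma2_hat = 10.0 * eigvals.min()
R_loaded = R + sigma2_hat * np.eye(M)

# MVDR-style beamformer weights for a (hypothetical) steering vector a.
a = np.ones(M, dtype=complex)
w = np.linalg.solve(R_loaded, a)
w /= a.conj() @ w  # enforce the distortionless constraint w^H a = 1
```

The loading term regularizes the inversion: it bounds the condition number of `R_loaded`, which is what makes the resulting weights robust to covariance estimation errors in non-ideal scenarios.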
“…Therefore, improving the robustness of adaptive beamforming algorithms in non-ideal scenarios is a major research direction [10][11][12].…”
Section: Introduction (mentioning, confidence: 99%)
“…where S_0 = B • S. This is a nonconvex function and cannot be solved directly. Therefore, we adopt the alternating direction method of multipliers (ADMM) [19] to solve the optimization problem. First, an auxiliary variable U is introduced to transform the retinex energy functional into a convex optimization problem.…”
Section: Proposed Model and Optimization (mentioning, confidence: 99%)
“…Typical solutions to distributed optimization problems, which have been widely investigated recently, can be categorized into three classes in general, namely, the alternating direction method of multipliers (ADMM) [3], distributed primal-dual methods [4], and distributed subgradient methods (DSG). ADMM is an algorithm that solves convex optimization problems by dividing them into smaller subproblems.…”
Section: Introduction (mentioning, confidence: 99%)
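The splitting idea mentioned in these excerpts — introducing an auxiliary variable so each ADMM iteration solves two small subproblems plus a dual update — can be illustrated on a standard lasso problem. This is a generic textbook-style sketch, not the algorithm of the cited paper; the problem instance and parameters (`lam`, `rho`, iteration count) are illustrative choices.

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of k*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, iters=200):
    """Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 via ADMM.

    The objective is split as f(x) + g(z) with the constraint x = z,
    so each iteration reduces to two easy subproblems and a dual step.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # The x-update is a fixed ridge-type linear system.
    AtA = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(AtA, Atb + rho * (z - u))  # x-subproblem (quadratic)
        z = soft_threshold(x + u, lam / rho)           # z-subproblem (prox of l1)
        u = u + x - z                                  # dual ascent step
    return z

# Small demo: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[[2, 7, 15]] = [1.5, -2.0, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = admm_lasso(A, b, lam=0.5)
```

The same pattern underlies the retinex formulation quoted above: the auxiliary variable decouples the nonsmooth (or nonconvex) term from the smooth one, so each subproblem has a closed-form or cheap update.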