ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2019.8683082
Accelerating Iterative Hard Thresholding for Low-rank Matrix Completion via Adaptive Restart

Cited by 12 publications (13 citation statements) | References 15 publications
“…Assume the same setting as in Theorem 3. As m → ∞, the linear rate ρ defined in (8) converges almost surely to ρ∞ defined in (16).…”
Section: Proposed Estimation of the Linear Rate ρ (mentioning, confidence 99%)
“…On the one hand, interior-point methods for solving the nuclear norm minimization problem are computationally expensive and even infeasible for large matrices. On the other hand, proximal-type algorithms suffer from slow convergence due to the conservative nature of the soft-thresholding operator [15], [16].…”
Section: Introduction (mentioning, confidence 99%)
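The slow convergence attributed to the soft-thresholding operator can be seen directly from its definition: it shrinks every singular value by the same amount, including the dominant ones that carry the low-rank signal. Below is a minimal numpy sketch of the singular value soft-thresholding operator (the proximal operator of the nuclear norm); the function name and parameters are illustrative, not taken from the cited works.

```python
import numpy as np

def svt(X, tau):
    """Singular value soft-thresholding: prox of tau * (nuclear norm).

    Every singular value is reduced by tau, which is the "conservative"
    behaviour the quoted passage refers to: even the leading singular
    values are shrunk at each iteration, slowing progress.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)   # soft-threshold the spectrum
    return (U * s_shrunk) @ Vt
```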
“…When the size of the matrix grows rapidly, storing and optimizing over a matrix variable become computationally expensive and even infeasible. In addition, it is evident this approach suffers from slow convergence [9,10]. In the second approach, the original rank-constrained optimization is studied.…”
Section: Introduction (mentioning, confidence 99%)
“…Thus, basic optimization algorithms such as gradient descent [12,14,15] and alternating minimization [16][17][18][19] can provably solve matrix completion under a specific sampling regime. Alternatively, the original rank-constrained optimization problem can be solved without the aforementioned reparameterization via the truncated singular value decomposition [10,[20][21][22][23][24][25].…”
Section: Introduction (mentioning, confidence 99%)
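The truncated-SVD approach referenced in the last passage is iterative hard thresholding (IHT), the method the cited paper accelerates. For orientation only, here is a minimal numpy sketch of vanilla IHT for matrix completion: a gradient step on the observed entries followed by a rank-r projection via truncated SVD. The function names, step size, and iteration count are illustrative assumptions, and the adaptive-restart acceleration proposed in the paper is not included.

```python
import numpy as np

def hard_threshold(X, r):
    """Project X onto the set of rank-r matrices via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def iht_matrix_completion(M_obs, mask, r, step=1.0, n_iters=500):
    """Vanilla iterative hard thresholding for matrix completion.

    M_obs : observed matrix with zeros at unobserved positions
    mask  : boolean array marking observed entries
    r     : target rank
    """
    X = np.zeros_like(M_obs, dtype=float)
    for _ in range(n_iters):
        grad = mask * (X - M_obs)              # gradient of 0.5 * ||P_Omega(X - M)||_F^2
        X = hard_threshold(X - step * grad, r) # rank-r projection
    return X
```

Each iteration keeps the iterate exactly rank r, in contrast to the soft-thresholding sketch above, which only shrinks the spectrum; the cited paper's contribution is to speed up this kind of iteration with an adaptive restart scheme.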