2013
DOI: 10.1137/110859646

Low-Rank Optimization with Trace Norm Penalty

Abstract: The paper addresses the problem of low-rank trace norm minimization. We propose an algorithm that alternates between fixed-rank optimization and rank-one updates. The fixed-rank optimization is characterized by an efficient factorization that makes the trace norm differentiable in the search space and the computation of the duality gap numerically tractable. The search space is nonlinear but is equipped with a particular Riemannian structure that leads to efficient computations. We present a second-order trust-region…
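The alternation the abstract describes can be sketched in a few lines: a variational factorization X = G H^T makes the trace norm smooth at fixed rank, and the top singular pair of -grad f(X) both certifies optimality (a duality-gap-style test) and supplies the rank-one update. The sketch below is a plain gradient-descent stand-in, not the paper's Riemannian trust-region method; all names, step sizes, and tolerances are illustrative.

```python
import numpy as np

# Minimal sketch of: minimize f(X) + lam * ||X||_* by alternating
# (a) fixed-rank descent on X = G H^T, using the variational identity
#     ||X||_* = min over X = G H^T of (||G||_F^2 + ||H||_F^2) / 2, and
# (b) a rank-one update along the top singular pair of -grad f(X)
#     whenever sigma_max(-grad f(X)) > lam, i.e. the dual test fails.
# Plain gradient descent here, NOT the paper's Riemannian trust-region
# method; step size and iteration counts are illustrative.

def fixed_rank_descent(grad_f, G, H, lam, steps=500, lr=0.05):
    for _ in range(steps):
        S = grad_f(G @ H.T)
        dG = S @ H + lam * G        # d/dG [f(G H^T) + lam/2 ||G||_F^2]
        dH = S.T @ G + lam * H      # d/dH [f(G H^T) + lam/2 ||H||_F^2]
        G -= lr * dG
        H -= lr * dH
    return G, H

def trace_norm_min(grad_f, m, n, lam, max_rank=10):
    G, H = np.zeros((m, 1)), np.zeros((n, 1))
    for _ in range(max_rank):
        G, H = fixed_rank_descent(grad_f, G, H, lam)
        S = grad_f(G @ H.T)
        u, s, vt = np.linalg.svd(-S)
        if s[0] <= lam * (1 + 1e-6):            # dual certificate holds
            break
        a = np.sqrt(s[0])                       # rank-one update
        G = np.hstack([G, 0.1 * a * u[:, :1]])
        H = np.hstack([H, 0.1 * a * vt[:1].T])
    return G @ H.T
```

For f(X) = ½||X - M||_F² the global minimizer is singular-value soft-thresholding of M, which gives a quick sanity check of such a sketch.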


Cited by 104 publications (126 citation statements)
References 24 publications
“…Several techniques have been proposed to address (1), or more specific instances thereof, by exploiting the fact that M_r is a submanifold of the Euclidean space R^(m×n); see, e.g., [MMBS13a]. The set M_r is known to be a submanifold of dimension (m+n-r)r embedded in the Euclidean space R^(m×n) [Lee03, Example 8.14]. The low-rank optimization problem (1) is thus in the field of play of Riemannian optimization; see, e.g., [AMS08].…”
Section: Introduction (mentioning)
confidence: 99%
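The dimension count (m+n-r)r quoted in the citation above can be checked numerically: with X = G H^T, the differential (dG, dH) -> dG H^T + G dH^T spans the tangent space, and its rank at a generic point is mr + nr - r^2 = (m+n-r)r (the r^2 deficit comes from the GL(r) reparametrization X = (GA)(H A^{-T})^T). A small sketch with illustrative sizes:

```python
import numpy as np

# Numerically estimate the dimension of the manifold of rank-r m x n
# matrices by measuring the rank of the differential of (G, H) -> G H^T
# at a generic point; illustrative sizes, not tied to any application.
rng = np.random.default_rng(0)
m, n, r = 5, 4, 2
G = rng.standard_normal((m, r))
H = rng.standard_normal((n, r))

# Assemble the differential (dG, dH) -> dG H^T + G dH^T as a matrix
# acting on the (m*r + n*r)-dimensional perturbation space.
cols = []
for i in range(m * r):
    dG = np.zeros((m, r)); dG.flat[i] = 1.0
    cols.append((dG @ H.T).ravel())
for j in range(n * r):
    dH = np.zeros((n, r)); dH.flat[j] = 1.0
    cols.append((G @ dH.T).ravel())
J = np.column_stack(cols)

dim = np.linalg.matrix_rank(J)
print(dim, (m + n - r) * r)  # 14 14
```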
“…The authors propose an efficient SVD-based initial guess for U and V, which they refine using a Riemannian steepest-descent method [2], along with strong theoretical guarantees. Ngo and Saad [33] exploit this idea further by applying a Riemannian conjugate gradient method to this formulation. They endow the Grassmannians with a preconditioned metric in order to better capture the conditioning of low-rank matrix completion, with excellent results.…”
Section: Related Work (mentioning)
confidence: 99%
“…Mishra et al [32] propose another geometric approach. They address the problem of low-rank trace norm minimization and propose an algorithm that alternates between fixed-rank optimization and rank-one updates, with applications to low-rank matrix completion.…”
Section: Related Work (mentioning)
confidence: 99%