2020
DOI: 10.1007/s10915-020-01173-5
A Riemannian Optimization Approach for Solving the Generalized Eigenvalue Problem for Nonsquare Matrix Pencils

Cited by 13 publications (15 citation statements). References 18 publications.
“…and where M_V and P̄ are defined as in (19). Hence, by using the projection P_(V,P) given in (10), we obtain grad f(V, P) = P_(V,P)(∇f(V, P))…”
Section: Riemannian Gradient and Hessian
Confidence: 99%
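The identity quoted above — the Riemannian gradient is the tangent-space projection of the Euclidean gradient — can be illustrated on a single Stiefel manifold. The following is a minimal NumPy sketch, using the standard embedded-metric projection P_X(Z) = Z − X sym(XᵀZ) on St(n, p) rather than the paper's product-manifold projection (10), and a hypothetical test function f(X) = trace(XᵀAX):

```python
import numpy as np

def sym(A):
    """Symmetric part of a square matrix."""
    return (A + A.T) / 2

def proj_stiefel(X, Z):
    """Orthogonal projection of an ambient matrix Z onto the tangent
    space of St(n, p) at X (X^T X = I_p), under the embedded
    Euclidean metric: P_X(Z) = Z - X sym(X^T Z)."""
    return Z - X @ sym(X.T @ Z)

def riemannian_grad(X, egrad):
    """Riemannian gradient as the tangent projection of the
    Euclidean gradient: grad f(X) = P_X(grad f_Euclidean(X))."""
    return proj_stiefel(X, egrad)

# Illustration: f(X) = trace(X^T A X), Euclidean gradient 2 A X.
rng = np.random.default_rng(0)
n, p = 6, 2
A = sym(rng.standard_normal((n, n)))
X, _ = np.linalg.qr(rng.standard_normal((n, p)))  # a point on St(n, p)
g = riemannian_grad(X, 2 * A @ X)

# Membership in the tangent space means sym(X^T g) = 0.
print(np.allclose(sym(X.T @ g), 0))
```

The printed check confirms the projected gradient satisfies the tangent-space condition sym(Xᵀg) = 0, which is exactly what distinguishes grad f from ∇f in the quotation.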
“…The Riemannian Hessian Hess f of f on St(n, l, C) × St(2l, l, C) can also be expressed by using ∇²f and the projection map (10). That is,…”
Section: Riemannian Gradient and Hessian
Confidence: 99%
“…By using the classical expression of the Riemannian connection on a Riemannian submanifold of a Euclidean space (see [32], §5.3.3) and choosing ∇_ξ ζ ≔ P_X(Dζ(X)[ξ]), the Riemannian Hessian hess f of f on St(n, p) can also be expressed by using ∇²f and the projection map (15). That is,…”
Section: Lemma
Confidence: 99%
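The Hessian recipe quoted here — project the directional derivative of a smooth ambient extension of the gradient field — can be sketched numerically. The following is an illustrative example, not the cited paper's construction: it works on a single St(n, p) with the embedded metric, uses f(X) = trace(XᵀAX) as an assumed test function, and replaces the analytic derivative Dζ(X)[ξ] with a central finite difference:

```python
import numpy as np

def sym(A):
    return (A + A.T) / 2

def proj(X, Z):
    """Tangent projection onto St(n, p) at X: P_X(Z) = Z - X sym(X^T Z)."""
    return Z - X @ sym(X.T @ Z)

def rgrad(X, A):
    """Riemannian gradient of f(X) = trace(X^T A X): P_X(2 A X).
    The formula extends smoothly to points near the manifold, so it
    can be differentiated along ambient directions."""
    return proj(X, 2 * A @ X)

def rhess(X, A, xi, t=1e-6):
    """Hess f(X)[xi] = P_X( D grad f (X)[xi] ), with the directional
    derivative approximated by a central finite difference."""
    d = (rgrad(X + t * xi, A) - rgrad(X - t * xi, A)) / (2 * t)
    return proj(X, d)

# Sanity check: the Riemannian Hessian is self-adjoint on the tangent space.
rng = np.random.default_rng(1)
n, p = 5, 2
A = sym(rng.standard_normal((n, n)))
X, _ = np.linalg.qr(rng.standard_normal((n, p)))
xi = proj(X, rng.standard_normal((n, p)))
eta = proj(X, rng.standard_normal((n, p)))
lhs = np.sum(rhess(X, A, xi) * eta)   # <Hess[xi], eta>
rhs = np.sum(rhess(X, A, eta) * xi)   # <xi, Hess[eta]>
print(abs(lhs - rhs) < 1e-6)
```

The self-adjointness check is a useful smoke test for any hand-derived Hessian expression of the form P_X(Dζ(X)[ξ]).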
“…Optimization over the Stiefel manifold is an important special case of Riemannian optimization, which has recently attracted considerable research interest due to its wide applications in fields such as the linear eigenvalue problem, the orthogonal Procrustes problem, the nearest low-rank correlation matrix problem, Kohn–Sham total energy minimization, and the singular value decomposition. Since optimization over the Stiefel manifold can be viewed as a general nonlinear constrained optimization problem, many standard algorithms [11] in Euclidean space can be generalized directly to the manifold setting and have been successfully applied in various applications, e.g., the Riemannian steepest descent method [12], the Riemannian curvilinear search method with Barzilai–Borwein (BB) steps [13], the Riemannian Dai nonmonotone-based conjugate gradient method [14,15], the Riemannian Polak–Ribière–Polyak-based nonlinear conjugate gradient method [16,17], and the Riemannian Fletcher–Reeves-based conjugate gradient method [18]. However, gradient-type algorithms, while often performing reasonably well, can converge slowly once the iterates get close to an optimal solution.…”
Section: Introduction
Confidence: 99%
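Of the gradient-type methods this passage lists, Riemannian steepest descent is the simplest to sketch. The following minimal example minimizes the block Rayleigh quotient trace(XᵀAX) over St(n, p) with a QR retraction — a standard textbook instance of the linear eigenvalue application mentioned above, not the algorithm of the cited papers; the fixed step size is an assumption for this small problem:

```python
import numpy as np

def sym(M):
    return (M + M.T) / 2

def retract_qr(X, xi):
    """QR retraction on St(n, p): map a tangent step back to the manifold."""
    Q, R = np.linalg.qr(X + xi)
    # Fix column signs (positive diagonal of R) so the retraction is continuous.
    return Q * np.sign(np.diag(R))

def steepest_descent(A, p, steps=500, t=0.1, seed=0):
    """Riemannian steepest descent for f(X) = trace(X^T A X) on St(n, p).
    The minimizer spans the invariant subspace of the p smallest eigenvalues."""
    rng = np.random.default_rng(seed)
    X, _ = np.linalg.qr(rng.standard_normal((A.shape[0], p)))
    for _ in range(steps):
        egrad = 2 * A @ X
        g = egrad - X @ sym(X.T @ egrad)   # Riemannian gradient
        X = retract_qr(X, -t * g)          # descent step + retraction
    return X

A = np.diag([1.0, 2.0, 3.0, 4.0, 5.0])
X = steepest_descent(A, 2)
# The optimal value is the sum of the two smallest eigenvalues, 1 + 2 = 3.
print(np.trace(X.T @ A @ X))
```

The fixed step size illustrates the slow local convergence the passage warns about; the curvilinear BB search [13] or the conjugate gradient variants [14–18] replace it with adaptive steps precisely to mitigate this.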