1998
DOI: 10.1023/a:1022675100677
Subgradient Algorithm on Riemannian Manifolds

Cited by 91 publications (66 citation statements)
References 14 publications
“…It is the Hessian of a C ∞ strictly convex self-concordant function (see Definition 2.1.1 of Nesterov and Nemirovskii [18]), allowing the introduction of new interior-point algorithms for convex optimization problems, such as proximal and subgradient methods. We observe that general convergence results could be applied to those methods through the theory developed in [10] and [9], respectively.…”
Section: Steepest Descent Algorithms for the Hypercube
confidence: 96%
“…This has been done frequently in recent years, both for theoretical purposes and to obtain effective algorithms; see [1][2][3][4][5][6][7][8][9]. In particular, we observe that these extensions allow solving some nonconvex constrained problems in Euclidean space.…”
Section: Introduction
confidence: 97%
“…We have the following convergence result, see [31]. For manifolds with curvature bounded from below the subgradient algorithm converges if the iterates stay in bounded sets, see [61] or [62].…”
Section: Subgradient Descent
confidence: 99%
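The Riemannian subgradient method discussed in these excerpts can be sketched concretely. Below is a minimal illustration on the unit sphere, not the paper's exact algorithm: the choice of manifold, the ℓ1 test function, the 1/k step size, and the retraction by normalization (substituting for the exact exponential map) are all assumptions made for the example.

```python
import numpy as np

def riemannian_subgradient_sphere(f_subgrad, x0, steps=500):
    """Subgradient descent on the unit sphere S^{n-1}.

    At each iterate x, a Euclidean subgradient is projected onto the
    tangent space T_x S^{n-1} = {v : <v, x> = 0}, a step with a
    diminishing step size is taken, and the result is retracted back
    to the sphere by normalization.
    """
    x = x0 / np.linalg.norm(x0)
    for k in range(1, steps + 1):
        g = f_subgrad(x)
        # Project the Euclidean subgradient onto the tangent space at x.
        rg = g - np.dot(g, x) * x
        # Diminishing, non-summable step size, as is standard for
        # subgradient methods.
        x = x - (1.0 / k) * rg
        # Retraction: renormalize onto the sphere (x never hits zero,
        # since the step is orthogonal to x).
        x = x / np.linalg.norm(x)
    return x

# Example: minimize the l1 norm over the sphere, whose minimizers are
# the signed coordinate vectors; sign(x) is a Euclidean subgradient.
rng = np.random.default_rng(0)
x = riemannian_subgradient_sphere(np.sign, rng.standard_normal(8))
```

At a signed coordinate vector the projected subgradient vanishes, so such points are stationary for the method, consistent with their being minimizers of the ℓ1 norm on the sphere.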