2006
DOI: 10.1007/s10898-005-6741-9
Convex- and Monotone-Transformable Mathematical Programming Problems and a Proximal-Like Point Method

Abstract: The problem of finding singularities of monotone vector fields on Hadamard manifolds will be considered and solved by extending the well-known proximal point algorithm. For monotone vector fields the algorithm will generate a well-defined sequence, and for monotone vector fields with singularities it will converge to a singularity. It will also be shown how tools of convex analysis on Riemannian manifolds can solve non-convex constrained problems in Euclidean spaces. To illustrate this remarkable fact, example…
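In the Euclidean special case that the paper generalizes, the proximal point algorithm finds a singularity (zero) of a monotone map T by iterating the resolvent x_{k+1} = (I + λT)^{-1}(x_k). A minimal one-dimensional sketch; the monotone map T(x) = x + x³ and the bisection-based resolvent solver below are illustrative choices for this example, not taken from the paper:

```python
# Minimal sketch of the classical Euclidean proximal point iteration
#   x_{k+1} = (I + lam*T)^{-1}(x_k)
# for a monotone map T on R -- the special case the paper extends to
# Hadamard manifolds. T, lam, and the bracket [-10, 10] are illustrative.

def resolvent(T, x, lam, lo=-10.0, hi=10.0):
    """Solve y + lam*T(y) = x by bisection.

    Well-posed because y -> y + lam*T(y) is strictly increasing
    whenever T is monotone and lam > 0.
    """
    f = lambda y: y + lam * T(y) - x
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def proximal_point(T, x0, lam=1.0, iters=50):
    x = x0
    for _ in range(iters):
        x = resolvent(T, x, lam)
    return x

# T(x) = x + x**3 is monotone with unique singularity at x = 0
x_star = proximal_point(lambda x: x + x**3, x0=2.0)
```

The resolvent is single-valued here precisely because y ↦ y + λT(y) is strictly increasing for any monotone T, which is what makes the generated sequence well defined.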

Cited by 101 publications (43 citation statements)
References 15 publications (17 reference statements)
“…(see, for example [10]). Therefore, (R_{++}, ⟨·,·⟩) is a Hadamard manifold and the unique geodesic x : R → R_{++} with initial conditions x(0) = x_0 and x′(0) = v is given by […] ϕ(x, τ), (17) and consider the problem…”

Section: Example
confidence: 99%
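The geodesic formula in the excerpt above is cut off by text extraction. On R_{++} with the metric ⟨u, v⟩_x = uv/x² — the usual choice for this example, inferred here rather than quoted from the paper — the manifold is flat and complete (hence Hadamard), and the geodesic with x(0) = x₀, x′(0) = v is x(t) = x₀·exp(vt/x₀). A quick numerical check that this curve satisfies the geodesic equation x″ = (x′)²/x:

```python
import math

# Check that x(t) = x0*exp(v*t/x0) satisfies the geodesic equation
#   x'' = (x')^2 / x
# of (R_{++}, <u,v>_x = u*v/x^2), using central finite differences.
# The metric and the test point (x0, v, t) are illustrative assumptions.

def geodesic(x0, v, t):
    return x0 * math.exp(v * t / x0)

x0, v, t, h = 2.0, 0.5, 1.0, 1e-4
x   = geodesic(x0, v, t)
xp  = (geodesic(x0, v, t + h) - geodesic(x0, v, t - h)) / (2 * h)
xpp = (geodesic(x0, v, t + h) - 2 * x + geodesic(x0, v, t - h)) / h**2
residual = abs(xpp - xp**2 / x)   # should vanish up to discretization error
```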
“…Thus (R^n, G(x)) is a connected and complete finite-dimensional Riemannian manifold with null sectional curvature; see [7]. The gradient of f is given by grad f(x) = G^{-1}(x)∇f(x) and the steepest descent iteration is…”

Section: A Steepest Descent Algorithm for R^n
confidence: 99%
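The iteration truncated at the end of this excerpt has, in the variable-metric form the excerpt describes, the shape x_{k+1} = x_k − t_k·G(x_k)^{-1}∇f(x_k). A sketch with an illustrative diagonal metric G(x) = diag(1 + x_i²) and objective f(x) = ½‖x‖², both chosen for this example rather than taken from the paper:

```python
# Steepest descent under a variable Riemannian metric G(x):
#   grad f(x) = G(x)^{-1} * grad_euclidean f(x)
#   x_{k+1}   = x_k - t_k * grad f(x_k)
# G and f below are illustrative assumptions, not the paper's examples.

def metric_inv(x):
    # G(x) = diag(1 + x_i^2)  =>  G(x)^{-1} = diag(1 / (1 + x_i^2))
    return [1.0 / (1.0 + xi * xi) for xi in x]

def euclidean_grad(x):
    # f(x) = 0.5 * sum(x_i^2), so the Euclidean gradient is x itself
    return list(x)

def steepest_descent(x0, step=0.5, iters=100):
    x = list(x0)
    for _ in range(iters):
        ginv = metric_inv(x)
        g = euclidean_grad(x)
        # Riemannian gradient step, coordinate by coordinate
        x = [xi - step * di * gi for xi, di, gi in zip(x, ginv, g)]
    return x

x_min = steepest_descent([3.0, -2.0])   # converges toward the minimizer 0
```

Note the design point the excerpt is making: the descent direction is the Euclidean gradient preconditioned by G(x)^{-1}, so changing the metric changes the trajectory even though the minimizer is the same.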
“…where P_S denotes the projection onto S. Recall from [15] that the vector field X → 2(ln det X)X is monotone (although this vector field is not monotone in the classical sense), and recall also from [46] that the vector field X → −exp […], we see that A_2 satisfies a global weak sharp minima-like condition. Then, Corollary 3.5 is applied to A_1 to conclude that, with {λ_n} satisfying sup_n λ_n < +∞, any sequence {x_n} generated by IP1 with Σ_n δ_n < ∞, or by Algorithm IP2 with Σ_n δ_n² < ∞, converges linearly to a point x̄ ∈ A_1^{-1}(0), and {x_n} is superlinearly convergent if lim_{n→∞} λ_n = 0, while Corollary 4.7 is applied to A_2 to conclude that any sequence {x_n} generated by Algorithm IP2 with {δ_n} satisfying Σ δ_n² < +∞ and the parameters {λ_n} satisfying (4.15) terminates in a finite number of iterations.…”

Section: Numerical Examples
confidence: 92%
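The inexact-proximal behavior the excerpt describes can be seen in a toy scalar case. This is an illustrative Euclidean sketch, not the paper's IP1/IP2 algorithms: the operator A(x) = x, the step λ = 1, and the error sequence δ_n = 2^{-(n+1)} (which satisfies both Σ δ_n < ∞ and Σ δ_n² < ∞) are all assumptions made for the example.

```python
# Toy inexact proximal iteration on R with A(x) = x:
#   exact resolvent: (I + lam*A)^{-1}(x) = x / (1 + lam)
#   inexact step:    x_{n+1} = x_n / (1 + lam) + delta_n
# With summable errors delta_n, the iterates still converge to
# the singularity A^{-1}(0) = {0}. All parameters are illustrative.

def inexact_proximal(x0, lam=1.0, iters=60):
    x = x0
    for n in range(iters):
        delta = 0.5 ** (n + 1)       # summable error sequence
        x = x / (1.0 + lam) + delta  # resolvent step, perturbed by delta
    return x

x_approx = inexact_proximal(5.0)
```

With these choices the iterates satisfy x_n = (x_0 + n)·2^{-n} exactly, so the perturbed sequence still converges linearly despite the errors — the qualitative phenomenon the excerpt's corollaries formalize.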