2015
DOI: 10.1007/s10958-015-2482-6
Maximization of a Function with Lipschitz Continuous Gradient

Abstract: In the present paper, we consider (nonconvex in the general case) functions that have a Lipschitz continuous gradient. We prove that the level sets of such functions are proximally smooth and obtain an estimate for the constant of proximal smoothness. We prove that the problem of maximization of such a function on a strongly convex set has a unique solution if the radius of strong convexity of the set is sufficiently small. The projection algorithm (similar to the gradient projection algorithm for minimization of …
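The projection algorithm mentioned in the abstract can be sketched as a projected-gradient *ascent* iteration, x_{k+1} = P_A(x_k + γ ∇f(x_k)), over a strongly convex set A. The sketch below is an assumption-laden illustration, not the paper's exact scheme: it takes A to be a Euclidean ball (the simplest strongly convex set), a fixed step size γ, and a linear objective whose gradient is trivially Lipschitz.

```python
import numpy as np

def project_ball(x, center, radius):
    """Euclidean projection onto the ball B(center, radius), a strongly
    convex set whose radius of strong convexity equals `radius`."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def maximize_on_ball(grad, x0, center, radius, step=0.1, iters=500):
    """Projected-gradient ascent sketch: x_{k+1} = P_A(x_k + step * grad f(x_k)).
    Fixed step size and ball-shaped feasible set are illustrative assumptions."""
    x = project_ball(np.asarray(x0, float), center, radius)
    for _ in range(iters):
        x = project_ball(x + step * grad(x), center, radius)
    return x

# Example: maximize f(x) = c^T x (gradient identically c) over the unit ball.
c = np.array([3.0, 4.0])
x_star = maximize_on_ball(lambda x: c, np.zeros(2), np.zeros(2), 1.0)
# The analytic maximizer is c / ||c|| = (0.6, 0.8).
```

For this linear objective the iterates hit the boundary maximizer after two steps and stay there, consistent with the paper's theme that the maximizer on a strongly convex set is a boundary point.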

Cited by 13 publications (3 citation statements)
References 4 publications
“…But indeed the solutions of these two problems coincide under condition (24). The proof of Theorem 4 follows from the following fact regarding strongly convex sets of radius r and functions with Lipschitz continuous gradient [3].…”
Section: Full-Step Frank–Wolfe Methods (FFW)
Mentioning, confidence: 87%
“…As an example we consider a quadratic form on the unit sphere. (3) For approximately linear objective functions we propose a new version of the Frank-Wolfe (conditional gradient) method and establish its linear convergence to a global minimum in problem (1). We prove linear convergence of the method for a surface, which is the boundary of a strongly convex set.…”
Section: Introduction
Mentioning, confidence: 98%
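The citing paper's example of a quadratic form on the unit sphere can be illustrated with a full-step Frank–Wolfe (conditional gradient) sketch. This is an assumption: for *maximizing* f(x) = xᵀAx over the unit ball, the linear maximization oracle is s = g/‖g‖ with g = ∇f(x) = 2Ax, and the full step x_{k+1} = s_k reduces the iteration to power iteration, whose iterates stay on the sphere, the boundary of the strongly convex unit ball.

```python
import numpy as np

def ffw_sphere(A, x0, iters=200):
    """Full-step Frank-Wolfe sketch for maximizing f(x) = x^T A x over the
    unit ball. The linear oracle argmax_{||s||<=1} <g, s> is g/||g||, and the
    full step x_{k+1} = s_k makes this equivalent to power iteration."""
    x = np.asarray(x0, float)
    x = x / np.linalg.norm(x)
    for _ in range(iters):
        g = 2.0 * A @ x          # gradient of the quadratic form
        x = g / np.linalg.norm(g)  # linear oracle on the unit ball, full step
    return x

# Hypothetical example: the iterates converge to a top eigenvector of A.
A = np.diag([1.0, 2.0, 5.0])
x = ffw_sphere(A, np.ones(3))
```

With this diagonal A the iterates approach ±e₃, the eigenvector of the largest eigenvalue, illustrating convergence to a global solution on the sphere.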
“…If f is only convex (but not necessarily strongly convex), the sequence {w k } converges weakly [20]. When K is strongly convex, the linear convergence of {w k } is obtained under additional conditions (too strong for our main application) in [5,6,15].…”
Section: The Gradient Projection Methods
Mentioning, confidence: 99%
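The iteration {w_k} discussed in this last citation is the standard gradient projection method for *minimization*, w_{k+1} = P_K(w_k − γ ∇f(w_k)). A minimal sketch under illustrative assumptions (a ball-shaped K, fixed step size, smooth convex f):

```python
import numpy as np

def gradient_projection_min(grad, project, w0, step=0.1, iters=500):
    """Gradient projection sketch for minimization over a closed convex set K:
    w_{k+1} = P_K(w_k - step * grad f(w_k)). `project` is assumed to compute
    the Euclidean projection onto K."""
    w = project(np.asarray(w0, float))
    for _ in range(iters):
        w = project(w - step * grad(w))
    return w

# Example: minimize f(w) = ||w - (2, 0)||^2 over the unit ball, a strongly
# convex K; the constrained minimizer is the boundary point (1, 0).
proj = lambda w: w if np.linalg.norm(w) <= 1 else w / np.linalg.norm(w)
w = gradient_projection_min(lambda w: 2 * (w - np.array([2.0, 0.0])),
                            proj, np.zeros(2))
```

Because the unconstrained minimizer (2, 0) lies outside K, the iterates are clamped to the sphere and converge to (1, 0), the setting where the strong convexity of K yields the linear convergence rates cited above.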