2021
DOI: 10.1137/20m1314057
Convergence Rate Analysis of a Sequential Convex Programming Method with Line Search for a Class of Constrained Difference-of-Convex Optimization Problems

Abstract: In this paper, we study the sequential convex programming method with monotone line search (SCP_ls) in [46] for a class of difference-of-convex (DC) optimization problems with multiple smooth inequality constraints. The SCP_ls is a representative variant of moving-ball-approximation-type algorithms [6,10,13,54] for constrained optimization problems. We analyze the convergence rate of the sequence generated by SCP_ls in both nonconvex and convex settings by imposing suitable Kurdyka-Łojasiewicz (KL) assumptions…

Cited by 15 publications (26 citation statements) · References 49 publications
“…Specifically, we show that the extended objective of a large class of Euclidean norm (and, more generally, group LASSO penalty) regularized convex optimization problems is a KL function with exponent 1/2. This complements the recent study in this direction [29,39,41,43] and, in particular, gives rise to linear convergence of our method in solving these structured optimization models.…”
Section: Introduction — supporting, confidence: 81%
“…The next corollary deals with (5.1) with m = 1. Its proof follows a similar line of arguments as the proof of [41, Corollary 4.3]. We omit the proof for brevity.…”
Section: Explicit Kurdyka-Łojasiewicz Exponent for Some Structured Co... — mentioning, confidence: 82%
“…So this problem corresponds to (1.2) with P1 and P2 as in (1.6) and q = P1 − P2. In the literature, algorithms for solving (1.3) with the ℓ1 norm or ℓp quasi-norm in place of the quotient of the ℓ1 and ℓ2 norms have been discussed in [5,18,33], and [41] discussed an algorithm for solving (1.4) with the ℓ1 norm in place of the quotient of the ℓ1 and ℓ2 norms. These existing algorithms, however, are not directly applicable for solving (1.2) due to the fractional objective and the possibly nonsmooth continuous function q in the constraint.…”
— mentioning, confidence: 99%
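The snippet above concerns the quotient of the ℓ1 and ℓ2 norms as a fractional, sparsity-promoting objective. As a minimal illustration (not drawn from the cited papers, and not the algorithm they propose), the ratio ‖x‖₁/‖x‖₂ equals 1 for a 1-sparse vector and grows toward √n as the vector becomes dense, which is why minimizing it favors sparse solutions:

```python
import numpy as np

def l1_over_l2(x):
    # Quotient of the l1 and l2 norms of a nonzero vector x.
    # Ranges from 1 (exactly one nonzero entry) up to sqrt(n)
    # (all entries equal in magnitude), so smaller values mean sparser x.
    return np.linalg.norm(x, 1) / np.linalg.norm(x, 2)

sparse = np.array([1.0, 0.0, 0.0, 0.0])  # 1-sparse in R^4
dense = np.array([0.5, 0.5, 0.5, 0.5])   # fully dense in R^4

print(l1_over_l2(sparse))  # 1.0
print(l1_over_l2(dense))   # 2.0  (= sqrt(4))
```

Unlike the ℓ1 norm alone, this ratio is scale-invariant (l1_over_l2(c * x) == l1_over_l2(x) for any c ≠ 0), which is part of what makes the resulting optimization problem fractional and nonconvex, as the quoted passage notes.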