2018
DOI: 10.1007/s10107-018-1318-9

Enhanced proximal DC algorithms with extrapolation for a class of structured nonsmooth DC minimization

Cited by 37 publications (25 citation statements)
References 16 publications
“…In order to verify this property, we essentially assume that ϕ↑, ϕ↓, ψ₁, and ψ₂ are differentiable with locally Lipschitz continuous gradients near Z∞. This type of assumption has also been made in [32,57,36] to prove the sequential convergence of the DC algorithm (with extrapolation) for solving the DC program: minimize…”
Section: 1
Confidence: 99%
“…algorithm has been further developed to improve the quality of solutions and the rate of convergence [30,31,36,48]. Most existing DC algorithms were proved to be subsequentially convergent to a critical point of the DC minimization problem in the case that g is nonsmooth [2,17,48].…”
Section: Introduction
Confidence: 99%
“…When the subtracted function is defined by g(x) = max_{1≤i≤I} ψ_i(x) with convex and continuously differentiable ψ_i, by exploiting the structure of the subtracted function in DC minimization, Pang, Razaviyayn, and Alvarado [36] proposed an enhanced DC algorithm that solves DC minimization with subsequential convergence to a d-stationary point of the considered problem. Further, Lu, Zhou, and Sun [30,31] combined the enhanced DC algorithm with possible acceleration techniques to design DC algorithms with subsequential convergence to d-stationary points. In problem (2), the subtracted part is the maximum of some convex functions.…”
Section: Introduction
Confidence: 99%
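The enhanced DC algorithm described in the excerpt above linearizes the subtracted max-of-convex term once per ε-active piece, solves a convex subproblem for each, and keeps the best candidate, which is what lets it reach d-stationary rather than merely critical points. A toy sketch of that idea (the problem instance, the closed-form subproblem solution, and all names here are illustrative assumptions, not the cited authors' exact scheme):

```python
import numpy as np

def enhanced_dca(c, anchors, x0, eps=1e-6, n_iter=50):
    """Enhanced-DCA-style sketch for  min 0.5||x - c||^2 - max_i a_i^T x,
    where the subtracted part is a max of linear (hence convex) pieces.
    For every eps-active index i, replace max_i a_i^T x by its linearization
    a_i^T x and solve the resulting convex subproblem (closed form here),
    then keep the candidate with the smallest true objective."""
    f = lambda z: 0.5 * np.dot(z - c, z - c) - max(a @ z for a in anchors)
    x = x0.astype(float)
    for _ in range(n_iter):
        vals = np.array([a @ x for a in anchors])
        active = np.where(vals >= vals.max() - eps)[0]   # eps-active pieces
        # subproblem for piece i:  min 0.5||x - c||^2 - a_i^T x  ->  x = c + a_i
        candidates = [c + anchors[i] for i in active]
        best = min(candidates, key=f)
        if np.allclose(best, x):
            break                                        # d-stationary for this toy model
        x = best
    return x
```

Probing every ε-active piece, rather than an arbitrary subgradient of the max, is exactly what distinguishes this scheme from the classical DCA in the excerpt.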
“…Difference-of-convex (DC) programs are an important class of optimization problems, which minimize an objective function that is the difference of two convex functions, subject to constraints defined by functions of the same type. They have been studied for several decades in the literature (e.g., see [10,17,11,25,28,20,16,13] and references therein). In this paper we are interested in a DC program of the form min_{x∈X} F(x) = φ₀(x) + ζ₀(x) − ψ₀(x) s.t.…”
Section: Introduction
Confidence: 99%
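The structure in the excerpt above (smooth convex φ₀ plus nonsmooth convex ζ₀ minus convex ψ₀) is what proximal DC algorithms with extrapolation exploit: take a gradient step on φ₀ at an extrapolated point, subtract a subgradient of ψ₀, and apply the proximal operator of ζ₀. A minimal sketch under assumed choices φ₀ = ½‖Ax − b‖², ζ₀ = μ‖x‖₁, ψ₀ = μ‖x‖₂ (the instance and the FISTA-style extrapolation weights are illustrative assumptions, not the paper's exact parameters):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pdca_e(A, b, mu=0.1, n_iter=500):
    """Proximal-DC-with-extrapolation sketch for the DC program
        min 0.5||Ax - b||^2 + mu||x||_1 - mu||x||_2."""
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of x -> A^T(Ax - b)
    x_prev = x = np.zeros(n)
    t = 1.0
    for _ in range(n_iter):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)    # extrapolation step
        nx = np.linalg.norm(x)
        xi = mu * x / nx if nx > 0 else np.zeros(n)    # a subgradient of mu||.||_2 at x
        grad = A.T @ (A @ y - b) - xi                  # smooth gradient minus subgradient
        x_prev, x = x, soft_threshold(y - grad / L, mu / L)
        t = t_next
    return x
```

Note that the subgradient ξ of the subtracted part is taken at the current iterate x, while the gradient of the smooth part is evaluated at the extrapolated point y; this split is what preserves the DC majorization while still gaining the acceleration effect.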