2020
DOI: 10.1007/s10957-020-01636-7
A Modified Nonlinear Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization

Cited by 11 publications (8 citation statements) · References 32 publications
“…In [37], a modified HS CG method for nonsmooth convex optimization problems was proposed, and its numerical efficiency was verified on high-dimensional training samples. Furthermore, in [38], a modified CG method that inherits the advantages of both the HS and DY CG methods was constructed for a nonsmooth optimization problem. It is noted that the works in [37,38] addressed only the convergence of the CG methods and did not establish their convergence rates.…”
Section: References · mentioning (confidence: 99%)
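The hybrid construction attributed to [38] can be illustrated with a minimal sketch. The max(0, min(beta_hs, beta_dy)) rule below is one classical way to combine the Hestenes-Stiefel and Dai-Yuan parameters, not necessarily the exact formula of [38]; the function name and safeguard threshold are illustrative.

    import numpy as np

    def cg_direction(g_new, g_old, d_old, eps=1e-12):
        # One nonlinear conjugate gradient direction update with a
        # hybrid HS/DY conjugacy parameter (1-D numpy arrays in, array out).
        y = g_new - g_old                       # gradient difference y_k
        denom = d_old @ y                       # shared HS/DY denominator d_k^T y_k
        if abs(denom) < eps:                    # safeguard: restart with steepest descent
            return -g_new
        beta_hs = (g_new @ y) / denom           # Hestenes-Stiefel parameter
        beta_dy = (g_new @ g_new) / denom       # Dai-Yuan parameter
        beta = max(0.0, min(beta_hs, beta_dy))  # truncated hybrid of the two
        return -g_new + beta * d_old

    # example call:
    # d = cg_direction(np.array([1., 2.]), np.array([2., 1.]), np.array([-2., -1.]))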
“…Furthermore, in [38], a modified CG method that inherits the advantages of both the HS and DY CG methods was constructed for a nonsmooth optimization problem. It is noted that the works in [37,38] addressed only the convergence of the CG methods and did not establish their convergence rates. Thus, this paper aims to develop a modified CG method for the Wasserstein distributionally robust LR model under the FOBOS framework, for which not only convergence but also a convergence rate can be established.…”
Section: References · mentioning (confidence: 99%)
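For context, FOBOS (forward-backward splitting, Duchi and Singer) alternates a subgradient step on the loss f with a proximal step on the regularizer r. A standard statement of the two-step update with step size \eta_t is given below; the cited paper's specific scheme may differ:

\[ w_{t+1/2} = w_t - \eta_t g_t, \qquad g_t \in \partial f(w_t), \]
\[ w_{t+1} = \operatorname*{arg\,min}_{w} \Big\{ \tfrac{1}{2}\,\|w - w_{t+1/2}\|_2^2 + \eta_t\, r(w) \Big\}. \]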
“…In addition to their original authors, the issue of global convergence of methods (5) has also been investigated by researchers such as Al-Baali [40] and Gilbert and Nocedal [41]. Likewise, for all the CG directions presented in the previous paragraph, the authors proved global convergence under suitable line search techniques such as Armijo [14,16,20,29], weak Wolfe-Powell [15-18,21,23,24,26,27,30,35], strong Wolfe-Powell [12,16,19,22,25,28,31], modifications of these three techniques [13,32-34], or backtracking algorithms [36-39].…”
Section: Introduction · mentioning (confidence: 96%)
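For reference, the line search conditions named in this excerpt are standard. With iterate x_k, descent direction d_k, step size \alpha_k, and constants 0 < c_1 < c_2 < 1:

\[ f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \nabla f(x_k)^{\mathsf T} d_k \quad \text{(Armijo)}, \]
\[ \nabla f(x_k + \alpha_k d_k)^{\mathsf T} d_k \ge c_2 \nabla f(x_k)^{\mathsf T} d_k \quad \text{(weak Wolfe-Powell, together with Armijo)}, \]
\[ \big|\nabla f(x_k + \alpha_k d_k)^{\mathsf T} d_k\big| \le c_2 \big|\nabla f(x_k)^{\mathsf T} d_k\big| \quad \text{(strong Wolfe-Powell, together with Armijo)}. \]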
“…For example, interested readers can see some modifications of the HS method in [12,13], several combinations of the FR method in [14-16], various developments of the PRP method in [17-21], an extended LS method in [22], and variant improvements of the DY method in [23-25]. Furthermore, some researchers used techniques such as quasi-Newton [26-28], regularization [29,30], combinations of the above methods [31-33], or alternative techniques [34,35] and introduced appropriate CG methods to solve optimization problems. Similarly, many CG algorithms have been developed to solve systems of nonlinear equations [36-39].…”
Section: Introduction · mentioning (confidence: 99%)
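The methods surveyed in this excerpt differ only in the conjugacy parameter \beta_k used in the direction update d_{k+1} = -g_{k+1} + \beta_k d_k. The standard textbook formulas, with y_k = g_{k+1} - g_k, are:

\[ \beta_k^{\mathrm{FR}} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \quad \beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\mathsf T} y_k}{\|g_k\|^2}, \quad \beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{\mathsf T} y_k}{d_k^{\mathsf T} y_k}, \quad \beta_k^{\mathrm{LS}} = \frac{g_{k+1}^{\mathsf T} y_k}{-d_k^{\mathsf T} g_k}, \quad \beta_k^{\mathrm{DY}} = \frac{\|g_{k+1}\|^2}{d_k^{\mathsf T} y_k}. \]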
“…For example, interested readers can see some modifications of the HS method in the studies by Faramarzi and Amini [11] and Hu et al. [12], several combinations of the FR method in the work by Abubakar et al. [13] and Sakai and Iiduka [14], various developments of the PRP method in the studies by Mishra et al. [15], Wu [16], and Andrei [17], an extended LS method in [18], and variant improvements of the DY method in the studies by Deepho et al. [19], Zhu et al. [20], and Jiang and Jian [21]. Furthermore, some researchers used techniques such as quasi-Newton [22,23], regularization [24-26], combinations of the above methods [27,28], or alternative techniques [29,30] and introduced appropriate CG methods to solve optimization problems. To discuss the CG methods in more detail, readers can see [31].…”
Section: Introduction · mentioning (confidence: 99%)