2016
DOI: 10.2306/scienceasia1513-1874.2016.42.040

The smoothing Fletcher-Reeves conjugate gradient method for solving finite minimax problems

Abstract: In this paper, we propose a smoothing Fletcher-Reeves conjugate gradient method for finite minimax problems. The component functions of the finite minimax problem are all continuously differentiable. Under general conditions, we establish the global convergence of the method. The discussion and preliminary numerical experiments indicate that the method works quite well in practice.
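
A minimal sketch of the idea described in the abstract, in Python. The log-sum-exp smoothing of the max function, the Armijo backtracking line search, the rule for shrinking the smoothing parameter mu, and the helper names smooth_max and smoothing_fr_cg are illustrative assumptions; the paper's own smoothing function, line search, and parameter-update rule may differ.

import numpy as np

def smooth_max(fs, grads, x, mu):
    """Log-sum-exp smoothing of max_i f_i(x) and its gradient.

    f_mu(x) = mu * log(sum_i exp(f_i(x)/mu)) satisfies
    max_i f_i(x) <= f_mu(x) <= max_i f_i(x) + mu*log(m).
    """
    vals = np.array([f(x) for f in fs])
    shift = vals.max()                              # shift to avoid overflow in exp
    w = np.exp((vals - shift) / mu)
    val = shift + mu * np.log(w.sum())
    w /= w.sum()                                    # convex weights summing to one
    grad = sum(wi * g(x) for wi, g in zip(w, grads))
    return val, grad

def smoothing_fr_cg(fs, grads, x0, mu=1.0, sigma=0.5, tol=1e-6, max_iter=500):
    """Fletcher-Reeves CG on the smoothed objective, tightening mu as iterates settle."""
    x = np.asarray(x0, dtype=float)
    val, g = smooth_max(fs, grads, x, mu)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol and mu <= tol:
            break
        # Armijo backtracking line search on the smoothed function
        t = 1.0
        while t > 1e-12 and smooth_max(fs, grads, x + t * d, mu)[0] > val + 1e-4 * t * (g @ d):
            t *= 0.5
        x = x + t * d
        val, g_new = smooth_max(fs, grads, x, mu)
        beta_fr = (g_new @ g_new) / (g @ g)         # Fletcher-Reeves coefficient
        d = -g_new + beta_fr * d
        if g_new @ d >= 0:                          # safeguard: restart on a non-descent direction
            d = -g_new
        g = g_new
        if np.linalg.norm(g) <= mu:                 # heuristic: tighten the smoothing
            mu *= sigma
            val, g = smooth_max(fs, grads, x, mu)
            d = -g
    return x

# Example: min_x max{x1^2 + x2^2, (x1 - 2)^2 + x2^2}; the minimizer is (1, 0).
fs = [lambda x: x[0]**2 + x[1]**2,
      lambda x: (x[0] - 2.0)**2 + x[1]**2]
grads = [lambda x: np.array([2.0 * x[0], 2.0 * x[1]]),
         lambda x: np.array([2.0 * (x[0] - 2.0), 2.0 * x[1]])]
print(smoothing_fr_cg(fs, grads, np.array([5.0, 3.0])))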

Cited by 11 publications (9 citation statements); references 11 publications.

Citation statements, ordered by relevance:
“…In this section, we present the smoothing modified three-term conjugate gradient method to solve problem (1). Firstly, we give the definition of smoothing function and smoothing approximation function of the absolute value function [14, 15, 29].…”
Section: Main Results and Discussion
confidence: 99%
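
For context, one smoothing approximation of the absolute value function that is commonly used in this literature (the specific choice in the cited works may differ) is

\[
  \phi_\mu(t) = \sqrt{t^{2} + \mu^{2}}, \qquad \mu > 0,
\]

which is continuously differentiable in $t$ and satisfies $0 \le \phi_\mu(t) - |t| \le \mu$ and $\lim_{\mu \downarrow 0} \phi_\mu(t) = |t|$.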
“…The complementarity problem, the absolute value equation problem, and the related constrained optimization problem are three kinds of important optimization problems [19–23]. On the other hand, nonlinear conjugate gradient methods and smoothing methods are widely used to solve large-scale optimization problems [24, 25], total variation image restoration [26], monotone nonlinear equations with convex constraints [27], and nonsmooth optimization problems, such as nonsmooth nonconvex problems [28], minimax problems [29], and P0 nonlinear complementarity problems [30]. In particular, the widely used three-term conjugate gradient method, which is based on the Hager–Zhang and Polak–Ribière–Polyak conjugate gradient methods [31–33] and attains different numerical outcomes, has been studied extensively.…”
Section: Introduction
confidence: 99%
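
For reference, the Polak–Ribière–Polyak coefficient and a generic three-term search direction take the form below; the exact modified three-term scheme studied in [31–33] may differ.

\[
  \beta_k^{PRP} = \frac{g_k^{\top}(g_k - g_{k-1})}{\lVert g_{k-1}\rVert^{2}}, \qquad
  d_k = -g_k + \beta_k d_{k-1} + \theta_k (g_k - g_{k-1}),
\]

where $g_k$ denotes the gradient at the $k$-th iterate, $d_0 = -g_0$, and the scalar $\theta_k$ is chosen so that $d_k$ remains a descent direction.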
“…Firstly, we give the definition of smoothing function and the smoothing approximation function of the absolute value function; one can see [15, 26, 27].…”
Section: The Smoothing FR Conjugate Gradient Method
confidence: 99%
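
In this context, a smoothing function of a nonsmooth function $f : \mathbb{R}^{n} \to \mathbb{R}$ is usually understood in the following sense (the precise definition in [15, 26, 27] may add further conditions, such as gradient consistency): $\tilde f : \mathbb{R}^{n} \times (0, \infty) \to \mathbb{R}$ is a smoothing function of $f$ if $\tilde f(\cdot, \mu)$ is continuously differentiable for every fixed $\mu > 0$ and

\[
  \lim_{z \to x,\; \mu \downarrow 0} \tilde f(z, \mu) = f(x) \quad \text{for every } x \in \mathbb{R}^{n}.
\]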
“…On the other hand, the conjugate gradient method is suitable for solving large-scale optimization problems and has a simple structure and global convergence [19–25]. In addition, smoothing methods are used to solve related nonsmooth optimization problems; see, for example, [26–28] and the references therein. Therefore, based on the above analysis, we present a new smoothing FR conjugate gradient method to solve (3); this is also our motivation for writing this paper.…”
Section: Introduction
confidence: 99%
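
The Fletcher–Reeves (FR) update referred to here, applied to a smoothed objective with gradient $g_k$ at the $k$-th iterate, takes the standard form

\[
  d_k =
  \begin{cases}
    -g_k, & k = 0,\\
    -g_k + \beta_k^{FR} d_{k-1}, & k \ge 1,
  \end{cases}
  \qquad
  \beta_k^{FR} = \frac{\lVert g_k \rVert^{2}}{\lVert g_{k-1} \rVert^{2}},
\]

with the next iterate $x_{k+1} = x_k + t_k d_k$ for a step size $t_k$ obtained from a suitable line search.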
“…The finite minimax problem is one of the most important research topics in the field of optimization research [1–7]. This kind of problem is also an important nonsmooth optimization problem, which arises widely in engineering design, economic decision-making, game theory, nonlinear programming, and multi-objective programming (see, e.g., [8–10] and the references therein).…”
Section: Introduction
confidence: 99%
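
For concreteness, the finite minimax problem discussed here can be written as

\[
  \min_{x \in \mathbb{R}^{n}} F(x), \qquad F(x) = \max_{1 \le i \le m} f_i(x),
\]

where each $f_i$ is continuously differentiable (as assumed in the abstract above); $F$ itself is generally nonsmooth at points where the maximum is attained by more than one index, which is what the smoothing approach addresses.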