2019. DOI: 10.1515/math-2019-0011

A new smoothing method for solving nonlinear complementarity problems

Abstract: In this paper, a new improved smoothing Newton algorithm for the nonlinear complementarity problem is proposed. The method has two advantages. First, compared with the classical smoothing Newton method, it does not require nonsingularity of the smoothing approximation function; second, it inherits the advantage of the classical smoothing Newton method in that it only needs to solve one linear system of equations at each iteration. Without the need of strict complementarity conditions and the …
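The abstract's claim that only one linear system is solved per iteration can be illustrated with a generic smoothing-Newton sketch. The smoothing function and continuation schedule below are standard textbook choices, not necessarily the paper's; the finite-difference Jacobian and the test data `M`, `q` are our own illustrative assumptions:

```python
import numpy as np

def phi_mu(a, b, mu):
    # A standard smoothed Fischer-Burmeister function (assumed choice;
    # the paper's exact smoothing function is not shown in the excerpt):
    #   phi_mu(a, b) = sqrt(a^2 + b^2 + 2*mu^2) - a - b.
    # As mu -> 0 it recovers phi(a, b) = sqrt(a^2 + b^2) - a - b,
    # whose zeros characterize a >= 0, b >= 0, a*b = 0.
    return np.sqrt(a**2 + b**2 + 2.0 * mu**2) - a - b

def smoothing_newton(F, x0, mu0=1.0, tol=1e-10, max_iter=100):
    """Minimal smoothing-Newton sketch for the NCP:
    find x >= 0 with F(x) >= 0 and x . F(x) = 0.
    Exactly one linear system is solved per iteration."""
    x, mu = x0.astype(float), mu0
    n = x.size
    for _ in range(max_iter):
        H = phi_mu(x, F(x), mu)
        if np.linalg.norm(H) < tol and mu < tol:
            break
        # Finite-difference Jacobian of x -> phi_mu(x, F(x), mu)
        J = np.zeros((n, n))
        h = 1e-7
        for j in range(n):
            e = np.zeros(n); e[j] = h
            J[:, j] = (phi_mu(x + e, F(x + e), mu) - H) / h
        x = x - np.linalg.solve(J, H)   # the single linear solve
        mu *= 0.5                       # drive the smoothing parameter to zero
    return x

# Illustrative linear complementarity example F(x) = M x + q,
# whose solution is x* = (1, 0) with F(x*) = (0, 2).
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-2.0, 1.0])
F = lambda x: M @ x + q
x = smoothing_newton(F, np.array([1.0, 1.0]))
```

This is only a sketch under the stated assumptions; the paper's contribution, per the abstract, is precisely that such an iteration can be run without requiring nonsingularity of the smoothing approximation function.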

Cited by 13 publications (9 citation statements). References 49 publications (56 reference statements).
“…The first is a nonsmooth quasi-Newton method proposed in [5], which we call Algorithm 2. The second is a smoothing Jacobian method proposed in [18] which, unlike our proposal, uses a smoothing of the Fischer function (Algorithm 1 with λ = 2), which we call Algorithm 3. The third is a smooth Newton method proposed recently in [32], which we call Algorithm 4. We vary λ in two forms, obtaining two versions of our algorithm. Method 1: we use the dynamic choice of λ used in [5] (this strategy combines the efficiency of the Fischer function far from the solution with that of the minimum function near to it). Method 2: we vary λ randomly in the interval (0, 4).…”
Section: Numerical Results
confidence: 99%
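The λ-family described in this quotation (λ = 2 recovering the Fischer function, small λ behaving like the minimum function, λ restricted to (0, 4)) matches the standard Kanzow-Kleinmichel family of NCP functions; whether the citing paper uses exactly this definition is an assumption on our part:

```python
import math

def phi_lambda(a, b, lam):
    # One-parameter NCP function family consistent with the quote
    # (assumed to be the Kanzow-Kleinmichel family; lam in (0, 4)):
    #   phi_lam(a, b) = sqrt((a - b)^2 + lam*a*b) - a - b
    # For any lam in (0, 4), its zeros characterize a >= 0, b >= 0, a*b = 0.
    return math.sqrt((a - b) ** 2 + lam * a * b) - a - b

# lam = 2: (a-b)^2 + 2ab = a^2 + b^2, so this is the Fischer function.
assert abs(phi_lambda(3.0, 4.0, 2.0) - (math.hypot(3.0, 4.0) - 7.0)) < 1e-12

# lam -> 0: sqrt((a-b)^2) - a - b = |a-b| - a - b = -2*min(a, b),
# i.e. a scaled minimum function.
assert abs(phi_lambda(3.0, 4.0, 1e-12) - (-2.0 * 3.0)) < 1e-5

# "Method 2" in the quote draws lam at random from (0, 4), e.g.:
#   lam = 4.0 * random.random()
```

The two endpoint behaviors are what the quote exploits: the Fischer function (λ = 2) behaves well far from the solution, while the minimum function (λ near 0) is sharper close to it.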
“…For the numerical tests, we consider nine complementarity problems associated with the functions Kojima-Shindo (Koj-Shi), Kojima-Josephy (Koj-Jo), modified Mathiesen (Math mod), Mathiesen (Mathiesen), Billups (Billups) [7], [25]; Nash-Cournot (Nash-Co) [16], Hock-Schittkowski (HS 66) [32], Geiger-Kanzow (Geiger-Kanzow) [15], and Ahn (Ahn) [2]. We implemented Algorithm 1 (with Methods 1 and 2) and the test functions in MATLAB and use the following starting points taken from [5], [32],…”
Section: Numerical Results
confidence: 99%
“…As a result, reconstruction using TGV regularization can preserve edges while suppressing the staircase effect. To solve the TGV model efficiently, many optimization algorithms have been proposed, such as Newton's method, the split Bregman method, the alternating direction method of multipliers, and the gradient descent method [35][36][37][38][39][40][41][42][43]. Experiments show that TGV outperforms TV-based regularization models in image reconstruction.…”
Section: Introduction
confidence: 99%
“…On the other hand, the investigation of the regularity of discrete maximal operators has also attracted the attention of many authors (cf. [2,5,7,18,21,24,27,29,32,36,37]). Let us recall some definitions and background.…”
Section: Introduction
confidence: 99%