2009 First International Workshop on Education Technology and Computer Science
DOI: 10.1109/etcs.2009.63
The Prediction-Correction and Relaxed Hybrid Steepest-Descent Method for Variational Inequalities

Cited by 2 publications (5 citation statements)
References 17 publications
“…In this paper, we will prove the strong convergence of the PRH method under different and suitable restrictions imposed on the parameters (Condition 12), which differ from those of [15]. Moreover, the proof of strong convergence differs from the previous proof in [15] and is not similar to that in [7] in Step 2. More importantly, numerical experiments verify that the PRH method under Condition 12 is more efficient than that under Condition 10, and that the PRH method under some descent directions is slightly more efficient than the MRHSD method [14,16].…”
Section: Introduction
Confidence: 87%