2022
DOI: 10.1134/s1990478922030073
Optimization of Subgradient Method Parameters Based on Rank-Two Correction of Metric Matrices

Cited by 5 publications (31 citation statements)
References 15 publications

“…This research is a continuation of previous studies [27,28] and is aimed at studying the ability of Newton's method and of the relaxation subgradient method with optimization of the parameters of the rank-two correction of metric matrices [27] to eliminate the linear background that worsens convergence when a transformation V with the properties noted above exists. Similar studies for quasi-Newton methods were carried out in [29].…”
Section: Introduction (mentioning)
confidence: 95%

“…In what follows, this estimate serves as a standard, and the ability of a method, like Newton's method, to exclude the linear background will be called its Newtonian property. The main goal of this work is to substantiate the presence of the Newtonian property in the RSM with a change of the space metric [27]. As shown in [29], this Newtonian property is inherent in quasi-Newton methods.…”
Section: Introduction (mentioning)
confidence: 99%
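
As a point of reference, the Newtonian property can be illustrated on a strongly convex quadratic with an added linear term (a worked example of ours, not taken from the cited text): for

$$ f(x) = \tfrac{1}{2}\, x^{\top} A x + b^{\top} x, \qquad A \succ 0, $$

a single Newton step from any starting point $x_0$ gives

$$ x_1 = x_0 - \left[\nabla^2 f(x_0)\right]^{-1} \nabla f(x_0) = x_0 - A^{-1}(A x_0 + b) = -A^{-1} b, $$

i.e., the minimizer is reached regardless of the linear term $b^{\top} x$. A method is said to have the Newtonian property when it likewise neutralizes such a linear background.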
“…Embedding ideas from machine learning theory [25] into such optimization methods made it possible to identify the principles for organizing RSMM with space dilation [26–29]. The problem of finding the descent direction in the RSMM can be reduced to solving a system of inequalities on subgradient sets, formulated mathematically as the minimization of a quality functional.…”
Section: Introduction (mentioning)
confidence: 99%
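
To make the space-dilation idea above concrete, here is a minimal, self-contained sketch of a subgradient method with a Shor-style rank-one dilation of the space metric. It is an illustration of the general family only: the rank-two metric correction and the quality-functional-based parameter optimization of the paper above are not reproduced, and the function names, parameter values, and test problem are our own.

import numpy as np

def space_dilation_subgradient(f, subgrad, x0, alpha=2.0, step0=1.0, iters=200):
    # Sketch of a subgradient method with space dilation (Shor-style,
    # rank-one update). NOT the paper's rank-two correction rule.
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                        # metric transformation matrix
    best_x, best_f = x.copy(), f(x)
    for k in range(iters):
        g = subgrad(x)                        # a subgradient at the current point
        gt = B.T @ g                          # subgradient in the transformed space
        nrm = np.linalg.norm(gt)
        if nrm < 1e-12:                       # (near-)stationary point: stop
            break
        xi = gt / nrm                         # unit dilation direction
        x = x - (step0 / (k + 1)) * (B @ xi)  # diminishing step in original space
        # Dilate the space along xi with coefficient alpha:
        #   B <- B (I + (1/alpha - 1) xi xi^T)
        B += (1.0 / alpha - 1.0) * np.outer(B @ xi, xi)
        if f(x) < best_f:
            best_x, best_f = x.copy(), f(x)
    return best_x, best_f

# Usage on the nonsmooth test function f(x) = |x_1| + 3|x_2|:
f = lambda x: abs(x[0]) + 3.0 * abs(x[1])
sg = lambda x: np.array([np.sign(x[0]), 3.0 * np.sign(x[1])])
print(space_dilation_subgradient(f, sg, x0=[2.0, -1.5]))

The update here is the classical dilation operator R_beta(xi) = I + (beta - 1) xi xi^T with beta = 1/alpha; a rank-two correction, as in the paper's title, modifies the metric along two directions at once, but its specific form is not shown here.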