2022
DOI: 10.3390/math10213959
Relaxation Subgradient Algorithms with Machine Learning Procedures

Abstract: In the modern digital economy, optimal decision support systems, as well as machine learning systems, are becoming an integral part of production processes. Artificial neural network training, as well as other engineering problems, generates optimization problems of such high dimension that they are difficult to solve with traditional gradient or conjugate gradient methods. Relaxation subgradient minimization methods (RSMMs) construct a descent direction that forms an obtuse angle with all subgradients of the current minimum neighborhood…
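Only the opening of the abstract survives on this page, but the obtuse-angle construction it describes can be illustrated. The sketch below is a simplified illustration, not the authors' RSMM: it keeps a small bundle of recent subgradients and takes the descent direction as the negative of an approximate least-norm point of their convex hull, which by the projection property forms an obtuse angle with every stored subgradient. The toy problem, function names, and parameters are all assumptions for illustration.

import numpy as np

def least_norm_in_hull(G, iters=200):
    # Approximate the least-norm point of conv(rows of G) by Frank-Wolfe
    # updates on h(w) = ||w @ G||^2 over the simplex (a standard, simplified
    # construction, not the paper's learning procedure).
    w = np.full(len(G), 1.0 / len(G))      # convex weights
    for t in range(iters):
        p = w @ G                          # current point in the hull
        i = int(np.argmin(G @ p))          # vertex most opposed to p
        gamma = 2.0 / (t + 2.0)            # standard diminishing step
        w *= 1.0 - gamma
        w[i] += gamma
    return w @ G

def rsmm_sketch(subgrad, x0, step=0.05, bundle=5, iters=200):
    # If p is the least-norm point of conv(B), then <p, g> >= ||p||^2 for all
    # g in B, so -p makes an obtuse angle with every stored subgradient.
    x, B = np.asarray(x0, dtype=float), []
    for _ in range(iters):
        B.append(subgrad(x))
        B = B[-bundle:]                    # keep a small bundle
        p = least_norm_in_hull(np.array(B))
        n = np.linalg.norm(p)
        if n < 1e-8:                       # near-stationarity test
            break
        x = x - step * p / n               # step along the descent direction
    return x

# Toy nonsmooth problem: f(x) = |x1| + 2|x2|; sign(0) is resolved to 1.
sg = lambda x: np.array([np.sign(x[0]) or 1.0, 2 * (np.sign(x[1]) or 1.0)])
print(rsmm_sketch(sg, np.array([3.0, -2.0])))  # approaches the origin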

Cited by 6 publications (20 citation statements). References 39 publications.

Citation statements (ordered by relevance):
“…A total of 12 papers were submitted to this Special Issue, of which 11 were published (91.67%) [11][12][13][14][15][16][17][18][19][20][21] and only 1 was rejected (8.33%), indicating the very high quality of the original submissions.…”
Section: Statistics of the Special Issue
confidence: 99%
“…The study by Krutikov et al. [18] is dedicated to a new relaxation subgradient minimization method (RSMM). The computational experiments conducted by the authors confirmed the effectiveness of the proposed algorithm, showing that it outperforms currently known methods.…”
Section: Overview of the Contributions to the Special Issue
confidence: 99%
“…This research is a continuation of previous studies [27,28] and is aimed at studying the capabilities of Newton's method and the relaxation subgradient method with optimization of the parameters of the rank-two correction of metric matrices [27] to eliminate the linear background that worsens convergence under the conditions of the existence of a transformation V with the properties noted above. Similar studies for quasi-Newton methods were carried out in [29].…”
Section: Introduction
confidence: 97%
“…The quantity M_G determines the complexity of solving system (26). The transformation parameters of the metric matrices of the subgradient method are found according to expression (28).…”
Section: Let Us Introduce the Relation θ(M) and Its Inverse Function …
confidence: 99%
“…The principle of organization of a number of RSMs [70] is that, in a particular RSM, there is an independent algorithm for finding the descent direction, which makes it possible to go beyond some neighborhood of the current minimum. In [70,71], the problem of finding the descent direction in RSMs was formulated as the problem of solving systems of inequalities on separable sets.…”
Section: Introduction
confidence: 99%
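The inequality-system formulation mentioned in the last excerpt can be stated compactly. The following is a sketch under the usual conventions of the RSMM literature; the set $G$ and the strict inequality are inferred from the excerpt, not quoted from the paper:

\[
\text{find } s \in \mathbb{R}^n \ \text{such that}\ \langle s, g \rangle > 0 \quad \forall g \in G,
\]

where $G$ collects the subgradients observed over a neighborhood of the current minimum. If $0 \notin \operatorname{conv}(G)$, the origin and $\operatorname{conv}(G)$ are separable and such an $s$ exists; a solution then yields the descent direction $-s$, which forms an obtuse angle with every $g \in G$ and so allows the iterate to leave the neighborhood.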