2022
DOI: 10.1109/tnnls.2020.3041755
Robust Stochastic Gradient Descent With Student-t Distribution Based First-Order Momentum

Cited by 46 publications (32 citation statements)
References 24 publications (34 reference statements)
“…For example, GD [19], SGD [16], RMSProp [36], and Adam [18] are typical first-order methods. Furthermore, Yogi [21], Fromage [22], diffGrad [23], and TAdam [24] were also designed based on first-order methods.…”
Section: Overview of Optimization Methods for Machine Learning
Citation type: mentioning
confidence: 99%
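For orientation, the methods listed in the quote above all share the first-order template: parameters are updated from gradient statistics alone, with no Hessian information. Below is a minimal NumPy sketch of one Adam step as a representative example; the function name and signature are illustrative, not taken from any of the cited papers.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m)
    and of its square (v), bias-corrected for their zero initialization."""
    m = beta1 * m + (1 - beta1) * grad        # first-order momentum
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-order (scale) estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction, step count t >= 1
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

RMSProp drops the first momentum m, while the Yogi, diffGrad, and TAdam variants cited above change how m or v is accumulated but keep this overall step structure.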
“…In fact, there exist many methods to implement Equation (11). Among them, we compute Equation (11) according to the method shown in [24] as…”
Section: Adaptive Coefficient Computation Methods for the Robust First Momentum
Citation type: mentioning
confidence: 99%
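The quoted snippet truncates before the equation itself, so the citing paper's Equation (11) is not reproduced here. For context, the method of [24] (the paper under discussion) replaces Adam's exponentially averaged first momentum with a Student-t based weighted incremental mean, so that gradients far from the current momentum estimate receive a small weight. A minimal sketch of that idea follows; the names `W`, `w`, and `nu` (degrees of freedom) are our notation, and the decay rule for `W` is our reading of [24], not the citing paper's exact formula.

```python
import numpy as np

def tadam_momentum(m, v, W, grad, beta1=0.9, nu=1.0, eps=1e-8):
    """Student-t weighted first momentum in the spirit of TAdam [24].

    A gradient far from m (relative to the scale estimate v) gets a small
    Student-t weight w and therefore barely moves the momentum estimate.
    """
    d = grad.size
    # Squared distance of the incoming gradient from the momentum,
    # normalized elementwise by the second-moment estimate v.
    dist = np.sum((grad - m) ** 2 / (v + eps))
    w = (nu + d) / (nu + dist)                 # Student-t robustness weight
    m = (W * m + w * grad) / (W + w)           # weighted incremental mean
    W = (2.0 * beta1 - 1.0) / beta1 * W + w    # decay chosen so the effective
                                               # averaging rate matches beta1
    return m, W
```

Initializing W at beta1 / (1 - beta1) makes the update coincide with Adam's exponential average when the weights w sit at their nominal value; heavy-tailed gradient noise pushes w down, which is the robustness the article's title refers to.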