2020
DOI: 10.1007/s00521-020-04718-9

A Randomized Block-Coordinate Adam online learning optimization algorithm

Citation Types: 0 supporting, 12 mentioning, 0 contrasting
Cited by 27 publications (12 citation statements)
References 22 publications

“…The loss function is the cross-entropy loss function, and the Adam optimizer [ 44 ] is used. In the training set, there were approximately 20 times more moso bamboo OG than moso bamboo non-OG, which led to an imbalance problem during training.…”
Section: Methods
confidence: 99%
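
The roughly 20:1 class imbalance noted in that excerpt is commonly handled by weighting the cross-entropy loss. A minimal PyTorch sketch of that pattern, assuming a generic two-class classifier; the model, feature size, and the 20x weight are illustrative, not taken from the cited paper:

import torch
import torch.nn as nn

# Hypothetical two-class setup: the majority class outnumbers the
# minority class ~20:1, so the minority class gets a ~20x loss weight.
class_weights = torch.tensor([1.0, 20.0])

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 2))
criterion = nn.CrossEntropyLoss(weight=class_weights)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative training step on random stand-in data.
x = torch.randn(32, 128)         # batch of 32 feature vectors
y = torch.randint(0, 2, (32,))   # integer class labels
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()

Other standard remedies for the same problem include oversampling the minority class or undersampling the majority class at the data-loading stage.
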
“…In this learning model, the Adam optimizer is chosen [15,16]. The training batch size was set to 30.…”
Section: Training
confidence: 99%
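
In PyTorch-style training code, a batch size like the 30 quoted above is typically fixed at the data-loading stage. A minimal sketch under that assumption; the dataset shape and model are placeholders, not details from the cited work:

import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder dataset: 300 samples with 10 features each, 2 classes.
data = TensorDataset(torch.randn(300, 10), torch.randint(0, 2, (300,)))
loader = DataLoader(data, batch_size=30, shuffle=True)  # batch size 30

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.Adam(model.parameters())

for x, y in loader:  # one epoch of Adam updates, 30 samples per step
    optimizer.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(x), y)
    loss.backward()
    optimizer.step()
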
“…Finally, we use an adaptive moment estimation (Adam) algorithm [30] to optimize the loss function L(·). It should be noted that the estimated noise level σ̂ is related to the predicted value ŷ_i.…”
Section: A Multi-scale Densely Connected Noise Estimation Module
confidence: 99%
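
For context, adaptive moment estimation maintains exponential moving averages of the gradient and of its elementwise square, with bias correction. A hand-rolled sketch of the standard Adam step; the hyperparameter values are the usual defaults from Kingma and Ba, not values reported in [30]:

import numpy as np

def adam_step(theta, grad, m, v, t,
              lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One standard Adam update on parameters theta given gradient grad."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

Starting from m = v = 0 at t = 1 and incrementing t each call, repeated steps adapt the effective learning rate per coordinate, which is what distinguishes Adam from plain stochastic gradient descent.
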