2020
DOI: 10.1016/j.neucom.2020.07.034
Optimizing echo state network through a novel fisher maximization based stochastic gradient descent

Cited by 14 publications (2 citation statements, both classified as mentioning) · References 34 publications
“…Similar to the simplest artificial neural network, traditional reservoir computing comprises three basic parts [50]: the input layer, the intermediate layer (reservoir), and the output layer, as illustrated in Fig. 2a.…”
Section: Methods (mentioning)
Confidence: 99%
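The three-part architecture this excerpt describes translates almost line for line into code. Below is a minimal NumPy sketch; the layer sizes, spectral-radius target, ridge coefficient, and toy task are illustrative assumptions, not values from the cited papers. The key point it shows is that the input and reservoir weights stay fixed, and only the output layer is trained.

```python
import numpy as np

# Minimal echo state network sketch matching the three-part description
# above. All sizes, the spectral-radius target, and the ridge coefficient
# are illustrative assumptions, not values from the cited papers.

rng = np.random.default_rng(0)
n_inputs, n_reservoir = 1, 200

# Input layer: fixed random input weights (never trained).
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))

# Intermediate layer (reservoir): fixed random recurrent weights, rescaled
# so the spectral radius is below 1 -- a common way to obtain the echo
# state property.
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the reservoir and collect states x(t+1) = tanh(W_in u(t) + W x(t))."""
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Output layer: the only trained part, fitted here by ridge regression
# on a toy next-step prediction task.
u = np.sin(0.1 * np.arange(1000))
X, y = run_reservoir(u[:-1]), u[1:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_reservoir), X.T @ y)

print("train MSE:", np.mean((X @ W_out - y) ** 2))
```

Training only the linear readout is what keeps reservoir computing cheap relative to full backpropagation through the recurrent weights; the paper under review targets exactly this readout-optimization step.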
“…However, the second stage is optimization of the nonlinear functional. To find the optimal solution, we use gradient descent [27,28] and penalty functions [29-31]. Problem (10), (18), (22), (23), (28), (29) then takes the form (18), (22), (23), (30)-(34), (36) with the iteration rule […]. The stopping criterion is reaching a value of k for which the inequality […] holds. Let us consider formulas (30)-(35) in more detail.…”
Section: Stage (mentioning)
Confidence: 99%
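Even with its formulas elided, the excerpt outlines a familiar pattern: fold the constraints into the nonlinear functional with a penalty term, iterate plain gradient descent, and stop once the change between iterates satisfies an inequality. A minimal sketch of that pattern follows; the toy objective, the constraint g(x) = 0, and the values of mu, eta, and eps are all assumptions for illustration, since the excerpt's formulas (30)-(36) are not reproduced.

```python
import numpy as np

# Hedged sketch of the pattern the excerpt describes: a constrained
# nonlinear problem is folded into an unconstrained one via a quadratic
# penalty, minimized by gradient descent, and stopped by an inequality
# on the step size. Objective, constraint, mu, eta, eps are assumptions.

def objective(x):
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2  # toy nonlinear functional

def g(x):
    return x[0] + x[1] - 2.0                      # constraint g(x) = 0

mu, eta, eps = 10.0, 0.01, 1e-8                   # penalty weight, step size, tolerance

def penalized(x):
    return objective(x) + mu * g(x) ** 2

def grad(f, x, h=1e-6):
    """Central-difference gradient, so the sketch needs no autodiff."""
    out = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        out[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return out

x = np.zeros(2)
for k in range(10_000):
    x_next = x - eta * grad(penalized, x)  # iteration rule x_{k+1} = x_k - eta * grad F(x_k)
    if np.linalg.norm(x_next - x) < eps:   # stopping inequality ||x_{k+1} - x_k|| < eps
        x = x_next
        break
    x = x_next

print(f"stopped at k={k}, x={x}, g(x)={g(x):.4f}")
```

With a quadratic penalty the constraint is only satisfied approximately (here g(x) ends near -0.05 rather than 0); increasing mu tightens it, at the cost of a stiffer problem that forces a smaller step size eta.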