2023
DOI: 10.1109/tcyb.2022.3155901
Growing Echo State Network With an Inverse-Free Weight Update Strategy

Cited by 16 publications (1 citation statement)
References 47 publications
“…Li et al. [12] designed reservoirs with predefined sparsity and singular values. Chen et al. [13] constructed an incremental inverse-free ESN that obtains weights from the previous optimal reservoir state and the freshly added reservoir neurons. To update the input weights of the simple cycle reservoir network (SCRN), Wang et al. [14] injected the pre-trained output weight matrix into the input weight matrix, thus enabling modification of the network weights.…”
Section: Introduction (mentioning, confidence: 99%)
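The incremental inverse-free idea described above — updating the readout when a reservoir neuron is added, without recomputing a matrix inverse from scratch — can be illustrated with Greville's recursive pseudoinverse formula. This is a minimal sketch of that classical technique, not the exact algorithm of Chen et al. [13]; the state matrix `X`, teacher signal `y`, and all variable names here are illustrative assumptions.

```python
import numpy as np

def greville_add_column(A_pinv, A, c, tol=1e-10):
    """Update pinv(A) -> pinv([A, c]) when one column (new neuron's
    state trajectory) is appended, using Greville's recursion.
    No matrix inverse is computed at any step."""
    d = A_pinv @ c                 # coefficients of c in col-space of A
    e = c - A @ d                  # residual of c outside col-space of A
    if np.linalg.norm(e) > tol:
        b = e / (e @ e)            # pseudoinverse of the residual vector
    else:                          # c linearly dependent on existing columns
        b = (d @ A_pinv) / (1.0 + d @ d)
    return np.vstack([A_pinv - np.outer(d, b), b])

# demo: grow a reservoir state matrix one neuron (column) at a time
rng = np.random.default_rng(0)
T, n = 50, 6                             # time steps, reservoir neurons
X = np.tanh(rng.normal(size=(T, n)))     # stand-in for collected states
y = rng.normal(size=T)                   # stand-in teacher signal

c0 = X[:, 0]
P = (c0 / (c0 @ c0)).reshape(1, -1)      # pinv of a single starting column
for j in range(1, n):                    # add remaining neurons incrementally
    P = greville_add_column(P, X[:, :j], X[:, j])

W_out = P @ y                            # readout weights, inverse-free
```

Each growth step costs only matrix-vector products in the existing pseudoinverse, which is the practical appeal of inverse-free incremental training: the readout stays optimal after every added neuron without refactoring the full state matrix.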