Applications and Science in Soft Computing 2004
DOI: 10.1007/978-3-540-45240-9_1

A Recurrent Self-Organizing Map for Temporal Sequence Processing

Cited by 10 publications (8 citation statements)
References 1 publication
“…Some other works [3] focused on setting up an on-map representations to inputs, and used the difference in representations on the map to compare input temporal sequences. Reservoir computing approach [4] aims basically to set up a mapping between the input phase space and the reservoir state space.…”

Section: Introduction
Confidence: 99%
“…In contrast with the majority of supervised connectionist models in the literature, STORM is based on an unsupervised recurrent SOM [15] and operates using a discrete state-space.…”

Section: Discussion
Confidence: 99%
“…STORM's predictions are made by utilizing the locational representational values used in its context vector. As further explained in [15], the winning neuron for an input is the neuron whose weight vector best matches both the input symbol and the context representation of the last winning neuron's location. STORM predicts the next symbol by finding the neuron whose context representation best matches that of the current winning neuron (i.e.…”

Section: Methods
Confidence: 99%
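The citation statement above describes the winner-selection and prediction mechanism of a recurrent SOM: the winner is chosen by jointly matching the input symbol and the previous winner's map location, and the next symbol is predicted from the neuron whose context weights best match the current winner's location. A minimal sketch of that idea follows; the map size, the `alpha` mixing weight, and the weight initialisation are illustrative assumptions, not the parameters of STORM or of [15].

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 16   # assumed 4x4 map, flattened
input_dim = 4    # one-hot symbol vectors
alpha = 0.5      # assumed weighting between input match and context match

# each neuron holds an input weight vector and a context weight vector
# (its expectation of the previous winner's map location)
W_in = rng.random((n_neurons, input_dim))
W_ctx = rng.random((n_neurons, 2))

# fixed grid coordinates, normalised to [0, 1]: the "locational
# representation" used as context
locs = np.array([(i // 4, i % 4) for i in range(n_neurons)], dtype=float) / 3.0

def winner(x, prev_loc):
    # winner = neuron best matching BOTH the input symbol and the
    # location of the previous winner
    d = alpha * np.linalg.norm(W_in - x, axis=1) \
        + (1.0 - alpha) * np.linalg.norm(W_ctx - prev_loc, axis=1)
    return int(np.argmin(d))

def predict_next(cur_winner):
    # prediction = neuron whose context weights best match the current
    # winner's location; its input weights indicate the predicted symbol
    d = np.linalg.norm(W_ctx - locs[cur_winner], axis=1)
    nxt = int(np.argmin(d))
    return int(np.argmax(W_in[nxt]))

w = winner(np.eye(input_dim)[0], np.zeros(2))
p = predict_next(w)
```

Training (moving the weight vectors toward the winning input/context pairs) is omitted; the sketch only shows the two lookups the quoted passage describes.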