IJCNN-91-Seattle International Joint Conference on Neural Networks
DOI: 10.1109/ijcnn.1991.155319
Continuous-time temporal back-propagation

Abstract: The back-propagation training technique for feed-forward neural networks is extended to networks with delay elements in the connections. Such an extension has previously been described for discrete-time presentation of training patterns and weight updates [1]. Here we present a continuous-time generalization of these results. Such networks can be used for temporal and spatio-temporal pattern recognition.
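To make the idea concrete, here is a minimal sketch of temporal back-propagation for a single neuron whose input connections carry fixed delays. This is a discrete-time illustration (the paper's contribution is the continuous-time generalization, which this does not reproduce); all names, the delay values, and the toy task are our own assumptions, not taken from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, w, delays):
    """y[t] = sigmoid(sum_k w[k] * x[t - delays[k]]); the input line is
    assumed quiescent (zero) before t = 0."""
    T = len(x)
    y = np.zeros(T)
    for t in range(T):
        net = sum(wk * x[t - d] for wk, d in zip(w, delays) if t - d >= 0)
        y[t] = sigmoid(net)
    return y

def gradient(x, y, target, delays):
    """dE/dw[k] for E = 0.5 * sum_t (y[t] - target[t])**2, obtained by
    back-propagating the error through the sigmoid and each delay tap."""
    g = np.zeros(len(delays))
    for t in range(len(x)):
        delta = (y[t] - target[t]) * y[t] * (1.0 - y[t])
        for k, d in enumerate(delays):
            if t - d >= 0:
                g[k] += delta * x[t - d]
    return g

# Toy task: recover a network that responds to the input delayed by one step.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
delays = [0, 1, 2]
true_w = np.array([0.0, 1.5, 0.0])          # target weights: one tap at delay 1
target = forward(x, true_w, delays)

w = np.zeros(3)
losses = []
for _ in range(200):
    y = forward(x, w, delays)
    losses.append(0.5 * np.sum((y - target) ** 2))
    w -= 0.02 * gradient(x, y, target, delays)
```

Because the delays are fixed, gradient descent only adjusts the tap weights; the error at each time step is distributed back to whichever delayed input samples contributed to it.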

Cited by 5 publications (1 citation statement)
References 6 publications
“…Similarly to time-delay neural networks (Day & Davenport, 1993; Lin et al., 1993), each connection in the DRAMA network has two parameters associated with it instead of one: a time parameter (tp) and a confidence factor (cf) (see Figure 4 left). Time parameters and confidence factors are positive numbers (real numbers in the simulation and integers in the physical implementations).…”
Section: Associative Module (DRAMA)
Confidence: 99%
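The quoted excerpt specifies only that each DRAMA connection carries a time parameter and a confidence factor, both positive. A minimal sketch of such a connection record, using the excerpt's abbreviations `tp` and `cf` as field names (everything else here is an illustrative assumption, not the DRAMA implementation):

```python
from dataclasses import dataclass

@dataclass
class DramaConnection:
    """One connection in a DRAMA-style associative network (hypothetical
    structure; only the two fields and their positivity come from the
    quoted excerpt)."""
    tp: float  # time parameter (real in simulation, integer in hardware)
    cf: float  # confidence factor

    def __post_init__(self):
        # The excerpt states both parameters are positive numbers.
        if self.tp <= 0 or self.cf <= 0:
            raise ValueError("tp and cf must be positive")

conn = DramaConnection(tp=3.0, cf=0.8)
```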