1996
DOI: 10.1109/72.548174
Incremental learning of complex temporal patterns

Abstract: A neural model for temporal pattern generation is used and analyzed for training with multiple complex sequences in a sequential manner. The network exhibits some degree of interference when new sequences are acquired. It is proven that the model is capable of incrementally learning a finite number of complex sequences. The model is then evaluated with a large set of highly correlated sequences. While the number of intact sequences increases linearly with the number of previously acquired sequences, the amount…
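To make the evaluation protocol concrete, the sketch below runs sequential (incremental) training and, after each newly acquired sequence, counts how many earlier sequences remain intact. It uses a generic table-based next-symbol predictor as a stand-in for the paper's neural model; every name in it (SequencePredictor, count_intact, the toy sequences) is illustrative, not from the paper.

# Sketch: sequential training with an interference check after each
# newly acquired sequence. Generic stand-in for the paper's neural model.

class SequencePredictor:
    def __init__(self, context_len=2):
        self.context_len = context_len
        self.table = {}                       # context tuple -> next symbol

    def train(self, seq):
        # Acquire one sequence; later sequences may overwrite shared
        # contexts, which is the source of interference.
        for i in range(self.context_len, len(seq)):
            self.table[tuple(seq[i - self.context_len:i])] = seq[i]

    def generates(self, seq):
        # A sequence is "intact" if every context still predicts the
        # correct next symbol.
        return all(
            self.table.get(tuple(seq[i - self.context_len:i])) == seq[i]
            for i in range(self.context_len, len(seq))
        )

def count_intact(model, learned):
    return sum(model.generates(s) for s in learned)

# Correlated sequences (shared subsequences) maximize interference.
model, learned = SequencePredictor(), []
for s in ["ABCDE", "XBCDY", "PQRST"]:
    model.train(s)
    learned.append(s)
    print(f"after {s!r}: {count_intact(model, learned)}/{len(learned)} intact")

Run as-is, the second sequence overwrites a context shared with the first, so the intact count drops to 1/2 before the unrelated third sequence brings it to 2/3, mirroring the interference behavior the abstract describes.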

Cited by 50 publications (9 citation statements) · References 34 publications

Citation statements (ordered by relevance):
“…Such a goal might be possible by utilizing a hierarchy while learning patterns, as was done by George et al. [22] and Serre et al. [23] to successfully learn to classify sets of patterns with considerable variance. In addition to spatial patterns, temporal patterns can be learned as well, as Wang et al. did in learning a finite number of sequences [24]. Temporal and abstract patterns would prove useful in a lifelong learning framework, since such generalizations would make patterns more broadly applicable and would allow the transfer of knowledge among a larger set of tasks.…”
Section: Future Work
confidence: 98%
“…These studies were mainly concerned with the learning and storage of a single complex sequence. Wang and Yuwono (1996) extended the anticipation mechanism to allow learning of multiple complex sequences presented to the model in a sequential manner.…”
Section: Conclusion and Further Work
confidence: 99%
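The anticipation mechanism mentioned here can be caricatured outside the neural setting: the learner anticipates the next symbol from its current context and, wherever anticipation fails, lengthens the context used at that position until the ambiguity is resolved. The sketch below implements only that expand-on-failure idea; it is not Wang and Yuwono's network, and all names in it are illustrative.

# Sketch of an anticipation-style sequence learner: predict the next
# symbol from a variable-length context, and grow the context at any
# position where the prediction (anticipation) fails.

def learn_sequence(seq, table, max_passes=10):
    """table maps a context tuple (variable length) to a next symbol."""
    ctx_len = {i: 1 for i in range(1, len(seq))}   # per-position context size
    for _ in range(max_passes):
        failures = 0
        for i in range(1, len(seq)):
            ctx = tuple(seq[i - ctx_len[i]:i])
            if table.get(ctx, seq[i]) != seq[i]:
                # Anticipation failure: this position needs a longer
                # context. Do not overwrite, so earlier sequences survive.
                ctx_len[i] = min(ctx_len[i] + 1, i)
                failures += 1
            else:
                table[ctx] = seq[i]
        if failures == 0:
            break
    return table

table = {}
for s in ["ABCABD", "ABCABE"]:    # complex, highly correlated sequences
    learn_sequence(s, table)      # second sequence reuses shared contexts

Because a mismatch grows the context instead of overwriting the stored prediction, the second sequence here is acquired without destroying the first, which is the incremental property the quoted passage attributes to the extended model.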
“…In fact, incremental training is widely used in practice, either in lieu of or in combination with batch training, because it can handle large training sets that are otherwise prohibitive even for moderate-size networks [23]–[27]. Moreover, incremental training makes it possible to train neural networks online, assimilating training samples that become available only one at a time.…”
Section: Background on Supervised Neural Network Training
confidence: 99%
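As a concrete illustration of the regime the excerpt describes, the following contrasts incremental (online) updates, which assimilate one sample as it arrives, with a single batch fit over the full set. It is a sketch of the training schedule only, on an invented linear least-squares problem; no specific network from the cited works is implied.

# Sketch: incremental (online) training vs. batch training on a linear
# least-squares problem. Data and names are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                       # incoming sample stream
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.01 * rng.normal(size=1000)

# Incremental: update the weights as each sample arrives (online SGD).
w, lr = np.zeros(5), 0.05
for x_t, y_t in zip(X, y):                           # one sample at a time
    w -= lr * (x_t @ w - y_t) * x_t                  # gradient of 0.5*err^2

# Batch: one fit over the whole training set, for comparison.
w_batch, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(w, 2), np.round(w_batch, 2))          # both near the true weights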
“…A well-known result in control theory states that when the cost function is quadratic and the dynamic equation is linear, the optimal control law is the linear state feedback u = -R^{-1}B^T P x (Eq. (23)), where the system's input matrix B, the weighting matrices Q and R, and the Riccati matrix P, defined according to [28], are all known. Using (23), it is possible to obtain a set of longitudinal and lateral control gains that are locally optimal near each design point and are scheduled by the operating condition [16]. This set is used to form the LTM training set in Section V-D.…”
Section: Classical Gain-Scheduled Flight Controllers (or LTM)
confidence: 99%
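For reference, the gain in a law of that form follows from the algebraic Riccati equation, and a sketch of the computation is below. The system matrices and weights are invented for illustration; scipy.linalg.solve_continuous_are is the only real API relied on, and K = R^{-1} B^T P is the standard continuous-time LQR gain, not necessarily the exact notation of the quoted paper.

# Sketch: LQR state-feedback gain u = -K x with K = R^{-1} B^T P, where P
# solves the continuous-time algebraic Riccati equation. Illustrative system.

import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])            # state (system) matrix, invented
B = np.array([[0.0],
              [1.0]])                   # input matrix, invented
Q = np.diag([10.0, 1.0])                # state weighting
R = np.array([[0.1]])                   # control weighting

P = solve_continuous_are(A, B, Q, R)    # Riccati matrix
K = np.linalg.solve(R, B.T @ P)         # K = R^{-1} B^T P

# Gain scheduling, as in the excerpt, would repeat this at many design
# points across the flight envelope and interpolate K between them.
print(K)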