2020
DOI: 10.1016/j.neucom.2020.03.024
Prevention of catastrophic interference and imposing active forgetting with generative methods

Cited by 13 publications (6 citation statements)
References 66 publications
“…If we compare the local learning of the radial basis function network with the global learning technique of the feed-forward neural network, the latter suffers from catastrophic forgetting. Kirkpatrick et al (2017) and Sukhov et al (2020) look at ways of mitigating this issue, specifically at training networks that can maintain expertise on tasks they have not experienced for a long time. The radial basis function network that we formulate is naturally designed to measure the similarity between test samples and continuously updated prototypes that capture the characteristics of the feature space.…”
Section: Discussion (mentioning)
confidence: 99%
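To make the statement's locality argument concrete, the following is a minimal sketch of the prototype-similarity idea it describes: Gaussian similarity of a test sample to per-class prototypes, with a running-mean update that moves only the matching prototype. The Gaussian kernel, the update rule, and the names rbf_similarity and update_prototype are illustrative assumptions, not the cited papers' formulation.

```python
import numpy as np

# Sketch of the local-learning idea quoted above: an RBF layer scores a
# test sample by its Gaussian similarity to class prototypes, and each
# prototype is refreshed with a running mean. All names and the update
# rule are illustrative assumptions, not the citing paper's method.

def rbf_similarity(x, prototypes, gamma=1.0):
    """Gaussian similarity of sample x to each prototype (one per class)."""
    dists = np.linalg.norm(prototypes - x, axis=1)  # Euclidean distances
    return np.exp(-gamma * dists**2)                # higher = more similar

def update_prototype(prototypes, counts, x, label):
    """Running-mean update: only the matching prototype moves, so prototypes
    of earlier classes stay untouched (the locality contrasted above with
    global feed-forward training)."""
    counts[label] += 1
    prototypes[label] += (x - prototypes[label]) / counts[label]

# Toy usage: two 2-D classes, classify by the most similar prototype.
prototypes = np.array([[0.0, 0.0], [5.0, 5.0]])
counts = np.array([1, 1])
update_prototype(prototypes, counts, np.array([0.2, -0.1]), label=0)
print(rbf_similarity(np.array([0.1, 0.1]), prototypes).argmax())  # -> 0
```

Because each update moves a single prototype, learning a new class cannot overwrite what was stored for the others, which is the property the statement contrasts with catastrophic forgetting in globally trained networks.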
“…However, with increasing runtime of the framework, the amount of data collected from the use case grows. This leads to large data sets that do not necessarily contribute to good performance of the overall system, as the information may become outdated [60], [61]. Hence, it is useful to develop a strategy for discarding or aggregating the increasing amount of data.…”
Section: Discussion (mentioning)
confidence: 99%
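One hedged sketch of such a discard-or-aggregate strategy, assuming a simple sliding window: recent samples are kept raw, while evicted samples are folded into a running mean instead of being stored. The class AggregatingBuffer and its update rule are hypothetical illustrations, not taken from [60], [61] or the cited framework.

```python
import numpy as np
from collections import deque

# Illustrative sketch (an assumption, not the cited works' strategy):
# keep a fixed-size window of recent raw samples and aggregate evicted
# samples into summary statistics, so old data is summarized, not retained.

class AggregatingBuffer:
    def __init__(self, capacity=1000):
        self.window = deque(maxlen=capacity)  # recent raw samples
        self.agg_count = 0                    # how many samples were folded in
        self.agg_mean = None                  # running mean of evicted samples

    def add(self, x):
        if len(self.window) == self.window.maxlen:
            old = self.window[0]              # about to be evicted by append
            self.agg_count += 1
            if self.agg_mean is None:
                self.agg_mean = np.array(old, dtype=float)
            else:
                self.agg_mean += (old - self.agg_mean) / self.agg_count
        self.window.append(x)

# Toy usage: capacity 2, so the third sample evicts and aggregates the first.
buf = AggregatingBuffer(capacity=2)
for x in [np.array([1.0]), np.array([2.0]), np.array([3.0])]:
    buf.add(x)
print(len(buf.window), buf.agg_mean)  # -> 2 [1.]
```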