DOI: 10.29007/8559

Implementation of Incremental Learning in Artificial Neural Networks

Abstract: Nowadays, the use of artificial neural networks (ANN), in particular the Multilayer Perceptron (MLP), is very popular for executing different tasks such as pattern recognition, data mining, and process automation. However, there are still weaknesses in these models when compared with human capabilities. A characteristic of human memory is the ability to learn new concepts without forgetting what we learned in the past, which has been a disadvantage in the field of artificial neural networks. How can we add…

Cited by 7 publications (6 citation statements)
References 11 publications (12 reference statements)
“…As the number of classes grows, this requirement becomes infeasible. Incremental learning is a learning method where the learning process takes place when new classes emerge and the existing knowledge is adjusted based on what has been newly learnt (Andrade et al, 2017). The entire training set is not needed before the learning process, and the new classes can appear over time.…”
Section: Need for Incremental Learning (mentioning)
confidence: 99%
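The incremental setting described in the statement above, where new classes emerge over time and the existing model is adjusted rather than retrained from scratch, can be illustrated with a minimal sketch. This is an illustrative toy, not the method of Andrade et al. (2017): a nearest-class-mean classifier that keeps one running prototype per class and never revisits earlier training data.

```python
# Illustrative sketch of class-incremental learning (not the cited
# paper's method): a nearest-class-mean classifier keeps one running
# mean per class, so new classes can be added at any time without
# access to earlier training batches.
import numpy as np

class IncrementalNCM:
    def __init__(self):
        self.means = {}   # class label -> running mean vector
        self.counts = {}  # class label -> number of samples seen

    def partial_fit(self, X, y):
        for x, label in zip(X, y):
            if label not in self.means:
                # A new class emerges: allocate its prototype.
                self.means[label] = np.zeros_like(x, dtype=float)
                self.counts[label] = 0
            self.counts[label] += 1
            # Incremental mean update; old samples are never stored.
            self.means[label] += (x - self.means[label]) / self.counts[label]

    def predict(self, X):
        labels = list(self.means)
        protos = np.stack([self.means[c] for c in labels])
        # Distance from every query point to every class prototype.
        dists = np.linalg.norm(X[:, None, :] - protos[None, :, :], axis=2)
        return [labels[i] for i in dists.argmin(axis=1)]

clf = IncrementalNCM()
clf.partial_fit(np.array([[0.0, 0.0], [0.2, 0.1]]), [0, 0])
clf.partial_fit(np.array([[5.0, 5.0]]), [1])    # class 1 appears later
clf.partial_fit(np.array([[-5.0, 4.0]]), [2])   # class 2 appears later
print(clf.predict(np.array([[4.8, 5.1], [0.1, 0.0]])))  # -> [1, 0]
```

The key property matching the quoted description: the entire training set is not needed before learning starts, and knowledge for existing classes (their prototypes) is preserved when a new class is added.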
“…adjustment. Andrade et al (2017), Castro et al (2018), and Rusu et al (2016) investigate adapting neural networks to new data with additional classes or even new tasks, which requires changing the structure of the neural network. Our setting is less complex as the neural network is trained on all possible classes from the beginning.…”
Section: Continuous Model Adjustment (mentioning)
confidence: 99%
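A hedged sketch of the structural change this statement refers to, assuming a plain MLP output layer (the variable names are illustrative, not taken from any of the cited papers): when a new class appears, the output layer gains one unit by appending a row to the output weight matrix, while all previously learned weights are preserved.

```python
# Hedged sketch: growing an MLP's output layer when a new class
# emerges. Existing weights are kept unchanged; only the new output
# unit's parameters are freshly initialized.
import numpy as np

rng = np.random.default_rng(0)
hidden_dim, n_classes = 8, 3  # assumed sizes for illustration
W_out = rng.normal(scale=0.1, size=(n_classes, hidden_dim))
b_out = np.zeros(n_classes)

def add_class(W, b, rng):
    """Append one output unit for a newly emerged class."""
    new_row = rng.normal(scale=0.1, size=(1, W.shape[1]))
    return np.vstack([W, new_row]), np.append(b, 0.0)

old_W = W_out.copy()
W_out, b_out = add_class(W_out, b_out, rng)
print(W_out.shape)  # -> (4, 8): one new output unit added
assert np.allclose(W_out[:3], old_W)  # old class weights preserved
```

This is the kind of structural adjustment the cited works must handle; the quoted paper's own setting avoids it by fixing the class set up front.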
“…The incremental learning capability of the ANN grants keeping the information acquired at the beginning of the search and enhances the surrogate quality over the new regions of interest [49].…”
Section: Surrogate Building Using MCDropout-based BNN (mentioning)
confidence: 99%