The 2011 International Joint Conference on Neural Networks
DOI: 10.1109/ijcnn.2011.6033567

PCA and Gaussian noise in MLP neural network training improve generalization in problems with small and unbalanced data sets

Abstract: Machine learning approaches have been successfully applied for automatic decision support in several domains. The quality of these systems, however, degrades severely in classification problems with small and unbalanced data sets for knowledge acquisition. Inherent to several real-world problems, data sets with these characteristics are the reality to be tackled by learning algorithms, but the small amount of data affects the classifiers' generalization power while the imbalance in class distribution makes the …
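The abstract describes combining PCA with Gaussian-noise data augmentation when training an MLP on a small, unbalanced data set. A minimal sketch of that general idea is below; it is an illustration, not the authors' exact pipeline, and the component count, noise scale, and per-class copy counts are all assumed for the example.

```python
# Sketch: PCA dimensionality reduction followed by Gaussian-noise
# augmentation of a small, unbalanced two-class data set.
# All hyperparameters here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Small, unbalanced synthetic data: 40 majority vs. 8 minority samples.
X = np.vstack([rng.normal(0, 1, (40, 10)), rng.normal(2, 1, (8, 10))])
y = np.array([0] * 40 + [1] * 8)

# PCA via SVD on centered data: keep the top-k principal components.
k = 3
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
X_pca = Xc @ Vt[:k].T                     # shape (48, 3)

def augment(X, y, copies_per_class, sigma=0.1):
    """Append jittered copies of each class; oversample the minority
    class more heavily to ease the imbalance."""
    Xs, ys = [X], [y]
    for cls, copies in copies_per_class.items():
        Xcls = X[y == cls]
        for _ in range(copies):
            Xs.append(Xcls + rng.normal(0, sigma, Xcls.shape))
            ys.append(np.full(len(Xcls), cls))
    return np.vstack(Xs), np.concatenate(ys)

# One noisy copy of the majority class, five of the minority class.
X_aug, y_aug = augment(X_pca, y, copies_per_class={0: 1, 1: 5})
print(X_aug.shape)   # enlarged, better-balanced training set
```

The augmented set would then be fed to an MLP trainer; the key point is that the jittered copies enlarge the effective training set without collecting new data.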

Cited by 21 publications (8 citation statements)
References 16 publications
“…In order to prevent overfitting, a sufficient amount of training data is essential. Since training data were limited due to experimental restrictions, Gaussian noise exertion on real data [25] and the k-fold cross-validation method [26] were applied. Finally, the Gaussian noise method was selected according to the training error.…”
Section: Theoretical Basis (mentioning)
Confidence: 99%
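The excerpt above compares candidate regularization strategies, one of them k-fold cross-validation. A minimal k-fold evaluation loop can be sketched as follows; the model is a least-squares stand-in, not the cited work's network, and all names and sizes are illustrative.

```python
# Sketch: k-fold cross-validation in plain NumPy.
# The least-squares model is a stand-in for whatever learner is
# being evaluated; data and fold count are illustrative.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 4))
w_true = np.array([1.0, -2.0, 0.5, 0.0])
y = X @ w_true + rng.normal(0, 0.1, 30)

def kfold_error(X, y, k=5):
    """Mean held-out squared error over k folds."""
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        w, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        errs.append(np.mean((X[test] @ w - y[test]) ** 2))
    return float(np.mean(errs))

print(round(kfold_error(X, y), 4))
```

In the cited setting, such a held-out error (or, as the excerpt notes, the training error) would be the criterion for choosing between the k-fold scheme and Gaussian-noise augmentation.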
“…It has been shown that an MLP network with one hidden layer and the sigmoid function is a universal approximator of any continuous function on a compact set. MLP neural networks can act as any nonlinear classifier or as a regression function [12], [13] and [14].…”
Section: Multilayer Perceptron (MLP) Network (unclassified)
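The excerpt's claim, that a one-hidden-layer sigmoid MLP is a nonlinear classifier, can be illustrated on XOR, a problem no linear classifier solves. This is a toy NumPy sketch; the layer width, learning rate, and iteration count are arbitrary choices for the example.

```python
# Sketch: one-hidden-layer MLP with sigmoid activations fitting XOR,
# a nonlinearly separable problem. Hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(42)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])      # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 sigmoid units, sigmoid output unit.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

lr = 1.0
for _ in range(10000):
    h = sigmoid(X @ W1 + b1)                    # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)         # backprop, squared error
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
loss = float(np.mean((out - y) ** 2))
preds = (out > 0.5).astype(int).ravel()
print(preds, round(loss, 4))
```

A perceptron (no hidden layer) cannot drive this loss to zero on XOR; the hidden sigmoid layer is what supplies the nonlinear decision boundary.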
“…BPNNs in remotely sensed image classification applications have been widely reviewed [41][42][43]. They have also been reported as very effective for noise reduction [44,45] and robust to noise when trained with noisy data [46]. These researchers have mutually concluded that the BPNN approach is feasible for the classification of remote sensing imagery.…”
Section: Introduction (mentioning)
Confidence: 99%