2008
DOI: 10.1007/978-3-540-88636-5_34
Feature Selection Using Artificial Neural Networks

Cited by 15 publications (9 citation statements)
References 7 publications
“…Both spatial and spectral resolutions of the sensors have increased; thus, the selection of the most appropriate data as inputs, known as feature selection, has become a more critical issue, particularly for neural networks. For this purpose, the pruning of neural networks has been suggested as an alternative to existing statistical methods [12][13][14][15].…”
Section: Introduction
confidence: 99%
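The pruning approach quoted above can be illustrated with a minimal magnitude-based sketch: rank each input feature by the total absolute weight it feeds into the hidden layer and discard the weakest inputs. This is a toy illustration only, not the specific pruning algorithms of references [12]–[15]; the weight matrix and the saliency measure (summed absolute input-to-hidden weights) are assumptions made for the example.

```python
import numpy as np

def input_saliency(w_in):
    """Sum of absolute input-to-hidden weights for each input feature."""
    return np.abs(w_in).sum(axis=1)

def prune_inputs(w_in, keep):
    """Indices of the `keep` most salient input features, best first."""
    return np.argsort(input_saliency(w_in))[::-1][:keep]

# Toy 4-feature x 3-hidden-unit weight matrix; feature 2 carries
# almost no weight, so pruning should discard it first.
w = np.array([[ 1.0,  -2.0,   0.5 ],
              [ 0.3,   0.8,  -1.1 ],
              [ 0.01,  0.01,  0.01],
              [-0.7,   0.4,   0.9 ]])
kept = prune_inputs(w, keep=3)
```

In practice this ranking would be recomputed after retraining the pruned network, since weights redistribute once an input is removed.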
“…When used with a large number of features and layers, these models are difficult to interpret as the estimated parameters of the model, known as weights, are not directly convertible to a meaningful measure of relevance. However, given our simple design, the MLP can be used effectively for feature selection (59). To determine the importance of each feature, the dependent resampled input method was used (60).…”
Section: Top Features
confidence: 99%
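The dependent resampled input method of reference (60) is not spelled out in the statement above. As a hedged sketch of the general resampling idea, the snippet below scores each feature by how much the model's mean squared error grows when that feature's column is shuffled across samples; this is plain permutation importance, used here as a stand-in assumption for the exact method.

```python
import numpy as np

def resampled_input_importance(predict, X, y, seed=0):
    """Score each feature by the rise in mean squared error after
    shuffling (resampling) that feature's column across samples."""
    rng = np.random.default_rng(seed)
    base = np.mean((predict(X) - y) ** 2)
    scores = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        scores[j] = np.mean((predict(Xp) - y) ** 2) - base
    return scores

# Toy target that depends only on feature 0; the "model" is exact,
# so only shuffling feature 0 should hurt its predictions.
X = np.random.default_rng(1).normal(size=(200, 3))
y = 2.0 * X[:, 0]
imp = resampled_input_importance(lambda X: 2.0 * X[:, 0], X, y)
```

Because the scoring only needs a `predict` callable, the same routine works for an MLP or any other black-box model.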
“…Speeding up a data mining algorithm, improving the data quality, improving the performance of data mining, and increasing the clarity of the mining results are the main motivations for feature selection methods. The key benefits of feature selection are [2,3,6]: reduced overfitting, improved accuracy, and reduced training time.…”
Section: Feature Selection
confidence: 99%