IECON 2010 - 36th Annual Conference on IEEE Industrial Electronics Society
DOI: 10.1109/iecon.2010.5675075

Feature selection using Sequential Forward Selection and classification applying Artificial Metaplasticity Neural Network

Cited by 148 publications (79 citation statements)
References 19 publications
“…In the final step, the chosen subset is validated using domain knowledge or a validation set. Sequential and exponential search methods for subset generation include [5], [6], [14], [15]. Instead, we propose the use of a random search method, as in [16], [17], [18], and, in particular, of a multi-objective evolutionary algorithm.…”
Section: Introduction
confidence: 99%
“…By selecting the best subsets of features, the accuracy of the overall system may increase (see e.g. (Marcano-Cedeño et al., 2010)). Finally, we plan to use compression methods like PCA and GPLVM to maximize the overall accuracy, similar to what is described in (Zhong et al., 2008).…”
Section: Discussion
confidence: 99%
“…The feature selection process searches for and selects the optimal subset of features that can represent the whole contents of the dataset with minimum error and information loss [13]. Feature selection approaches can be classified into three major classes based on the search and selection method: complete, stochastic, and heuristic search [14].…”
Section: Feature Selection
confidence: 99%
“…In [14], a multilayer Feed Forward Neural Network (FFNN) provides the selection criterion. The suggested method can deliver effective results in terms of search speed and accuracy when combined with the classification technique.…”
Section: Feature Selection
confidence: 99%
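The statements above revolve around the paper's core technique, Sequential Forward Selection: a greedy search that starts from an empty feature set and repeatedly adds the feature that most improves a selection criterion (in the paper, accuracy of an Artificial Metaplasticity Neural Network). The following is a minimal sketch of the generic SFS loop, not the paper's implementation; the toy `score` function and its feature weights are invented purely for illustration, standing in for a real classifier-accuracy criterion.

```python
def sequential_forward_selection(features, score, k):
    """Greedy SFS: start empty, repeatedly add the single feature
    that most improves score(subset), until k features are chosen
    or no candidate improves the criterion."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        # Evaluate adding each remaining feature to the current subset.
        best = max(remaining, key=lambda f: score(selected + [f]))
        if selected and score(selected + [best]) <= score(selected):
            break  # no candidate improves the criterion; stop early
        selected.append(best)
        remaining.remove(best)
    return selected


# Toy criterion (hypothetical): each feature contributes a fixed weight,
# with a quadratic penalty on subset size to mimic diminishing returns.
weights = {0: 3.0, 1: 0.5, 2: 4.0, 3: 0.1}

def score(subset):
    return sum(weights[f] for f in subset) - 0.2 * len(subset) ** 2

chosen = sequential_forward_selection(range(4), score, k=2)
# greedy order: the highest-weight feature (2) first, then feature 0
print(chosen)
```

In a real wrapper setting, `score` would train and evaluate the classifier on the candidate subset, which is why SFS is far cheaper than the complete (exhaustive) search mentioned in [14]: it evaluates O(n·k) subsets instead of 2^n.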