2010
DOI: 10.1007/s13042-010-0004-x
Genetic Algorithm-Neural Network (GANN): a study of neural network activation functions and depth of genetic algorithm search applied to feature selection

Abstract: Hybrid genetic algorithms (GA) and artificial neural networks (ANN) are not new in the machine learning culture. Such hybrid systems have been shown to be very successful in classification and prediction problems. However, little attention has been focused on this architecture as a feature selection method and the consequent significance of the ANN activation function and the number of GA evaluations on the feature selection performance. The activation function is one of the core components of the ANN architec…

Cited by 129 publications (36 citation statements)
References 15 publications
“…On the other hand, a matrix-based representation of weights has been used, where column-wise and row-wise crossover operators were also defined [152]. GA-based real-coded weight optimization outperforms BP and its variants for solving real-world applications [153][154][155][156]. Moreover, an evolution-inspired algorithm called differential evolution (DE) [105,157], which imitates mutation and crossover operators to solve complex continuous optimization problems, was found to perform efficiently for real-valued weight vector optimization [158][159][160].…”
Section: Weight Optimization
confidence: 99%
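The GA-based real-coded weight optimization described in the statement above can be sketched as follows. This is a minimal illustration, not any cited author's method: the XOR dataset, the 2-4-1 network, and the GA settings (truncation selection, arithmetic crossover, Gaussian mutation) are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR (assumed for illustration)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0, 1, 1, 0], float)

N_HID = 4
N_W = 2 * N_HID + N_HID + N_HID + 1  # weights + biases of a 2-4-1 net

def forward(w, X):
    # Unpack the flat real-coded chromosome into network parameters
    W1 = w[:2 * N_HID].reshape(2, N_HID)
    b1 = w[2 * N_HID:3 * N_HID]
    W2 = w[3 * N_HID:4 * N_HID].reshape(N_HID, 1)
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def fitness(w):
    p = forward(w, X).ravel()
    return -np.mean((p - y) ** 2)  # negative MSE (to maximize)

pop = rng.normal(0, 1, (60, N_W))
for gen in range(300):
    scores = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(scores)[::-1][:20]]          # truncation selection
    parents = elite[rng.integers(0, 20, (40, 2))]
    alpha = rng.random((40, 1))
    children = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]  # arithmetic crossover
    children += rng.normal(0, 0.1, children.shape)       # Gaussian mutation
    pop = np.vstack([elite, children])                   # elitism keeps the best intact

best = pop[np.argmax([fitness(w) for w in pop])]
preds = (forward(best, X).ravel() > 0.5).astype(int)
print(preds)
```

No gradient is computed anywhere, which is the point: the GA replaces backpropagation entirely and so tolerates non-differentiable fitness functions.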
“…Primarily, node optimization can be addressed in three ways: 1) by choosing activation functions at the FNN active nodes from a set of activation functions [156,231]; 2) by optimizing the arguments of activation function [232]; and 3) by placing a complete model at the nodes of a network [233,234].…”
Section: Node Optimization
confidence: 99%
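The first of the three node-optimization routes quoted above, choosing activation functions at active nodes from a candidate set, can be sketched with a discrete-coded GA in which each gene indexes one function from the set. The candidate set, the fixed random hidden weights, and the least-squares output layer are assumptions made to keep the example short.

```python
import numpy as np

rng = np.random.default_rng(1)

# Candidate activation set (assumed for illustration)
ACTS = [np.tanh,
        lambda z: np.maximum(z, 0.0),    # ReLU
        lambda z: 1 / (1 + np.exp(-z)),  # sigmoid
        lambda z: z]                     # identity

X = rng.normal(size=(200, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2  # toy regression target

W1 = rng.normal(size=(3, 6))  # fixed random hidden weights

def mse(genes):
    # genes[i] selects the activation applied at hidden node i
    H = np.column_stack([ACTS[g](X @ W1[:, i]) for i, g in enumerate(genes)])
    # Fit the linear output layer by least squares so only the genes are searched
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return np.mean((H @ w - y) ** 2)

pop = rng.integers(0, len(ACTS), (30, 6))  # discrete chromosomes
for _ in range(40):
    scores = np.array([mse(g) for g in pop])
    elite = pop[np.argsort(scores)[:10]]
    children = elite[rng.integers(0, 10, (20,))].copy()
    flip = rng.random(children.shape) < 0.2               # per-gene mutation
    children[flip] = rng.integers(0, len(ACTS), flip.sum())
    pop = np.vstack([elite, children])

best = pop[np.argmin([mse(g) for g in pop])]
print(best, round(mse(best), 4))
```

Because the output layer is refit analytically for every candidate, the GA only has to explore the small discrete space of activation assignments.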
“…All these algorithms are well known in the field of optimization and have been applied in various fields of engineering where problems are complex and multidimensional. Some more techniques based on neural networks and genetic algorithms [52][53][54] have also been proposed in recent years. In the same way, colour images have a 3-D histogram, and storing this information in a 3-D array and selecting multilevel threshold values for a 3-D array is a complex task.…”
Section: Introduction
confidence: 99%
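The multilevel-thresholding search mentioned above can be illustrated on a 1-D grayscale histogram (the cited works concern the harder 3-D colour case). The synthetic image, the Otsu-style between-class-variance objective, and the GA settings are all assumptions for this sketch.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic grayscale "image" with three intensity clusters (assumed data)
img = np.concatenate([rng.normal(60, 10, 3000),
                      rng.normal(128, 12, 3000),
                      rng.normal(200, 8, 3000)]).clip(0, 255).astype(int)
hist = np.bincount(img, minlength=256) / img.size
levels = np.arange(256)

def between_class_variance(thresholds):
    # Otsu-style objective: weighted squared deviation of segment means
    ts = [0] + sorted(thresholds) + [256]
    mu_total = (hist * levels).sum()
    var = 0.0
    for lo, hi in zip(ts[:-1], ts[1:]):
        w = hist[lo:hi].sum()
        if w > 0:
            mu = (hist[lo:hi] * levels[lo:hi]).sum() / w
            var += w * (mu - mu_total) ** 2
    return var

# Tiny GA over two thresholds (k = 2 for illustration)
pop = rng.integers(1, 255, (40, 2))
for _ in range(60):
    scores = np.array([between_class_variance(t) for t in pop])
    elite = pop[np.argsort(scores)[::-1][:10]]
    children = elite[rng.integers(0, 10, (30,))] + rng.integers(-8, 9, (30, 2))
    pop = np.vstack([elite, np.clip(children, 1, 254)])

best = pop[np.argmax([between_class_variance(t) for t in pop])]
print(sorted(best))
```

Extending this to 3-D colour histograms multiplies the search space, which is exactly why the quoted statement calls the 3-D case complex and why population-based search is attractive there.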
“…There are other measures such as the area under the ROC curve (AUC) or the F-measure (based on precision and recall); recently, Wang and Dong [44] applied fuzzy entropy maximization instead of training-error minimization to refine fuzzy IF-THEN rules. Tong and Mintram [39] use genetic algorithms to choose between neural network activation functions and features to increase classifier performance. Wang [43] uses mutual information for feature selection when combining nearest neighbour classifiers using the fuzzy integral.…”
Section: Introduction
confidence: 99%
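The GA wrapper approach to feature selection credited to Tong and Mintram above can be sketched with binary chromosomes, one bit per feature. The toy data, the nearest-centroid fitness classifier, and the one-point crossover with bit-flip mutation are assumptions made for the example, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data: only the first 3 of 10 features are informative (assumed setup)
n = 300
X = rng.normal(size=(n, 10))
y = (X[:, 0] + X[:, 1] - X[:, 2] > 0).astype(int)
tr, te = slice(0, 200), slice(200, n)

def accuracy(mask):
    # Wrapper fitness: nearest-centroid classifier on the selected features
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    c0 = Xs[tr][y[tr] == 0].mean(axis=0)
    c1 = Xs[tr][y[tr] == 1].mean(axis=0)
    pred = (np.linalg.norm(Xs[te] - c1, axis=1) <
            np.linalg.norm(Xs[te] - c0, axis=1)).astype(int)
    return (pred == y[te]).mean()

pop = rng.random((30, 10)) < 0.5  # binary chromosomes: True = keep feature
for _ in range(40):
    scores = np.array([accuracy(m) for m in pop])
    elite = pop[np.argsort(scores)[::-1][:10]]
    pairs = elite[rng.integers(0, 10, (20, 2))]
    cut = rng.integers(1, 10, 20)                      # one-point crossover
    children = np.array([np.concatenate([a[:c], b[c:]])
                         for (a, b), c in zip(pairs, cut)])
    children ^= rng.random(children.shape) < 0.05      # bit-flip mutation
    pop = np.vstack([elite, children])

best = pop[np.argmax([accuracy(m) for m in pop])]
print(np.flatnonzero(best), round(accuracy(best), 3))
```

Swapping the nearest-centroid fitness for an ANN trained on the masked features yields a GA-ANN wrapper of the kind the abstract studies, where the number of GA evaluations then dominates the cost.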