2006
DOI: 10.1021/ci0600354
Supervised Feature Ranking Using a Genetic Algorithm Optimized Artificial Neural Network

Abstract: A genetic algorithm optimized artificial neural network (GNW) has been designed to rank features for two diversified multivariate data sets. The dimensions of these data sets are 85 × 24 and 62 × 25: 24 or 25 molecular descriptors computed for 85 matrix metalloproteinase-1 inhibitors or 62 hepatitis C virus NS3 protease inhibitors, respectively. Each molecular descriptor computed is treated as a feature and fed into an input-layer node of the artificial neural network. To optimize the artificial neura…
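The abstract's core idea — a genetic algorithm searching over feature subsets and scoring each subset with a trained model — can be sketched as follows. This is a simplified, hypothetical surrogate: a toy correlation score stands in for the paper's trained ANN error, and the ranking heuristic (how often each feature survives into the final population) is an assumption for illustration, not the authors' exact procedure.

```python
import random

def fitness(mask, X, y):
    """Toy fitness: |Pearson correlation| between y and the mean of the
    selected feature columns. (Hypothetical stand-in for the paper's
    trained-ANN error; any subset-scoring function could be used.)"""
    cols = [j for j, bit in enumerate(mask) if bit]
    if not cols:
        return 0.0
    preds = [sum(row[j] for j in cols) / len(cols) for row in X]
    n = len(y)
    mp, my = sum(preds) / n, sum(y) / n
    cov = sum((p - mp) * (t - my) for p, t in zip(preds, y))
    vp = sum((p - mp) ** 2 for p in preds) ** 0.5
    vy = sum((t - my) ** 2 for t in y) ** 0.5
    return abs(cov / (vp * vy)) if vp and vy else 0.0

def ga_feature_rank(X, y, n_features, pop_size=20, generations=30):
    """Evolve binary feature masks; rank features by how often they
    appear in the final population (illustrative heuristic)."""
    pop = [[random.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda m: fitness(m, X, y), reverse=True)
        survivors = scored[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randint(1, n_features - 1)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:                 # point mutation
                k = random.randrange(n_features)
                child[k] ^= 1
            children.append(child)
        pop = survivors + children
    counts = [sum(m[j] for m in pop) for j in range(n_features)]
    return sorted(range(n_features), key=lambda j: -counts[j])
```

Features that rarely help the score are selected out of the population, so their survival count — and hence their rank — drops.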

Cited by 13 publications (4 citation statements) · References 31 publications
“…The search range of all rotatable bonds was set between 180° and −180°. A population of chromosomes was generated within each generation, and two of them were selected by a roulette wheel mechanism for the crossover operation. We performed two-point crossover on the two randomly selected chromosomes by first randomly picking two crossover sites.…”
Section: Methods
confidence: 99%
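The roulette wheel selection and two-point crossover described in the quote above can be sketched as follows. This is a generic GA sketch, not the cited paper's implementation; chromosomes are plain Python lists.

```python
import random

def roulette_select(population, fitnesses):
    """Pick one chromosome with probability proportional to its fitness."""
    total = sum(fitnesses)
    r = random.uniform(0, total)
    acc = 0.0
    for chrom, fit in zip(population, fitnesses):
        acc += fit
        if acc >= r:
            return chrom
    return population[-1]  # guard against floating-point round-off

def two_point_crossover(parent_a, parent_b):
    """Pick two cut sites at random and swap the segment between them."""
    i, j = sorted(random.sample(range(1, len(parent_a)), 2))
    child_a = parent_a[:i] + parent_b[i:j] + parent_a[j:]
    child_b = parent_b[:i] + parent_a[i:j] + parent_b[j:]
    return child_a, child_b
```

Roulette wheel selection still gives low-fitness chromosomes a nonzero chance of reproducing, which preserves diversity compared with always taking the best pair.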
“…Many feature selection (FS) strategies use evolutionary algorithms [6,23,24], given that they allow a stochastic, parallel search of the space of possible solutions and hence are able to escape from local minima. There are also other approaches based on stepwise strategies, which perform a greedy search for the best subsets of variables [22].…”
Section: Related Work
confidence: 99%
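The greedy stepwise strategy mentioned in the quote above can be sketched as a forward-selection loop. Here `score` is a hypothetical callable supplied by the user that evaluates a candidate feature subset (e.g. cross-validated model accuracy); the loop stops when no remaining feature improves it.

```python
def forward_select(features, score, k):
    """Greedy forward selection: repeatedly add the single feature that
    most improves score(subset), up to k features or until no candidate
    helps. `score` maps a list of features to a number (higher = better)."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        best = max(remaining, key=lambda f: score(selected + [f]))
        if score(selected + [best]) <= score(selected):
            break  # no candidate improves the current subset: stop early
        selected.append(best)
        remaining.remove(best)
    return selected
```

Unlike the evolutionary search, this greedy loop can get trapped: once a feature is added it is never reconsidered, which is exactly the local-minimum weakness the quoted passage contrasts against.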
“…They have been compared to other methods, such as logistic regression, [9-12] k-nearest neighbors, [13] and neural networks. [9,12-14] The results have been heterogeneous, but in many published reports SVMs have been found to be equivalent to more traditional types of models. [9,10,15,16] In some domains, SVMs were claimed to have outperformed other methodologies.…”
Section: Introduction
confidence: 99%