2017
DOI: 10.1007/978-3-319-67588-6_10
Feature Selection with a Genetic Algorithm for Classification of Brain Imaging Data

Cited by 21 publications (12 citation statements)
References 43 publications
“…As a part of future work, one can: (i) consider further information regarding IT ticket complexity prediction, such as the number of tasks and configuration items per ticket; (ii) test other application cases in the IT ticket area and beyond, i.e., further explore the potential of linguistic features; (iii) as we showed that selecting an appropriate subset of linguistic features can considerably improve the performance of classifiers, one may conduct further experiments with more advanced feature selection techniques [100].…”
Section: Discussion
confidence: 99%
“…It typically causes the curse of dimensionality, which falsely inflates classification accuracy [28]. To address this issue, numerous feature selection methods have been proposed, such as the t-test, least absolute shrinkage and selection operator (LASSO) [29], the genetic algorithm (GA) [30], and so on. In this work, we adopt only a simple feature selection method, i.e.…”
Section: FBN-based Disease Classification
confidence: 99%
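The statement above names the t-test filter and LASSO as simple feature selectors for high-dimensional data. A minimal sketch of both, on synthetic data with many more features than samples (a stand-in for the curse-of-dimensionality setting described; all names and parameter values here are illustrative, not taken from the cited work):

```python
# Hedged sketch: a t-test-style univariate filter and LASSO-based
# selection on synthetic high-dimensional data (p >> n).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import Lasso

# 60 samples, 500 features: far more features than samples.
X, y = make_classification(n_samples=60, n_features=500,
                           n_informative=10, random_state=0)

# Filter approach: keep the k features with the strongest univariate
# association with the class label (ANOVA F-test, a t-test analogue).
k = 20
filter_sel = SelectKBest(f_classif, k=k).fit(X, y)
ttest_idx = np.flatnonzero(filter_sel.get_support())

# LASSO: fit a sparse linear model to the binary label; the L1 penalty
# drives most coefficients to exactly zero, so the nonzero coefficients
# define the selected feature subset.
lasso = Lasso(alpha=0.01).fit(X, y)
lasso_idx = np.flatnonzero(lasso.coef_)

print(len(ttest_idx), "features kept by the filter;",
      len(lasso_idx), "features kept by LASSO")
```

The filter scores each feature independently and is cheap; LASSO accounts for features jointly but requires choosing the regularisation strength `alpha`, which controls how many features survive.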
“…For the purpose of this study, we employed the K-nearest neighbors (K-NN) classifier to evaluate the candidate feature subset and as a reward function. We used the simple and efficient nearest neighbors classifier as it is well understood in the literature and works surprisingly well in many situations [41, 42, 43, 44]. Moreover, many other similar studies and comparison methods in the literature, mentioned in Section 5.3, have applied the nearest neighbors classifier; therefore, we considered it a better choice for the comparative analysis.…”
Section: Motifs (Monte Carlo Tree Search Based Feature Selection)
confidence: 99%
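The statement above describes using K-NN accuracy as the reward for a candidate feature subset inside a search procedure. A minimal sketch of that evaluation step, under assumed data and parameters (the subset, `k`, and CV settings are illustrative, not those of the cited work):

```python
# Hedged sketch: score a candidate feature subset by the cross-validated
# accuracy of a K-NN classifier restricted to those features.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in data set.
X, y = make_classification(n_samples=120, n_features=50,
                           n_informative=8, random_state=0)

def knn_reward(subset, X, y, k=5):
    """Mean 5-fold CV accuracy of K-NN using only the columns in `subset`."""
    clf = KNeighborsClassifier(n_neighbors=k)
    return cross_val_score(clf, X[:, subset], y, cv=5).mean()

# A search procedure (e.g. Monte Carlo tree search, as in the cited work)
# would call this reward on every candidate subset it proposes.
subset = [0, 1, 2, 3, 4]
reward = knn_reward(subset, X, y)
print(round(reward, 3))
```

Because the reward retrains and cross-validates K-NN for each candidate, the search cost scales with the number of subsets evaluated, which is why a cheap, parameter-light classifier like K-NN is a common choice here.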