2016
DOI: 10.5120/ijca2016911441
Survey on different Methods for Classifying Gene Expression using Microarray Approach

Cited by 7 publications (3 citation statements)
References 22 publications
“…With this approach, it is possible to select features tailored for the induction algorithm ( Jadhav et al, 2018 ). The classification algorithm’s evaluation measures will be optimized while eliminating the features, hence offering better accuracy than the filter approach ( Inza et al, 2004 ; Mohamed et al, 2016 ).…”
Section: Gene Selection – Background and Development
confidence: 99%
“…A significant advantage of the wrapper approach is that, because feature subset generation and the induction algorithm are wrapped together, the model can track feature dependencies ( Rodrigues et al, 2014 ). The common drawback is that it becomes computationally intensive for high-dimensional datasets ( Mohamed et al, 2016 ). Examples of wrapper techniques are Hill Climbing, Forward Selection, and Backward Elimination.…”
Section: Gene Selection – Background and Development
confidence: 99%
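The wrapper idea described in the statement above — feature subset generation and the induction algorithm evaluated together — can be sketched as a minimal forward-selection loop. Everything here is an illustrative assumption: a tiny toy "expression" dataset and a 1-nearest-neighbour classifier standing in for the induction algorithm; it is not the method of the surveyed paper.

```python
# Minimal sketch of wrapper-based forward selection (assumed setup):
# features are added greedily, each candidate subset scored by the
# held-out accuracy of a 1-nearest-neighbour "induction algorithm".

def one_nn_accuracy(train, test, feats):
    """Classify each test sample by its nearest training sample,
    measuring distance only on the feature indices in `feats`."""
    correct = 0
    for x, y in test:
        dists = [(sum((x[f] - tx[f]) ** 2 for f in feats), ty)
                 for tx, ty in train]
        pred = min(dists)[1]          # label of the nearest training sample
        correct += (pred == y)
    return correct / len(test)

def forward_selection(train, test, n_features):
    selected, best_acc = [], 0.0
    while True:
        candidates = [f for f in range(n_features) if f not in selected]
        if not candidates:
            break
        # Wrap the classifier around every candidate extension of the subset.
        scored = [(one_nn_accuracy(train, test, selected + [f]), f)
                  for f in candidates]
        acc, f = max(scored)
        if acc <= best_acc:           # stop when no candidate improves accuracy
            break
        selected.append(f)
        best_acc = acc
    return selected, best_acc

# Hypothetical toy data: feature 0 separates classes "A"/"B"; 1-2 are noise.
train = [((0.1, 5.0, 2.0), "A"), ((0.2, 1.0, 9.0), "A"),
         ((0.9, 5.0, 9.0), "B"), ((0.8, 1.0, 2.0), "B")]
test  = [((0.15, 3.0, 4.0), "A"), ((0.85, 3.0, 4.0), "B")]

feats, acc = forward_selection(train, test, n_features=3)
print(feats, acc)
```

Because every candidate subset retrains/re-evaluates the classifier, the loop makes the computational cost noted in the statement concrete: it grows with the number of features squared in the worst case, which is why wrappers struggle on high-dimensional microarray data.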
“…Another common way to address the data high-dimensionality challenge is to use classification algorithms that control the model’s complexity through regularization [ 16 , 17 ]. One option is to regularize the log-likelihood function of the LR model.…”
Section: Introduction
confidence: 99%
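The option mentioned in the statement above — regularizing the log-likelihood of the LR model — can be sketched as follows. This is a hand-rolled illustration, not code from the cited works: it minimizes the average negative log-likelihood plus an L2 penalty `lam * ||w||**2` by plain gradient descent, on assumed toy data.

```python
import math

# Sketch of L2-regularized logistic regression (assumed hyperparameters).
# The penalty term shrinks the weights, controlling model complexity on
# high-dimensional data.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_l2_logreg(X, y, lam=0.1, lr=0.5, steps=500):
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(steps):
        gw = [2 * lam * wj for wj in w]   # gradient of the L2 penalty
        gb = 0.0
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = (p - yi) / n            # gradient of avg. neg. log-likelihood
            gw = [g + err * xj for g, xj in zip(gw, xi)]
            gb += err
        w = [wj - lr * g for wj, g in zip(w, gw)]
        b -= lr * gb
    return w, b

# Toy one-feature data: class 1 when x > 0.
X = [[-2.0], [-1.0], [1.0], [2.0]]
y = [0, 0, 1, 1]
w, b = fit_l2_logreg(X, y)
pred = [1 if sigmoid(w[0] * x[0] + b) > 0.5 else 0 for x in X]
print(pred)
```

With `lam > 0` the weights cannot grow without bound even on separable data, which is the complexity control the statement refers to; setting `lam = 0` recovers unregularized maximum likelihood.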