2018
DOI: 10.1007/978-3-319-77703-0_35

A Comparison of Feature Selection Methods to Optimize Predictive Models Based on Decision Forest Algorithms for Academic Data Analysis

Cited by 6 publications (7 citation statements) · References 6 publications
“…The other applications of decision forest were prediction of different types of liver diseases, including alcoholic liver damage and liver cirrhosis [66]. Other than biomedical classification, the decision forest method was applied to academic data analysis [67] as well as classification and forecasting of chronic kidney disease [68]. Decision Jungles were used for feature selection for images, with some modification, to achieve efficient results with modest training time [69].…”
Section: Results
confidence: 99%
“…For that purpose, after calculating the correlation coefficients' values, we performed significance analysis. A similar approach was followed in [68], where only the Pearson, Spearman, and Kendall correlation, mutual information (information gain), and χ² ranking technique were used. All of these methods can be used for ranking the features for the purpose of feature selection (choosing features with the best rank values for a particular method); however, their nature and purpose are different.…”
Section: Feature Selection
confidence: 99%
“…Entropy-based techniques and the χ² ranking technique, on the other hand, can be used when dealing with all types of features, and they are based on the statistical properties of the variables. These methods are commonly used in the feature selection domain [68]. Nevertheless, these methods are not perfect either, e.g., the χ² ranking technique does not perform well while dealing with infrequent terms in data [72], and information gain favors features with many uniformly distributed values [73].…”
Section: Feature Selection
confidence: 99%
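
The two excerpts above list the filter-style ranking techniques discussed in relation to the cited work: Pearson, Spearman, and Kendall correlation, mutual information (information gain), and the χ² statistic. The following is a minimal Python sketch of how such rankings can be computed; it assumes a small synthetic tabular dataset and the SciPy / scikit-learn scoring functions, none of which are taken from the cited papers.

import numpy as np
from scipy.stats import pearsonr, spearmanr, kendalltau
from sklearn.feature_selection import chi2, mutual_info_classif

# Synthetic, non-negative tabular data: 200 samples, 5 candidate features,
# and a binary class label loosely tied to the first feature (illustrative only).
rng = np.random.default_rng(0)
X = rng.random((200, 5))
y = (X[:, 0] + 0.3 * rng.random(200) > 0.6).astype(int)

# Correlation-based rankings: one coefficient per feature against the label.
pearson = [abs(pearsonr(X[:, j], y)[0]) for j in range(X.shape[1])]
spearman = [abs(spearmanr(X[:, j], y)[0]) for j in range(X.shape[1])]
kendall = [abs(kendalltau(X[:, j], y)[0]) for j in range(X.shape[1])]

# Entropy-based (mutual information / information gain) and chi-squared
# rankings; chi2 requires non-negative feature values.
info_gain = mutual_info_classif(X, y, random_state=0)
chi2_stat, _ = chi2(X, y)

# Order features under each criterion: a higher score means a better rank.
for name, scores in [("pearson", pearson), ("spearman", spearman),
                     ("kendall", kendall), ("mutual_info", info_gain),
                     ("chi2", chi2_stat)]:
    order = np.argsort(scores)[::-1]
    print(f"{name:12s} feature ranking: {order.tolist()}")

Each criterion simply orders the candidate features; a feature-selection step would then keep the top-ranked features before training the decision forest model.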
“…Ghaemi et al. found that FOA can improve classification accuracy compared with other feature selection methods [Ghaemi and Feizi-Derakhshi (2016)]. Fernández-García et al. compared FOA with other feature selection methods on academic data to predict whether students will finish their degree after completing their first year in college [Fernández-García, Iribarne, Corral et al (2018)]. Although FOA obtains better results in the above experiments, its search efficiency is still low and its algorithmic complexity is relatively high.…”
Section: FOA
confidence: 99%