2012 7th International Conference on Electrical and Computer Engineering 2012
DOI: 10.1109/icece.2012.6471636
Knowledge based decision tree construction with feature importance domain knowledge

Cited by 22 publications (10 citation statements). References 2 publications.
“…This algorithm was used in this study on raw reflectance by wavelengths as it is well suited for a large amount of data. Indeed, decision tree algorithms recursively split the data set to learn accurately [64]. ExtraTrees is a machine learning method similar to Random Forest, except that it tends to have a lower variance: instead of searching for the optimal feature/split combination, for each feature, a random value is selected for the split [65].…”
Section: Extra Trees Model
confidence: 99%
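The contrast the statement draws between Random Forest and ExtraTrees (optimal feature/split search vs. a random split value per candidate feature) can be sketched with scikit-learn. This is a minimal illustration on synthetic data, assuming scikit-learn is available; it is not the cited study's pipeline:

```python
# Minimal sketch: RandomForest vs. ExtraTrees on synthetic data.
# RandomForest searches for the optimal feature/threshold at each node;
# ExtraTrees draws a random threshold per candidate feature, which tends
# to lower variance at the cost of a small increase in bias.
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical data standing in for the study's reflectance-by-wavelength features.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

for Model in (RandomForestClassifier, ExtraTreesClassifier):
    clf = Model(n_estimators=100, random_state=0)
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(Model.__name__, round(score, 3))
```

Both estimators share the same interface, so swapping one for the other is a one-line change when comparing variance behavior on a given dataset.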
“…The paper [1] presents the Importance Aided Decision Tree (IADT), which takes feature importance as additional domain knowledge for enhancing the performance of learners. The decision tree algorithm finds the most important attribute at each node.…”
Section: Related Work
confidence: 99%
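The idea described above, using externally supplied feature importance to steer attribute selection at each node, can be sketched by weighting each attribute's information gain by its importance before choosing the split. All names, data, and weights below are illustrative assumptions, not the IADT authors' implementation:

```python
# Hypothetical sketch: importance-weighted attribute selection for a decision
# tree node. An external importance weight scales each attribute's information
# gain before the split attribute is chosen.
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Information gain of splitting `rows` on categorical attribute `attr`."""
    by_value = {}
    for row, lab in zip(rows, labels):
        by_value.setdefault(row[attr], []).append(lab)
    remainder = sum(len(subset) / len(labels) * entropy(subset)
                    for subset in by_value.values())
    return entropy(labels) - remainder

def best_attribute(rows, labels, importance):
    # Pick the attribute maximizing importance-weighted information gain.
    return max(importance, key=lambda a: importance[a] * info_gain(rows, labels, a))

# Toy data: "outlook" separates the labels perfectly, "windy" not at all.
rows = [{"outlook": "sun", "windy": 0}, {"outlook": "rain", "windy": 0},
        {"outlook": "sun", "windy": 1}, {"outlook": "rain", "windy": 1}]
labels = ["yes", "no", "yes", "no"]
importance = {"outlook": 1.0, "windy": 0.2}
print(best_attribute(rows, labels, importance))  # → outlook
```

With uniform importance weights this reduces to ordinary ID3-style gain maximization; the weights only change which attribute wins when gains are comparable.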
“…The termination condition of the algorithm has three kinds of cases. (1) No attribute can be used as testing attribute.…”
Section: SPRINT Algorithm
confidence: 99%
“…Furthermore, the MARS, MLP, and DTR models also belong to nonparametric learning, and these models are used in those areas [27][28][29][30][31][32][33][34][35][36].…”
Section: Introduction
confidence: 99%