2016
DOI: 10.1016/j.csda.2015.11.006
HHCART: An oblique decision tree

Abstract: Decision trees are a popular technique in statistical data classification. They recursively partition the feature space into disjoint sub-regions until each sub-region becomes homogeneous with respect to a particular class. The basic Classification and Regression Tree (CART) algorithm partitions the feature space using axis parallel splits. When the true decision boundaries are not aligned with the feature axes, this approach can produce a complicated boundary structure. Oblique decision trees use oblique deci…

Cited by 66 publications (51 citation statements); References 15 publications.
“…Reflecting the feature vectors in this way makes d_ik parallel to e and provides a simple and effective way to find oblique splits (Robertson, Price & Reale; Wickramarachchi et al.). The authors propose two HHCART methods.…”
Section: Related Decision Trees
confidence: 97%
“…HHCART (Wickramarachchi et al.) is another oblique decision tree. Rather than searching for oblique splits directly, HHCART finds the best axis-parallel split in a set of reflected feature spaces.…”
Section: Related Decision Trees
confidence: 99%
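The reflection idea mentioned in these citation statements can be sketched as follows: a Householder reflection maps a chosen direction d (e.g. a class-separating direction d_ik) onto a coordinate axis e, so that an axis-parallel split found in the reflected space corresponds to an oblique split in the original space. This is a minimal NumPy illustration of that reflection, not the authors' implementation; `householder_to_axis` is a hypothetical name.

```python
import numpy as np

def householder_to_axis(d):
    """Return a Householder matrix H such that H @ d is parallel to e1.

    Illustrative sketch of the reflection step behind HHCART
    (assumption: d is a nonzero direction vector).
    """
    d = np.asarray(d, dtype=float)
    e1 = np.zeros_like(d)
    e1[0] = 1.0
    # u points from ||d|| * e1 to d; reflecting across the hyperplane
    # orthogonal to u sends d onto the first axis.
    u = d - np.linalg.norm(d) * e1
    if np.allclose(u, 0.0):          # d is already parallel to e1
        return np.eye(d.size)
    u /= np.linalg.norm(u)
    return np.eye(d.size) - 2.0 * np.outer(u, u)

d = np.array([3.0, 4.0])
H = householder_to_axis(d)
print(H @ d)   # -> [5. 0.]: d maps onto ||d|| * e1
```

Because H is symmetric and orthogonal (H @ H = I), a split threshold found on the first coordinate of the reflected data H @ X transforms back to an oblique hyperplane in the original feature space at no extra cost.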
“…The merits of the decision tree classifier include a simple classification rule and low computational effort. Widely used decision tree algorithms include ID3 [21], C4.5 [22], CHAID [23], CART [24], and QUEST [25]. Owing to its flexible handling of both continuous and discrete data, the C4.5 algorithm [22] was employed in this research to construct the decision tree model for classifying different bearing defects.…”
Section: Decision Tree Classification
confidence: 99%