2006
DOI: 10.1109/tit.2006.871056
Minimax-optimal classification with dyadic decision trees

Abstract: Decision trees are among the most popular types of classifiers, with interpretability and ease of implementation being among their chief attributes. Despite the widespread use of decision trees, theoretical analysis of their performance has only begun to emerge in recent years. In this paper it is shown that a new family of decision trees, dyadic decision trees (DDTs), attain nearly optimal (in a minimax sense) rates of convergence for a broad range of classification problems. Furthermore, DDTs are surprisingly…
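To make the object of study concrete, here is a minimal Python sketch of a dyadic decision tree, assuming data rescaled to [0,1]^d, splits that cycle through the coordinates at dyadic midpoints (as described in the citing papers quoted below), a fixed maximum depth, and majority-vote leaves. The class name and all parameters are illustrative; the paper's actual method additionally selects the tree by penalized empirical risk minimization, which this sketch omits.

    import numpy as np

    class DyadicTreeSketch:
        """Fixed-depth dyadic tree: splits only at coordinate midpoints."""

        def __init__(self, max_depth=4):
            self.max_depth = max_depth

        def fit(self, X, y):
            d = X.shape[1]
            self.root = self._grow(X, y, [(0.0, 1.0)] * d, depth=0)
            return self

        def _grow(self, X, y, cell, depth):
            if depth == self.max_depth or len(y) <= 1:
                # Leaf: majority vote over the labels falling in this cell.
                return {"leaf": int(round(y.mean())) if len(y) else 0}
            j = depth % len(cell)            # cycle through the coordinates
            lo, hi = cell[j]
            mid = 0.5 * (lo + hi)            # dyadic midpoint split
            left = X[:, j] <= mid
            lcell, rcell = list(cell), list(cell)
            lcell[j], rcell[j] = (lo, mid), (mid, hi)
            return {"coord": j, "mid": mid,
                    "left": self._grow(X[left], y[left], lcell, depth + 1),
                    "right": self._grow(X[~left], y[~left], rcell, depth + 1)}

        def predict(self, X):
            out = []
            for x in X:
                node = self.root
                while "leaf" not in node:
                    node = node["left"] if x[node["coord"]] <= node["mid"] else node["right"]
                out.append(node["leaf"])
            return np.array(out)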

Cited by 80 publications (87 citation statements)
References 46 publications
“…For classification loss, we note that very recently Scott and Nowak (2006) have obtained, for a related penalized dyadic tree method (with a penalty function different from what we consider here), very interesting minimax results.…”
Section: Adaptation to Anisotropy
confidence: 73%
“…Note, however, that other existing algorithms for dyadic decision trees (Scott and Nowak, 2004, 2006; Klemelä, 2003) are all of complexity 2^(d·k_max), but the authors choose k_max of the order of d^(-1) log n. This makes sense in Scott and Nowak (2004), because the cuts are fixed in advance and the algorithm is not adaptive to anisotropy. However, in Klemelä (2003) the author notices that k_max should be chosen as large as the computational complexity permits, to take full advantage of the anisotropy adaptivity.…”
Section: Algorithmic Complexity
confidence: 99%
“…On the other hand, a simple structural risk minimization algorithm applied to cyclic dyadic decision trees (i.e. trees whose splits are determined by cycling through the coordinates and splitting at the interval mid-points) is guaranteed to be robust to distribution (Scott and Nowak 2003, 2004, 2006). Recently it has been shown that allowing the dyadic splits to be performed in arbitrary order, and then designing the tree to minimize a regularized risk, also yields robust performance guarantees (Scott and Nowak 2006; Blanchard et al. 2007) and tends to give better results in practice (Blanchard et al. 2007). The current best algorithm for designing these trees is the dynamic programming algorithm of Blanchard et al. (2007), which was inspired by the "dyadic CART" algorithm of Donoho (1997).…”
Section: Introduction
confidence: 99%
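As an illustration of the "arbitrary order" idea in the statement above, the following Python sketch scores the midpoint split along every coordinate of a cell and keeps the best one. It is a greedy stand-in for exposition only, with hypothetical names throughout; Blanchard et al. (2007) instead search over whole trees exactly via dynamic programming.

    import numpy as np

    def best_dyadic_split(X, y, cell):
        """Try the midpoint split along each coordinate of `cell` (a list of
        (lo, hi) intervals) and return the (coordinate, midpoint) pair whose
        children have the smallest majority-vote misclassification count."""
        best_j, best_mid, best_err = None, None, np.inf
        for j, (lo, hi) in enumerate(cell):
            mid = 0.5 * (lo + hi)
            left = X[:, j] <= mid
            err = 0.0
            for mask in (left, ~left):
                if mask.any():
                    p = y[mask].mean()                   # fraction of positive labels
                    err += min(p, 1.0 - p) * mask.sum()  # majority-vote errors
            if err < best_err:
                best_j, best_mid, best_err = j, mid, err
        return best_j, best_mid

Recursing with this choice yields an arbitrary-order dyadic tree; a cyclic tree would instead fix j = depth mod d, as in the statement's parenthetical definition.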
“…Additive penalties increase linearly with the size of the decision tree. Recent results in statistical learning theory suggest that subadditive penalties, and in particular a penalty term that varies as the square root of the size of the tree, may be more appropriate for classification problems [1, 6–9]. A subadditive penalty is monotonic, but its increase with the size of the tree is slower than linear.…”
Section: Introduction
confidence: 99%
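A minimal sketch of the two penalty shapes being contrasted, assuming a generic penalized risk of the form empirical risk plus c·|T| (additive) or c·sqrt(|T|) (subadditive). The constant c and the (tree size, empirical risk) pairs below are hypothetical; the cited works derive the exact penalty form from generalization bounds.

    import numpy as np

    def penalized_risk(emp_risk, tree_size, c=0.02, subadditive=True):
        # Subadditive penalty grows like sqrt(|T|); additive grows like |T|.
        penalty = c * np.sqrt(tree_size) if subadditive else c * tree_size
        return emp_risk + penalty

    # Hypothetical (tree size -> empirical risk) pairs for illustration:
    emp = {4: 0.20, 8: 0.15, 16: 0.12, 32: 0.11}
    best_sub = min(emp, key=lambda s: penalized_risk(emp[s], s, subadditive=True))
    best_add = min(emp, key=lambda s: penalized_risk(emp[s], s, subadditive=False))
    # With these numbers the sqrt penalty selects the size-16 tree (best_sub == 16),
    # while the linear penalty selects the size-4 tree (best_add == 4): the slower
    # growth of the subadditive penalty tolerates larger, lower-risk trees.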