2002
DOI: 10.1145/568574.568577

Classification trees for problems with monotonicity constraints

Abstract: For classification problems with ordinal attributes, the class attribute should very often increase with each, or some, of the explanatory attributes. These are called classification problems with monotonicity constraints. Classical decision tree algorithms such as CART or C4.5 generally do not produce monotone trees, even if the dataset is completely monotone. This paper surveys the methods that have so far been proposed for generating decision trees that satisfy monotonicity constraints. A distinction is made be…
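The abstract's notion of a completely monotone dataset can be made concrete with a small pairwise check: if one example is less than or equal to another in every ordinal attribute, its class label must not be larger. A minimal sketch (function and variable names are illustrative, not taken from the paper):

```python
def dominates(a, b):
    """True if example a is <= example b in every ordinal attribute."""
    return all(ai <= bi for ai, bi in zip(a, b))

def is_monotone(examples, labels):
    """A dataset is monotone if a <= b (attribute-wise) implies
    label(a) <= label(b) for every pair of examples."""
    n = len(examples)
    for i in range(n):
        for j in range(n):
            if dominates(examples[i], examples[j]) and labels[i] > labels[j]:
                return False
    return True

# Toy monotone dataset: the class never decreases as attributes increase.
X = [(1, 1), (1, 2), (2, 2), (3, 1)]
print(is_monotone(X, [0, 0, 1, 1]))  # True

# Flipping one label introduces a monotonicity violation:
print(is_monotone(X, [0, 1, 0, 1]))  # False
```

The point the abstract makes is that even when this check passes for the training data, CART or C4.5 may still output a non-monotone tree.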

Cited by 109 publications (61 citation statements)
References 10 publications
“…Monotonicity constraints help generate classification trees that satisfy the constraint property, both when the training data are monotone and when they are not [1]. ii. Constraint-based frequent pattern mining with a pattern-growth view finds all frequent itemsets that satisfy the constraint; the pattern-growth mining method then generates and tests only a few candidates among them.…”
Section: Constraint Based Approaches
confidence: 99%
“…Meanwhile, several machine learning algorithms have been modified so as to guarantee monotonicity in attributes, including nearest neighbor classification [15], decision tree learning [16] and rule induction [17]. Instead of modifying models and algorithms, one can also modify the data.…”
Section: Related Work
confidence: 99%
“…Most notably, it combines three features in a non-trivial way, namely monotonicity, nonlinearity and interpretability. As for the first, a monotone dependence between the input and output attributes is often desirable in a classification setting and sometimes even requested by the application [11,12,13]. At the same time, the Choquet integral also allows for modeling interactions between different attributes in a flexible, nonlinear way.…”
Section: Introduction
confidence: 99%
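The Choquet integral referred to in the last excerpt aggregates attribute values against a capacity (a monotone set function with μ(∅) = 0 and μ(N) = 1), which is what lets it be monotone in every attribute while still modeling nonlinear interactions. A minimal sketch with an illustrative two-attribute capacity (the capacity values below are invented for the example):

```python
def choquet(x, mu):
    """Discrete Choquet integral of x w.r.t. capacity mu, where mu maps
    frozensets of attribute indices to [0, 1]."""
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])  # indices by increasing value
    total, prev = 0.0, 0.0
    for k, i in enumerate(order):
        # A_(k): set of attributes whose value is at least x_(k)
        subset = frozenset(order[k:])
        total += (x[i] - prev) * mu[subset]
        prev = x[i]
    return total

# Illustrative capacity on attributes {0, 1}; mu({0,1}) < mu({0}) + mu({1})
# models redundancy (a negative interaction) between the two attributes.
mu = {
    frozenset(): 0.0,
    frozenset({0}): 0.6,
    frozenset({1}): 0.5,
    frozenset({0, 1}): 1.0,
}
print(choquet((0.25, 0.75), mu))  # 0.5
```

Because the capacity is monotone and the successive differences x_(k) − x_(k−1) are nonnegative, increasing any input can never decrease the output, matching the monotonicity requirement discussed above.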