2010
DOI: 10.1007/978-3-642-00580-0_2

A Bayes Evaluation Criterion for Decision Trees

Cited by 7 publications (3 citation statements)
References 10 publications
“…A decision tree is a graph which can be used as a model of a categorical variable. A decision tree aims at predicting a categorical (numerical or linguistic) output variable from a set of numerical or linguistic input variables [11]. Decision trees are useful in solving classification and prediction problems [12,13].…”
Section: Decision Tree (mentioning)
confidence: 99%
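As a small illustration of the kind of model described in this statement, here is a minimal decision-tree sketch predicting a categorical output from one numerical and one linguistic (categorical) input. The feature names, thresholds, and labels are hypothetical and are not taken from the cited papers.

```python
# Minimal decision-tree sketch: each internal node tests one input
# (numerical threshold or linguistic category), each leaf returns a
# categorical label. All names and values here are hypothetical.
def predict(age: float, weather: str) -> str:
    if weather == "rainy":      # split on a linguistic (categorical) input
        return "stay_home"
    if age < 30.0:              # split on a numerical input
        return "go_cycling"
    return "go_walking"

print(predict(25.0, "sunny"))   # -> go_cycling
print(predict(45.0, "rainy"))   # -> stay_home
```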
“…The optimal model M is then the one with the least cost c (see the original work [5] for the explicit expressions of p(M) and p(D|M) and for the optimization algorithm). The generic MODL approach has also been successfully applied to supervised value grouping [4] and decision tree construction [28]. In each instantiation, the MODL method promotes a trade-off between (1) the fineness of the predictive information provided by the model and (2) the robustness needed to obtain good generalization of the model.…”
Section: Towards MODL Rules (mentioning)
confidence: 99%
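As a sketch of the criterion this statement refers to: MODL-style approaches score a model by its negative log posterior, so the "least cost c" amounts to a maximum a posteriori choice. The explicit expressions of p(M) and p(D|M) are given in the cited work [5] and are not reproduced here.

```latex
% Generic form of the MODL-style evaluation cost (explicit prior and
% likelihood terms are defined in the cited work [5]).
c(M) = -\log p(M) - \log p(D \mid M),
\qquad M^{*} = \operatorname*{arg\,min}_{M} c(M)
```

The prior term $-\log p(M)$ penalizes complex models (the robustness side of the trade-off), while the likelihood term $-\log p(D \mid M)$ rewards fit to the data (the fineness of the predictive information).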
“…Simple models such as Bernoulli or, mainly, multinomial distributions are important because they are easier to analyze theoretically and useful in many applications. For example, the multinomial distribution has been used as a building block in more complex models, such as naive Bayes classifiers (Mononen and Myllymäki, 2007), Bayesian networks (Roos et al, 2008), decision trees (Voisine et al, 2009) or coclustering models (Boullé, 2011; Guigourès et al, 2015). These models involve up to thousands of multinomial blocks, some of them with potentially very large numbers of occurrences and outcomes.…”
Section: Introduction (mentioning)
confidence: 99%
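As an illustration of the multinomial building block mentioned above, the sketch below computes the log-probability of an observed count vector under a multinomial with given outcome probabilities. It is an assumption-level example of the kind of elementary term such models combine, not code from the cited papers.

```python
import math

def multinomial_log_likelihood(counts, probs):
    """Log-probability of observed counts under a multinomial distribution.

    Illustrative sketch of the elementary block combined in models such as
    naive Bayes classifiers, Bayesian networks, decision tree leaves, or
    coclustering cells; not taken from the cited works.
    """
    n = sum(counts)
    # log of the multinomial coefficient n! / (k_1! ... k_m!)
    log_coef = math.lgamma(n + 1) - sum(math.lgamma(k + 1) for k in counts)
    # log of prod_i p_i^{k_i}, skipping zero counts
    log_prob = sum(k * math.log(p) for k, p in zip(counts, probs) if k > 0)
    return log_coef + log_prob

# Example: 10 draws over three outcomes with probabilities (0.5, 0.3, 0.2).
print(multinomial_log_likelihood([5, 3, 2], [0.5, 0.3, 0.2]))
```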