2007
DOI: 10.1016/j.ijar.2007.02.004
Pruning belief decision tree methods in averaging and conjunctive approaches

Cited by 24 publications (7 citation statements)
References 13 publications
“…To learn from data with soft labels, nonparametric techniques such as the evidential k-nearest neighbor rule [11] and decision trees [16,20,46] have first been proposed. A general mechanism for parametric inference in the presence of uncertain data (of which soft labels are a special case) was later introduced in [12].…”
Section: Introduction
confidence: 99%
“…This rule was applied to regression problems with uncertain dependent variable in [36]. Methods for building decision trees from partially supervised data were proposed in [37], [38], [39]. An extension of the k-mode clustering algorithm to data with uncertain attributes was introduced in [40].…”
Section: Introduction
confidence: 99%
“…Thus, previous problems are special cases of this general formulation. Other studies have already proposed solutions in which class labels are expressed by possibility distributions or belief functions [21,22,23,24]. These labels are interesting when they are supplied by one or several experts and when crisp assignments are hard to obtain.…”
Section: Introduction
confidence: 99%
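The "soft labels" recurring in these excerpts can be made concrete with a small sketch. The example below is hypothetical (not taken from the cited works): a class label is expressed as a belief mass function over subsets of the class set, and the pignistic transform recovers an ordinary probability distribution from it.

```python
# Hypothetical toy example: a soft label as a belief mass function
# over subsets of the frame of discernment (the set of classes),
# rather than a single crisp class.
classes = ("A", "B", "C")

# An expert assigns mass 0.6 to {A}, 0.3 to {A, B}, and 0.1 to the
# whole frame (total ignorance). Masses over focal sets sum to 1.
soft_label = {
    frozenset({"A"}): 0.6,
    frozenset({"A", "B"}): 0.3,
    frozenset(classes): 0.1,
}

def pignistic(mass, classes):
    """Pignistic transform: spread each focal set's mass evenly
    over its member classes to obtain a probability distribution."""
    p = {c: 0.0 for c in classes}
    for focal, m in mass.items():
        for c in focal:
            p[c] += m / len(focal)
    return p

probs = pignistic(soft_label, classes)
# probs["A"] = 0.6 + 0.3/2 + 0.1/3 ≈ 0.7833
```

A crisp label is the special case where all mass sits on one singleton, which is why the excerpts describe soft labels as a generalization of standard supervised data.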