2016
DOI: 10.1007/s10618-016-0460-3

Evidence-based uncertainty sampling for active learning

Cited by 76 publications (51 citation statements). References: 18 publications.
“…The decision tree is a flowchart-like tree structure, where each non-leaf node represents a test on an attribute, each branch represents an outcome of the test, and leaf nodes represent target classes or class distributions [3]. We decided to use this method because it lets us visualize the tree or extract the decision rules.…”
Section: B. Methods (mentioning, confidence: 99%)
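As a minimal sketch of the two properties cited above, tree visualization and rule extraction, the snippet below trains a small tree and prints its decision rules; scikit-learn and the iris dataset are assumptions made for illustration and are not named in the excerpt.

# Minimal sketch (scikit-learn and the iris dataset are assumed here,
# not named in the excerpt): train a shallow decision tree and print
# its decision rules as text.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_iris()
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(data.data, data.target)

# Each printed split is a test on an attribute; each leaf reports the
# predicted class, matching the flowchart description in [3].
print(export_text(tree, feature_names=list(data.feature_names)))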
“…In 2016, Sharma et al further divided the traditional uncertainty into two categories according to the reason of instance uncertainty, namely conflicting-evidence uncertainty (UNC-CE) and insufficient-evidence uncertainty (UCN-IE) [7]. The experiments also showed that the conflicting uncertain instance has a more obvious effect on the improvement of classification performance.…”
Section: Active Learning Strategymentioning
confidence: 99%
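The distinction can be illustrated with a small sketch. The code below is an illustration under stated assumptions, not the exact formulation from [7]: for a binary logistic regression model, the positive and negative parts of the per-feature contributions w_j * x_j are treated as evidence for and against the positive class, and the most uncertain instances are then separated by how much total evidence they carry (strong evidence on both sides versus little evidence either way).

import numpy as np
from sklearn.linear_model import LogisticRegression

def evidence_split(model: LogisticRegression, X_unlabeled: np.ndarray, n_pick: int = 10):
    """Split the most uncertain instances into UNC-CE and UNC-IE candidates.

    Illustrative only: evidence is approximated by the positive/negative
    parts of the linear contributions w_j * x_j of a fitted binary model.
    """
    proba = model.predict_proba(X_unlabeled)[:, 1]
    uncertainty = 1.0 - 2.0 * np.abs(proba - 0.5)            # 1 = most uncertain
    contrib = X_unlabeled * model.coef_[0]                    # per-feature w_j * x_j
    pos_evidence = np.clip(contrib, 0, None).sum(axis=1)      # evidence for class 1
    neg_evidence = -np.clip(contrib, None, 0).sum(axis=1)     # evidence for class 0
    total_evidence = pos_evidence + neg_evidence

    # Among the most uncertain instances, rank by total evidence:
    # large total evidence -> conflicting evidence (UNC-CE),
    # small total evidence -> insufficient evidence (UNC-IE).
    uncertain_idx = np.argsort(-uncertainty)[: 5 * n_pick]
    order = np.argsort(-total_evidence[uncertain_idx])
    unc_ce = uncertain_idx[order[:n_pick]]
    unc_ie = uncertain_idx[order[-n_pick:]]
    return unc_ce, unc_ie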
“…The evidence of an uncertain instance is estimated and used in a batch or pool-based setting in [7]. In the data stream setting, however, it is impossible to store every instance of the stream, compute the evidence of all unlabeled instances for ranking, and then pick the unlabeled instances that have higher evidence and lower confidence than the others.…”
Section: Algorithm 1 Esplit (mentioning, confidence: 99%)
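One way to make the query decision on the fly, without storing or ranking the stream, is sketched below; this is only an illustration using running-mean thresholds and is not the Esplit algorithm from the citing paper.

# Illustrative streaming selection (an assumption-laden sketch, not Esplit):
# keep running means of uncertainty and evidence instead of ranking a stored
# pool, and query an arriving instance if it is both more uncertain and
# carries more evidence than the average seen so far.
class StreamingEvidenceSampler:
    def __init__(self, budget: int):
        self.budget = budget
        self.n = 0
        self.mean_uncertainty = 0.0
        self.mean_evidence = 0.0

    def should_query(self, uncertainty: float, evidence: float) -> bool:
        # Incrementally update running means (no instance storage needed).
        self.n += 1
        self.mean_uncertainty += (uncertainty - self.mean_uncertainty) / self.n
        self.mean_evidence += (evidence - self.mean_evidence) / self.n
        if self.budget <= 0:
            return False
        if uncertainty > self.mean_uncertainty and evidence > self.mean_evidence:
            self.budget -= 1
            return True
        return False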