2009
DOI: 10.1016/j.compbiomed.2009.02.002
Hedged predictions for traditional Chinese chronic gastritis diagnosis with confidence machine

Cited by 17 publications (8 citation statements). References 22 publications.
“…De Lobel et al [54] have used RF as a pre-screening method to remove noisy SNPs before multifactor-dimensionality reduction in genetic association studies. Additionally, RF has been incorporated in a transductive confidence machine [95], a framework that allows the prediction of classifiers to be complemented with a confidence value that can be set by the user prior to classification [103]. …”
Section: RF in the Life Sciences (mentioning; confidence: 99%)
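The transductive confidence machine described above complements each prediction with a user-settable confidence value. A minimal sketch of the underlying mechanism, assuming the standard conformal p-value construction (not code from the cited papers): the p-value of a test example is the fraction of calibration examples that are at least as nonconforming as it, so conforming examples receive high p-values.

```python
# Hedged sketch of the conformal p-value computation, under the usual
# assumption that higher nonconformity scores mean stranger examples.

def conformal_p_value(calibration_scores, test_score):
    """Fraction of scores (including the test one) at least as
    nonconforming as the test example."""
    n = len(calibration_scores)
    at_least = sum(1 for s in calibration_scores if s >= test_score)
    return (at_least + 1) / (n + 1)

scores = [0.9, 0.7, 0.8, 0.95, 0.6]
print(conformal_p_value(scores, 0.5))   # conforming example -> 1.0
print(conformal_p_value(scores, 0.99))  # strange example -> 1/6
```

A label is then included in the prediction region whenever its p-value exceeds the user-chosen significance level, which is how the confidence value "set by the user prior to classification" enters.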
“…Thus the corresponding nonconformity score will be rather small, and vice versa. Therefore, the nonconformity measurement can exactly reflect the nonconformity of the example [36], [37].…”
Section: Methods (mentioning; confidence: 99%)
“…The method which combines CP and RF, namely CP-RF, was firstly proposed to deal with single-label classification problem in our previous work [36]. Unfortunately, CP-RF cannot be directly applied to syndrome differentiation of CF for it is a MLL problem.…”
Section: Introduction (mentioning; confidence: 99%)
“…Considering outlyingness of an example, in this way, relative to all cases within a class, however, does not make adequate use of local neighborhood patterns (where, arguably, tree based models derive much of their advantage from). The authors, in another paper [25], propose a nonconformity measure defined similarly to k-nearest neighbor conformal predictors [17,22], but with distance determined by inverse RF proximity. Reference [8] suggests a simple and intuitive measure of nonconformity based on the proportion of trees that correctly classify an example, and compares this with the k-proximity measure of [25].…”
Section: Random Forests (mentioning; confidence: 99%)
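The tree-vote measure attributed to reference [8] above can be sketched in a few lines. This is a hedged illustration, not the cited implementation: the "forest" is stood in for by a plain list of per-tree predicted labels, and the nonconformity of an example under a hypothesised label is one minus the proportion of trees voting for that label.

```python
# Hedged sketch of a tree-vote nonconformity measure: the more trees
# that vote for the hypothesised label, the more conforming the example.

def tree_vote_nonconformity(tree_predictions, label):
    """1 minus the proportion of trees voting for `label`."""
    votes = sum(1 for p in tree_predictions if p == label)
    return 1.0 - votes / len(tree_predictions)

# Hypothetical per-tree outputs for one example:
preds = ["gastritis", "healthy", "gastritis", "gastritis"]
print(tree_vote_nonconformity(preds, "gastritis"))  # -> 0.25
```

The k-proximity alternative of [25] would instead measure distance to neighbours via inverse RF proximity, exploiting the local neighbourhood structure the quote argues the class-wide view misses.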
“…The development of conformal prediction for k-nearest neighbors and support vector machines as underlying learning algorithms has been described in [22]. It has been applied with neural networks [16], regression, and recent work considers its use with random forests [8,25,26]. Most of this work uses CP in a transductive inference setting, which requires model re-learning for each new example to be predicted, and can be computationally burdensome.…”
Section: Introduction (mentioning; confidence: 99%)
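The computational burden of transductive CP noted above is what inductive (split) conformal prediction avoids: the model is trained once, a held-out calibration set supplies nonconformity scores, and each new example needs only score comparisons. A minimal sketch, assuming precomputed nonconformity scores (the model and scoring function are left abstract):

```python
# Hedged sketch of inductive conformal prediction: no re-learning per
# test example, only a comparison against fixed calibration scores.

def inductive_prediction_region(calib_scores, candidate_scores, epsilon):
    """Return candidate labels whose conformal p-value exceeds the
    significance level epsilon."""
    n = len(calib_scores)
    region = []
    for label, s in candidate_scores.items():
        p = (sum(1 for c in calib_scores if c >= s) + 1) / (n + 1)
        if p > epsilon:
            region.append(label)
    return region

calib = [0.2, 0.4, 0.1, 0.3]          # scores from a held-out calibration set
cands = {"A": 0.15, "B": 0.9}         # hypothetical per-label test scores
print(inductive_prediction_region(calib, cands, 0.2))  # -> ['A']
```

The trade-off is statistical: the transductive setting uses all available data for every prediction, while the inductive split sacrifices some calibration data for a large gain in speed.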