2019
DOI: 10.1016/j.patcog.2018.11.011
Random forest dissimilarity based multi-view learning for Radiomics application

Cited by 59 publications (48 citation statements)
References 43 publications
“…Prior to kernel construction, the forests of Δ* are re-grown with 2000 trees each. A higher number of trees and a smaller leaf size tend to provide a better approximation to the class discrimination boundary, as demonstrated in [10].…”
Section: Aklimate Hyperparameters (mentioning)
confidence: 94%
“…is still equal to 0, but in the opposite situation the resulting value lies in (0, 1]. A second variant [5], noted RFD in the following, relies on a measure of instance hardness, namely the κ-Disagreeing Neighbors (κDN) measure [21], which estimates the intrinsic difficulty of predicting an instance as follows:…”
Section: Using Random Forest For Measuring Dissimilarities (mentioning)
confidence: 99%
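The κ-Disagreeing Neighbors (κDN) measure mentioned in the excerpt above is the fraction of an instance's κ nearest neighbors that carry a different class label. A minimal sketch of that idea, assuming scikit-learn, the iris dataset, and κ = 5 as illustrative choices (none of these are specified by the cited papers):

```python
# Sketch of the κ-Disagreeing Neighbors (κDN) instance-hardness measure:
# for each instance, the fraction of its k nearest neighbors whose class
# label differs from its own. Dataset and k are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neighbors import NearestNeighbors

def kdn(X, y, k=5):
    # Ask for k+1 neighbors: the nearest neighbor of each training point
    # is the point itself, which we drop below.
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)
    neigh = idx[:, 1:]                         # drop the self-neighbor
    # Fraction of the k neighbors disagreeing with the instance's own label
    return (y[neigh] != y[:, None]).mean(axis=1)

X, y = load_iris(return_X_y=True)
hardness = kdn(X, y)
print(hardness.shape)   # (150,)
```

Instances with κDN near 0 sit deep inside their class region, while values near 1 flag instances surrounded by the other classes, i.e. intrinsically hard to predict.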
“…Any of these variants could be used to compute the dissimilarities in our framework. However, we choose to use the RFD variant in the following, since it has been shown to give very good results when used for building dissimilarity representations for multi-view learning [5].…”
Section: Using Random Forest For Measuring Dissimilarities (mentioning)
confidence: 99%