2018 IEEE International Conference on Data Mining (ICDM) 2018
DOI: 10.1109/icdm.2018.00158
Improving Deep Forest by Confidence Screening

Abstract: Most studies about deep learning are based on neural network models, where many layers of parameterized nonlinear differentiable modules are trained by backpropagation. Recently, it has been shown that deep learning can also be realized by non-differentiable modules without backpropagation training called deep forest. The developed representation learning process is based on a cascade of cascades of decision tree forests, where the high memory requirement and the high time cost inhibit the training of large mo…
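The core idea behind the paper's confidence screening can be sketched compactly: at each cascade layer, instances whose class-probability vector exceeds a confidence threshold are finalized immediately, and only low-confidence instances flow to deeper layers, cutting memory and time. The sketch below is a hedged illustration of that idea, not the paper's reference implementation; the function name, threshold, and forest sizes are illustrative assumptions.

```python
# Minimal sketch of a cascade of forests with confidence screening.
# Assumption: a single random forest per layer stands in for the paper's
# ensemble of forests; the screening logic is the point of interest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def cascade_with_screening(X, y, n_layers=3, threshold=0.9, seed=0):
    final_pred = np.full(len(X), -1)   # -1 = not yet decided
    active = np.arange(len(X))         # indices still flowing through the cascade
    features = X.copy()
    layers = []
    for layer in range(n_layers):
        rf = RandomForestClassifier(n_estimators=50, random_state=seed)
        rf.fit(features, y[active])
        proba = rf.predict_proba(features)
        conf = proba.max(axis=1)
        # screen out high-confidence instances; the last layer decides the rest
        sure = conf >= threshold if layer < n_layers - 1 else np.ones(len(active), bool)
        final_pred[active[sure]] = rf.classes_[proba[sure].argmax(axis=1)]
        layers.append(rf)
        if sure.all():
            break
        # low-confidence instances continue, augmented with class vectors
        active = active[~sure]
        features = np.hstack([X[active], proba[~sure]])
    return layers, final_pred
```

Because screened instances never reach deeper layers, later forests train on progressively smaller (and harder) subsets, which is the source of the memory and time savings the abstract refers to.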

Cited by 44 publications (26 citation statements)
References 15 publications (19 reference statements)
“…MDForest consists of two parts: multi-dimensional and multi-grained scanning and dense cascade forest. 32,33 Multi-dimensional and multi-grained scanning realizes the re-representation process of features. The multi-scale features are captured by a multi-grained scanning mechanism to enhance the representation learning ability of the cascade forest.…”
Section: Proposed Methods
Mentioning confidence: 99%
“…Different gene expression measurements are strongly correlated, which means it is better not to discretize them when classifying cancer subtypes on gene expression data; discretization leads various classifiers (such as decision trees and random forests) to perform poorly in the classification of cancer subtypes [32]. FNT is a new form of neural network with many advantages that traditional deep neural network models do not have (for example, the structure and parameters of the model can be optimized automatically, and feature selection can be performed automatically).…”
Section: Methods
Mentioning confidence: 99%
“…The resulting problem is that the feature vector produced by multi-grained scanning is dense, which reduces the efficiency of subsequent Cascade Forest training and inference and increases memory consumption and running time. To address this, we adopt the operation of Pang [33], randomly sampling the multi-grained scanning features to improve training efficiency.…”
Section: B. Sub-sampling Multi-grained Scanning
Mentioning confidence: 99%
“…To evaluate the performance of RES-gcForest, we compared it with other existing techniques, including Cascade Forest, Random Forest (RF) [33], Neural Network (NN), Convolutional Neural Network (CNN), gcForest, and ResNet.…”
Section: Comparison Model
Mentioning confidence: 99%