2020
DOI: 10.1016/j.measurement.2020.107885
An efficient Pearson correlation based improved random forest classification for protein structure prediction techniques



Cited by 30 publications (11 citation statements); references 17 publications.
“…Hence, a coefficient value closer to 0 implies a weaker correlation, while a coefficient value of zero implies no correlation. The Pearson coefficient [50], denoted p_c, is defined as:…”
Section: Feature Selection
Confidence: 99%
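The excerpt above is cut off before the formula it introduces. For reference, the standard Pearson correlation coefficient it denotes as p_c, written here for two generic feature vectors x and y of length n (notation assumed, not taken from the cited paper), is:

p_c = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n} (x_i - \bar{x})^{2}}\,\sqrt{\sum_{i=1}^{n} (y_i - \bar{y})^{2}}}

Values near +1 or -1 indicate a strong positive or negative linear relationship, consistent with the quoted statement that values near 0 indicate weak or no correlation.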
“…In recent years, ensemble learning has been used to combine the prediction results of multiple single learning models to obtain more accurate, stable and robust final results. For example, ensemble learning models such as Boosting, Bagging and RF have been proposed one after another and applied to various types of data sets [29,30]. In this study, building on the ensemble learning approach, six common machine learning classification algorithms were selected for comparative experiments, including RF, multi-label naive Bayes (NB), multi-label SVM, multi-label logistic regression (LR), and GCForest.…”
Section: Discussion
Confidence: 99%
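As a rough illustration of the kind of comparative experiment described above, the following minimal Python sketch cross-validates several of the named classifiers with scikit-learn on synthetic placeholder data. It is not the cited study's multi-label setup, and GCForest is omitted because it is not part of scikit-learn.

# Minimal, hypothetical comparison of RF, NB, SVM and LR classifiers.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic placeholder data standing in for the real feature matrix.
X, y = make_classification(n_samples=500, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)

models = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "NB": GaussianNB(),
    "SVM": SVC(kernel="rbf"),
    "LR": LogisticRegression(max_iter=1000),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold accuracy
    print(f"{name}: mean accuracy = {scores.mean():.3f}")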
“…The standard Brownian motion [31] is a stochastic process for which the step is given by a probability function defined by the zero mean (µ = 0) and unit variance (σ² = 1) of the normal (Gaussian) distribution. The control density function of the motion at point x is as follows:…”
Section: Exploratory Phase of High-Speed Ratio
Confidence: 99%
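The quoted passage is truncated before the density function it announces. Given the stated zero mean and unit variance, the control density function is presumably the standard normal density (a reconstruction assuming the cited paper uses the textbook form):

f(x) = \frac{1}{\sqrt{2\pi}} \exp\!\left(-\frac{x^{2}}{2}\right)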
“…To use the minimum number of trees for classification, Paul et al. [30] proposed an improved random forest classifier based on the numbers of important and unimportant features, which iteratively removes some of the unimportant features. To improve protein structure prediction performance, Kalaiselvi et al. [31] introduced an improved random forest classification technique based on weighted Pearson correlation (WPC-IRFC), which achieves higher accuracy in a shorter time. Because SVMs do not adequately consider the distinction between numerical and nominal attributes, Peng et al. [32] proposed a novel SVM algorithm for heterogeneous data learning, which embeds nominal attributes into the real space by minimizing the estimated generalization error.…”
Section: Introduction
Confidence: 99%
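For orientation only, the sketch below illustrates the general idea behind a Pearson-correlation-based feature filter followed by a random forest classifier. The correlation scoring, the median threshold and the synthetic data are illustrative assumptions; this is not the WPC-IRFC algorithm of Kalaiselvi et al. [31].

# Hypothetical sketch: rank features by |Pearson correlation| with the label,
# keep the more correlated half, then train a random forest on the subset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Placeholder data standing in for protein structure features.
X, y = make_classification(n_samples=500, n_features=30, n_informative=10,
                           random_state=0)

# Absolute Pearson correlation of each feature with the class label
# (an illustrative relevance measure, not the paper's weighted variant).
scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
keep = scores >= np.median(scores)  # arbitrary threshold: keep the top half

X_train, X_test, y_train, y_test = train_test_split(X[:, keep], y, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("test accuracy:", rf.score(X_test, y_test))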