2018
DOI: 10.1016/j.patcog.2018.07.009

Multi-view label embedding

Cited by 52 publications (23 citation statements, 2019–2024) · References 14 publications

“…Feature fusion can boost the recognition performance by combining the complementary information of different features (Zhu et al., 2016, 2018c). A 588-dimensional feature set was obtained by combining the 188- and 400-dimensional feature sets, and a 661-dimensional feature set was obtained by combining the 188- and 473-dimensional feature sets.…”
Section: Methods (mentioning)
Confidence: 99%
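As a rough illustration of the concatenation-based fusion described in the statement above, here is a minimal NumPy sketch; the sample count and the 188-/400-dimensional shapes are placeholders standing in for the cited feature sets, not code from that work.

import numpy as np

# Minimal sketch, assuming two complementary feature matrices for the same
# n samples; early fusion is a column-wise concatenation (188 + 400 -> 588).
n_samples = 100
features_a = np.random.rand(n_samples, 188)  # placeholder 188-dimensional set
features_b = np.random.rand(n_samples, 400)  # placeholder 400-dimensional set

fused = np.hstack([features_a, features_b])  # fused 588-dimensional representation
print(fused.shape)  # (100, 588)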
“…As discussed in previous studies, feature extraction is a key step in constructing a computational predictor in bioinformatics. The primary structure of a protein is represented as a sequence of letters, each of which denotes an amino acid residue.…”
Section: Methods (mentioning)
Confidence: 99%
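To make the "sequence of letters" representation concrete, here is a hedged sketch of one common sequence descriptor, amino acid composition; the function name and toy sequence are illustrative and do not come from the cited study.

from collections import Counter

# The 20 standard one-letter amino acid codes.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def amino_acid_composition(sequence):
    """Return the 20-dimensional relative frequency of each residue letter."""
    counts = Counter(sequence.upper())
    total = max(len(sequence), 1)
    return [counts.get(aa, 0) / total for aa in AMINO_ACIDS]

# Toy protein sequence represented as a string of residue letters.
features = amino_acid_composition("MKTAYIAKQR")
print(len(features))  # 20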
“…SVM has been widely used in bioinformatics for its excellent classification performance. SVM transforms the data into a high-dimensional space and then finds the optimal separating hyperplane among the data marked with different labels.…”
Section: Methods (mentioning)
Confidence: 99%
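A minimal scikit-learn sketch of the kind of SVM classifier the statement describes; the RBF kernel, hyperparameters, and synthetic data are assumptions, not details taken from the cited study.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic data standing in for extracted feature vectors with binary labels.
X, y = make_classification(n_samples=200, n_features=50, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The RBF kernel implicitly maps the data into a high-dimensional space,
# where the SVM seeks the maximum-margin separating hyperplane.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))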
“…RF is implemented by constructing a large number of decision trees during training and outputting the class that is the mode of the classes of the individual trees [55]. The RF algorithm behaves like an ensemble algorithm [74,75]; it consists of decision trees, each of which is grown on a subset of features selected by a stochastic feature selection technique. The number of features used per tree is determined by several factors, including the generalization error, classifier strength, and tree dependence.…”
Section: Random Forest (mentioning)
Confidence: 99%
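A short scikit-learn sketch of the random forest behaviour described above: many trees, each split drawn from a random feature subset, with the forest predicting the modal class of its trees. The synthetic data and hyperparameter values are assumptions, not settings from the cited work.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic data standing in for the extracted feature vectors.
X, y = make_classification(n_samples=200, n_features=50, random_state=0)

# Each tree is grown on a bootstrap sample; at every split only a random
# subset of features (max_features) is considered. The forest outputs the
# majority (modal) class over its trees.
rf = RandomForestClassifier(n_estimators=500, max_features="sqrt", random_state=0)
rf.fit(X, y)
print(rf.predict(X[:5]))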