2009
DOI: 10.1002/jcc.21230
Multiple classifier integration for the prediction of protein structural classes

Abstract: Supervised classifiers, such as artificial neural networks, partition trees, and support vector machines, are often used for the prediction and analysis of biological data. However, choosing an appropriate classifier is not straightforward because each classifier has its own strengths and weaknesses, and each biological dataset has its own characteristics. By integrating many classifiers together, people can avoid the dilemma of choosing an individual classifier out of many to achieve an optimized classification…
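The integration idea described in the abstract can be illustrated with a simple majority-voting ensemble. The sketch below is an illustration only, not the authors' method: the synthetic feature matrix X, the labels y, and the specific base classifiers and hyperparameters are assumptions chosen for demonstration.

```python
# Minimal sketch of multiple classifier integration via majority voting.
# Assumes protein sequences have already been encoded as fixed-length
# feature vectors; here a synthetic dataset stands in for real data.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Stand-in data: 4 classes, loosely mirroring the four structural classes
X, y = make_classification(n_samples=300, n_features=20, n_informative=10,
                           n_classes=4, random_state=0)

# Three heterogeneous base classifiers combined by hard (majority) voting
ensemble = VotingClassifier(
    estimators=[
        ("svm", SVC(kernel="rbf", gamma="scale")),
        ("tree", DecisionTreeClassifier(max_depth=5)),
        ("mlp", MLPClassifier(hidden_layer_sizes=(50,), max_iter=1000)),
    ],
    voting="hard",
)

# Cross-validated accuracy of the integrated classifier
print(cross_val_score(ensemble, X, y, cv=5).mean())
```

Hard voting is only one way to combine the base learners; weighted or probability-based fusion are equally valid choices under the same framework.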

Cited by 38 publications (26 citation statements)
References 26 publications
“…Additionally, multiclass cancer classification is possible in this framework by extending the binary NPPC to multiclass NPPC. The proposed NPPC ensemble framework may well be of interest to others for noisy data sets in the fields of machine learning and computational biology, such as detection of horizontal gene transfer in bacterial genomes [88], classification of functional classes of proteins [89], classification of the nature of infectious diseases [90], diagnosis of genetic abnormalities [91], and other important areas of medical diagnostics.…”
Section: Results
confidence: 99%
“…These algorithms include neural networks (Cai and Zhou 2000), support vector machines (SVM) (Cai et al. 2001; Chen et al. 2006a; Li et al. 2008; Qiu et al. 2009), fuzzy k-nearest neighbor (Zhang et al. 2008), fuzzy clustering (Shen et al. 2005), Bayesian classification (Wang and Yuan 2000), logistic regression (Kurgan and Chen 2007; Kurgan and Homaeian 2006), rough sets (Cao et al. 2006) and classifier fusion techniques (Cai et al. 2006; Chen et al. 2006b, 2009; Feng et al. 2005; Kedarisetti et al. 2006). Among them, SVM is the most popular and the best-performing classifier for this task (Kurgan et al. 2008a).…”
Section: Introduction
confidence: 98%
“…For convenience, the PSSM is denoted as […]; the next step is to feed these features to an appropriate classification algorithm to efficiently and accurately predict the structural class. To date, many machine-learning algorithms have been proposed for this purpose, such as neural networks [21], support vector machines (SVM) [22-25], fuzzy clustering [26], fuzzy k-nearest neighbor [27,28], Bayesian classification [29], logistic regression [30], rough sets [31] and classifier fusion techniques [32-36]. Among the aforementioned classification algorithms, SVM is the most reliable and has attained excellent performance on the SCOP problem [19].…”
Section: Position-specific Scoring Matrix
confidence: 99%
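As a concrete illustration of the pipeline this excerpt describes (PSSM-derived features fed to an SVM), the sketch below averages each of the 20 PSSM score columns into a fixed-length vector and trains an SVM on it. The random PSSM arrays, the pssm_to_features helper, and the column-average descriptor are placeholders chosen for simplicity; the cited papers use richer PSSM-derived features.

```python
# Minimal sketch: PSSM -> fixed-length features -> SVM classifier.
# Assumes each PSSM is an L x 20 NumPy array (rows = sequence positions,
# columns = amino-acid types); the data below is random stand-in data.
import numpy as np
from sklearn.svm import SVC

def pssm_to_features(pssm: np.ndarray) -> np.ndarray:
    """Average each of the 20 score columns over all sequence positions."""
    return pssm.mean(axis=0)

# Hypothetical training set: 100 random PSSMs of varying length, 4 classes
rng = np.random.default_rng(0)
pssms = [rng.normal(size=(rng.integers(50, 300), 20)) for _ in range(100)]
labels = rng.integers(0, 4, size=100)

X = np.vstack([pssm_to_features(p) for p in pssms])
clf = SVC(kernel="rbf", gamma="scale").fit(X, labels)
print(clf.predict(X[:5]))
```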