2005
DOI: 10.1016/j.bbrc.2005.06.075
Boosting classifier for predicting protein domain structural class

Abstract: A novel classifier, the so-called "LogitBoost" classifier, was introduced to predict the structural class of a protein domain according to its amino acid sequence. LogitBoost is characterized by a log-likelihood loss function that reduces sensitivity to noise and outliers, and by performing classification through the combination of many weak classifiers into a single strong and robust classifier. It was demonstrated through jackknife cross-validation tests that LogitBoost outperformed other clas…
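The abstract's description of LogitBoost (boosting under a log-likelihood loss by combining many weak learners) can be sketched for the binary case. This is a minimal illustration of the generic LogitBoost algorithm of Friedman, Hastie, and Tibshirani, not the paper's exact multi-class setup; the stump weak learner, round count, and clipping constants are assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def logitboost_fit(X, y, n_rounds=20):
    """Binary LogitBoost sketch: each round takes a Newton step on the
    log-likelihood loss by fitting a regression stump to the working
    response z with weights w = p(1-p).  y must be in {0, 1}."""
    n = len(y)
    F = np.zeros(n)          # additive model score
    p = np.full(n, 0.5)      # current class-1 probabilities
    stumps = []
    for _ in range(n_rounds):
        w = np.clip(p * (1 - p), 1e-10, None)   # Newton weights
        z = np.clip((y - p) / w, -4, 4)         # working response, stabilised
        stump = DecisionTreeRegressor(max_depth=1)
        stump.fit(X, z, sample_weight=w)
        F += 0.5 * stump.predict(X)
        p = 1.0 / (1.0 + np.exp(-2.0 * F))      # update probabilities
        stumps.append(stump)
    return stumps

def logitboost_predict(stumps, X):
    """Classify by the sign of the accumulated additive score."""
    F = sum(0.5 * s.predict(X) for s in stumps)
    return (F > 0).astype(int)
```

The log-likelihood loss grows only linearly for badly misclassified points (unlike AdaBoost's exponential loss), which is the source of the noise and outlier robustness the abstract mentions.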

Cited by 123 publications (54 citation statements)
References 54 publications
“…These algorithms include neural network (Cai and Zhou 2000), support vector machine (SVM) (Cai et al. 2001; Chen et al. 2006a; Li et al. 2008; Qiu et al. 2009), fuzzy k-nearest neighbor (Zhang et al. 2008), fuzzy clustering (Shen et al. 2005), Bayesian classification (Wang and Yuan 2000), logistic regression (Kurgan and Chen 2007; Kurgan and Homaeian 2006), rough sets (Cao et al. 2006) and classifier fusion techniques (Cai et al. 2006; Chen et al. 2006b, 2009; Feng et al. 2005; Kedarisetti et al. 2006). Among them, SVM is the most popular and the best-performing classifier for this task (Kurgan et al. 2008a).…”
Section: Introduction
confidence: 98%
“…SVM is a new paradigm of learning system. The SVM technique, developed by Vapnik [15], is a powerful and widely used method for solving supervised biological classification problems, owing to its generalization ability [16–18]. In essence, SVM classifiers maximize the margin between the training data and the decision boundary (the optimal separating hyperplane), which can be formulated as a quadratic optimization problem in a feature space.…”
Section: Support Vector Machine
confidence: 99%
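The margin-maximization view in the quoted passage can be illustrated with a linear SVM on a toy two-feature dataset (a hypothetical example, not the protein-sequence features used in the cited work); after fitting, the geometric margin width is 2/‖w‖ and only the boundary-nearest points become support vectors.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical separable two-class data: three points per class.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
              [3.0, 3.0], [4.0, 3.0], [3.0, 4.0]])
y = np.array([0, 0, 0, 1, 1, 1])

# Large C approximates the hard-margin quadratic program on separable data.
clf = SVC(kernel="linear", C=10.0)
clf.fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]
margin = 2.0 / np.linalg.norm(w)   # geometric margin width
print("support vectors:", clf.support_vectors_)
print("margin width:", margin)
```

For nonlinear problems the same quadratic program is solved in a kernel-induced feature space, which is what "in a feature space" refers to in the quote.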
“…[13] Recently, predicted secondary structural sequences (PSSF) were successfully used in the prediction of protein structural class [14–17]. In addition to these representations of protein information, various classification algorithms have already been introduced into structural class prediction methods, including the component-coupled algorithm [9], support vector machine (SVM) [18], rough sets [19], and LogitBoost [20]; more details on structural class prediction methods were reviewed by Chou.…”
Section: Introduction
confidence: 99%