2011
DOI: 10.4137/ebo.s7931
Profiles and Majority Voting-Based Ensemble Method for Protein Secondary Structure Prediction

Abstract: Machine learning techniques have been widely applied to solve the problem of predicting protein secondary structure from the amino acid sequence. They have gained substantial success in this research area. Many methods have been used including k-Nearest Neighbors (k-NNs), Hidden Markov Models (HMMs), Artificial Neural Networks (ANNs) and Support Vector Machines (SVMs), which have attracted attention recently. Today, the main goal remains to improve the prediction quality of the secondary structure elements. Th…
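The abstract describes combining several per-residue classifiers with majority voting. As a rough illustration of that general idea only, here is a minimal Python sketch of a per-residue majority vote; the three prediction strings are invented examples, and the tie-breaking order is an arbitrary choice, not taken from the paper.

```python
from collections import Counter

def majority_vote(per_model_predictions):
    """Combine per-residue secondary structure predictions (one H/E/C
    string per model) by taking the most frequent state at each position.
    Ties are broken in the arbitrary order H, E, C."""
    tie_break = {"H": 0, "E": 1, "C": 2}
    consensus = []
    for states in zip(*per_model_predictions):
        counts = Counter(states)
        best = max(counts, key=lambda s: (counts[s], -tie_break[s]))
        consensus.append(best)
    return "".join(consensus)

# Three hypothetical classifier outputs for the same sequence
print(majority_vote(["HHHHEECCC", "HHHCEECCC", "HHHHEEECC"]))  # -> HHHHEECCC
```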

Cited by 17 publications (16 citation statements)
References 48 publications
“…The so-called "Ideal Fold Selection" (IFS) predictor thus assembles the 7 best predicted folds to form the predicted conformational states of the entire dataset. Furthermore, a main condition for obtaining coherent secondary structures is that runs of consecutive H states must be at least 3 residues long and runs of consecutive E states at least 2. To eliminate unrealistic structures, the resulting predictions for each conformational state are then refined by applying the heuristic-based filter used in Bouziane et al (2011). The results are reported in Table 9.…”
Section: Improving the Consensus Results
Mentioning confidence: 99%
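The heuristic-based filter itself is described in Bouziane et al. (2011) and is not reproduced here; the sketch below only illustrates the quoted length constraint, under the assumption that H runs shorter than 3 residues and E runs shorter than 2 residues are relabeled as coil (C).

```python
import re

def filter_short_segments(states, min_h=3, min_e=2):
    """Relabel helix (H) runs shorter than min_h and strand (E) runs
    shorter than min_e as coil (C), removing unrealistically short
    segments from a predicted secondary structure string."""
    def relabel(match, min_len):
        run = match.group(0)
        return run if len(run) >= min_len else "C" * len(run)

    states = re.sub(r"H+", lambda m: relabel(m, min_h), states)
    states = re.sub(r"E+", lambda m: relabel(m, min_e), states)
    return states

print(filter_short_segments("CHHCEEECHC"))  # -> CCCCEEECCC
```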
“…The claimed Q3 score varies between 75 and 80%, depending on the benchmark datasets used (Bouziane et al 2011). Presumably, all existing machine learning algorithms have been applied to the PSSP problem, and further improvements of a few percentage points are still required.…”
Section: Protein Secondary Structure Prediction Problem
Mentioning confidence: 99%
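The Q3 measure referred to above is the percentage of residues whose predicted three-state label (H, E or C) matches the observed one; a minimal sketch with made-up example strings:

```python
def q3_score(predicted, observed):
    """Percentage of residues whose predicted three-state label
    (H, E or C) matches the observed label."""
    assert len(predicted) == len(observed)
    correct = sum(p == o for p, o in zip(predicted, observed))
    return 100.0 * correct / len(observed)

print(q3_score("HHHHEECCC", "HHHCEECCC"))  # -> ~88.9 (8 of 9 residues correct)
```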
“…All members of the GPCR superfamily contain seven highly conserved transmembrane (7TM) regions characterized by hydrophobicity; these 7TM regions can be identified by Hidden Markov Models (HMMs) and machine-learning methods [ 68 ]. Research on GPCR structure has revealed that the classical sequence contains the seven transmembrane segments [TM1–7], three extracellular loops [EL1–3], three intracellular loops [IL1–3], and the protein termini.…”
Section: Discussion
Mentioning confidence: 99%
“…An ensemble method combines the outputs of individual homogeneous classifiers applied to the given large datasets. The output of each ensemble member is then selected and combined using the majority voting rule [39]. DMLP is a multilayer feed-forward artificial neural network that maps input vectors to output vectors.…”
Section: Ensemble Incremental Deep Multiple Layer Perceptron (EIDMLP)
Mentioning confidence: 99%
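As a rough illustration of the scheme this statement describes (homogeneous feed-forward networks combined by majority voting), the sketch below uses scikit-learn's MLPClassifier and VotingClassifier on synthetic data; the library choice, toy data, and network sizes are assumptions for illustration and do not reproduce the EIDMLP of the citing work.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import VotingClassifier

# Synthetic toy data: 200 samples with 20 features and binary labels
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Three homogeneous feed-forward networks with different initialisations
members = [(f"mlp{i}", MLPClassifier(hidden_layer_sizes=(32,),
                                     max_iter=500, random_state=i))
           for i in range(3)]

# Hard voting = majority voting over the members' predicted labels
ensemble = VotingClassifier(estimators=members, voting="hard")
ensemble.fit(X, y)
print(ensemble.score(X, y))
```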