2000
DOI: 10.1093/protein/13.1.15

Is it better to combine predictions?

Abstract: We have compared the accuracy of the individual protein secondary structure prediction methods PHD, DSC, NNSSP and Predator against the accuracy obtained by combining the predictions of the methods. A range of ways of combining predictions was tested: voting, biased voting, linear discrimination, neural networks and decision trees. The combined methods that involve 'learning' (the non-voting methods) were trained using a set of 496 non-homologous domains; this dataset was biased as some of the secondary structur…
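The simplest combiner named in the abstract is per-residue voting over the three secondary-structure states. The sketch below illustrates that idea in Python; the predictor names echo the methods compared in the paper, but the sequences, and the assumption of a shared H/E/C alphabet with equal-length outputs, are illustrative only, not the paper's data or code.

```python
from collections import Counter

# Hypothetical per-residue predictions over the three-state alphabet
# (H = helix, E = strand, C = coil). The method names echo those in the
# paper; the strings themselves are made up for illustration.
predictions = {
    "PHD":      "HHHHCCEEEECC",
    "DSC":      "HHHHCCEEEECC",
    "NNSSP":    "HHHCCCEEEECC",
    "Predator": "HHHHCCCEEECC",
}

def majority_vote(preds):
    """Combine predictions by simple (unbiased) per-residue voting."""
    length = len(next(iter(preds.values())))
    combined = []
    for i in range(length):
        votes = Counter(seq[i] for seq in preds.values())
        combined.append(votes.most_common(1)[0][0])  # ties broken arbitrarily
    return "".join(combined)

print(majority_vote(predictions))  # -> "HHHHCCEEEECC"
```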

Cited by 38 publications (18 citation statements) · References 30 publications · Citing publications span 2001–2019
“…43,51 Numerous algorithms, available through WEB servers, predict the location of secondary structure elements on the basis of the amino acid sequence, and some algorithms combine multiple methods. 52,53 Here again, we used a simple voting method for combining predictions from different algorithms and for defining consensus locations of predicted secondary structures (SS-score).…”
Section: Results (mentioning)
Confidence: 99%
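The excerpt above does not define its SS-score; one plausible reading, sketched here purely as an assumption, scores each residue by the fraction of methods agreeing on the plurality state and calls a position a consensus location when that fraction clears a threshold. The function name and the 0.75 cut-off are hypothetical, not taken from the citing paper.

```python
from collections import Counter

def ss_score(preds, threshold=0.75):
    """Per-residue agreement fraction across predictors (a guess at the
    excerpt's SS-score, not the authors' definition). Returns, for each
    residue, (plurality state, agreement fraction, is consensus location)."""
    n = len(preds)
    length = len(next(iter(preds.values())))
    out = []
    for i in range(length):
        state, count = Counter(seq[i] for seq in preds.values()).most_common(1)[0]
        frac = count / n
        out.append((state, frac, frac >= threshold))
    return out
```

Runs of consecutive consensus residues in the same state would then delimit the predicted secondary-structure elements.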
“…Another feature of GPs is that, because of the stochastic way in which they are initiated and evolve, they are not deterministic. This said, it is possible to turn such properties to advantage by running the GP several times, as there is plenty of evidence that combining or voting among several independent solutions to a problem can give improved learning (Drucker et al., 1994; Bauer and Kohavi, 1999; Dietterich, 2000a, 2000b; Friedman et al., 2000; King et al., 2000). Finally, all computer-intensive methods of this type, including those based on purely multivariate statistical strategies (Martens and Naes, 1989), have a great many degrees of freedom, which require that we provide a careful evaluation with respect to the reality of the solutions found (Chatfield, 1995).…”
Section: Evolving Simple Answers to Complex Questions of Functional G… (mentioning)
Confidence: 99%
“…Further improvements in the performance of PSSP were also achieved by exploiting evolutionary information via multiple sequence alignment (MSA) profiles [10], position-specific scoring matrices (PSSM) [15], homology detection using hidden Markov models [16] and PSI-BLAST [17]. Other approaches that have been applied to PSSP are support vector machines (SVM) [18], [19] and ensemble methods that combine several machine learning approaches [20] via majority voting or weighted majority voting techniques. These methods could obtain up to a 3% improvement in Q3 accuracy over the best individual method.…”
Section: Introduction (mentioning)
Confidence: 99%
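The weighted majority voting mentioned in this excerpt generalizes the simple vote sketched earlier by letting each predictor's vote count in proportion to a reliability weight, for example its Q3 accuracy on a validation set. A minimal sketch follows, using made-up weights and the same hypothetical prediction format as above; it is not the cited ensemble's implementation.

```python
from collections import defaultdict

def weighted_vote(preds, weights):
    """Weighted majority voting: each method's per-residue vote counts in
    proportion to its weight, so more reliable predictors win disagreements."""
    length = len(next(iter(preds.values())))
    combined = []
    for i in range(length):
        score = defaultdict(float)
        for name, seq in preds.items():
            score[seq[i]] += weights[name]
        combined.append(max(score, key=score.get))
    return "".join(combined)

# Illustrative weights only (e.g. per-method validation accuracy);
# not values from the paper.
weights = {"PHD": 0.73, "DSC": 0.70, "NNSSP": 0.71, "Predator": 0.68}
```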