2013
DOI: 10.1186/1471-2105-14-285

Using PPI network autocorrelation in hierarchical multi-label classification trees for gene function prediction

Abstract: Background: Ontologies and catalogs of gene functions, such as the Gene Ontology (GO) and MIPS-FUN, assume that functional classes are organized hierarchically, that is, general functions include more specific ones. This has recently motivated the development of several machine learning algorithms for gene function prediction that leverage this hierarchical organization, where instances may belong to multiple classes. In addition, it is possible to exploit relationships among examples, since it is plausible t…
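
To make the hierarchy constraint described in the abstract concrete, here is a minimal, illustrative Python sketch (not from the paper; the class names and the PARENTS table are made-up assumptions) that expands a gene's label set to include all ancestor classes, so that a specific annotation implies every more general one:

from collections import deque

# child -> list of parents (a tiny hypothetical GO-like fragment;
# real GO is a DAG, so a class may have several parents)
PARENTS = {
    "transport": ["biological_process"],
    "ion transport": ["transport"],
    "cation transport": ["ion transport"],
}

def ancestor_closure(labels):
    """Expand a label set upward so it satisfies the hierarchy constraint."""
    closed, queue = set(labels), deque(labels)
    while queue:
        for parent in PARENTS.get(queue.popleft(), []):
            if parent not in closed:
                closed.add(parent)
                queue.append(parent)
    return closed

# A gene annotated only with the most specific class implicitly
# carries every ancestor up to the root:
print(sorted(ancestor_closure({"cation transport"})))
# -> ['biological_process', 'cation transport', 'ion transport', 'transport']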

Cited by 45 publications (34 citation statements). References 50 publications.
Citation types: 2 supporting, 32 mentioning, 0 contrasting.
“…This is in line with recent research on gene function classification when PPI network data can be exploited [45]. …”
Section: Discussion (supporting)
confidence: 86%
“…As its name suggests, the algorithm adaptation strategy consists of adapting a traditional algorithm to handle hierarchical constraints. Masera and Blanzieri [6] created a neural network whose architecture incorporates the underlying hierarchy, making gradient updates flow from the neurons associated with the leaves up to the neurons associated with their parent nodes; Sun et al [8] proposed using Partial Least Squares to reduce both label and feature dimensionality, followed by an optimal path selection algorithm; Barros et al [17] proposed a centroid-based method in which the training data is initially clustered, and predictions are then made by measuring the distance between the new instance and all clusters, with the label set of the closest cluster given as the prediction; Borges and Nievola [31] developed a competitive neural network whose architecture replicates the hierarchy; Vens et al [2] proposed training a single Predictive Clustering Tree for the entire hierarchy; as an extension of [2], Schietgat et al [21] proposed using ensembles of Predictive Clustering Trees; Stojanova et al [18] proposed a slight modification of Predictive Clustering Trees in which the correlation between the proteins is also used to build the tree.…”
Section: Related Work (mentioning)
confidence: 99%
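
The last approach quoted above, Stojanova et al [18], corresponds to the paper tracked on this page: PPI-network autocorrelation is folded into the split heuristic of a Predictive Clustering Tree. A minimal, hypothetical Python sketch of such a heuristic follows; the function names, the multi-label Moran's I analogue, and the alpha trade-off weight are illustrative assumptions, not the authors' implementation:

import numpy as np

def label_variance(Y):
    """Mean per-label variance of a multi-label target matrix (n x q)."""
    return float(np.mean(np.var(Y, axis=0))) if len(Y) else 0.0

def morans_i(Y, W):
    """Global Moran's I analogue of the mean-centered labels under the
    (sub-)network adjacency W; returns 0.0 when undefined."""
    if len(Y) == 0:
        return 0.0
    Z = Y - Y.mean(axis=0)
    num = np.sum(W * (Z @ Z.T))      # sum_ij w_ij * <z_i, z_j>
    den = np.sum(Z * Z) * np.sum(W)  # sum_i |z_i|^2  *  S0
    return float(len(Y) * num / den) if den > 0 else 0.0

def split_score(Y, W, mask, alpha=0.5):
    """Score a candidate boolean split (higher is better): a convex
    combination of label-variance reduction and size-weighted network
    autocorrelation of the two resulting subsets. alpha is an
    illustrative trade-off weight, not a value from the paper."""
    n = len(Y)
    gain, auto = label_variance(Y), 0.0
    for part in (mask, ~mask):
        Yp, Wp = Y[part], W[np.ix_(part, part)]
        gain -= len(Yp) / n * label_variance(Yp)
        auto += len(Yp) / n * morans_i(Yp, Wp)
    return alpha * gain + (1 - alpha) * auto

# Toy example: 6 proteins, 3 labels, a symmetric random PPI adjacency.
rng = np.random.default_rng(0)
Y = rng.integers(0, 2, size=(6, 3)).astype(float)
A = (rng.random((6, 6)) < 0.4).astype(float)
W = np.triu(A, 1) + np.triu(A, 1).T  # symmetric, zero diagonal
mask = np.array([True, True, True, False, False, False])
print(split_score(Y, W, mask))

Weighting each child's autocorrelation by its size mirrors how variance reduction is pooled across children, so a split is rewarded both for producing homogeneous label sets and for keeping interacting proteins in the same subtree.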
“…In particular, let us recall the works concerning decision trees (e.g., [45], [30], and [40]), neural networks (e.g., [37] and [9]), support vector machines (e.g., [43], [12], and [6]), and evolutionary computation (e.g., [23], [24], [36], and [10]). …”
Section: Further Relevant Issues for HC (mentioning)
confidence: 99%