2013
DOI: 10.1186/1752-0509-7-106
ENNET: inferring large gene regulatory networks from expression data using gradient boosting

Abstract: Background: The regulation of gene expression by transcription factors is a key determinant of cellular phenotypes. Deciphering genome-wide networks that capture which transcription factors regulate which genes is one of the major efforts towards understanding and accurately modeling living systems. However, reverse-engineering the network from gene expression profiles remains a challenge, because the data are noisy, high-dimensional and sparse, and the regulation is often obscured by indirect connections. Resul…

Cited by 39 publications (43 citation statements)
References 47 publications
“…Gradient boosting was also successfully used for inferring gene regulatory networks from steady-state gene expression data [9, 30]. The gradient boosting algorithm follows the gradient descent procedure that is employed to minimize the loss L of an estimator f by adding residual fitted estimator h [28].…”
Section: Methodsmentioning
confidence: 99%
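The excerpt above describes gradient boosting as gradient descent in function space: each round fits a weak learner h to the residuals of the current estimator f (the negative gradient of the squared-error loss) and adds it to f. A minimal sketch of that loop, using hypothetical one-feature threshold stumps as the weak learner rather than ENNET's actual regression trees:

```python
import numpy as np

def fit_stump(X, r):
    """Pick the single feature/median-threshold split that best fits residuals r."""
    best = None
    for j in range(X.shape[1]):
        t = np.median(X[:, j])
        left = X[:, j] <= t
        if left.all() or not left.any():
            continue
        pred = np.where(left, r[left].mean(), r[~left].mean())
        err = ((r - pred) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, j, t, r[left].mean(), r[~left].mean())
    return best[1:]

def predict_stump(stump, X):
    j, t, vl, vr = stump
    return np.where(X[:, j] <= t, vl, vr)

def gradient_boost(X, y, n_rounds=50, lr=0.1):
    """Boosting under squared-error loss: f starts as the mean of y,
    and each round adds a stump fitted to the current residuals."""
    f = np.full(len(y), y.mean())
    stumps = []
    for _ in range(n_rounds):
        r = y - f                       # residuals = negative gradient of L
        s = fit_stump(X, r)
        f += lr * predict_stump(s, X)   # small step along the fitted gradient
        stumps.append(s)
    return y.mean(), stumps
```

The small learning rate `lr` is the usual shrinkage trick: many small corrections generalize better than a few large ones, which is what makes the "slowly improve the model" strategy mentioned below accurate in practice.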
“…A good deal of research on reverse-engineering has been conducted using the gene expression data [69]. In the DREAM (Dialogue for Reverse Engineering Assessments and Methods) Challenges, methods were employed to construct a benchmark dataset that can be used to validate various inference algorithms [10, 11].…”
Section: Introductionmentioning
confidence: 99%
“…Boosting utilizes an iterative strategy to slowly improve the model and should be highly accurate. For the DREAM5 datasets, the community network based on meta-analysis was robust and outperformed individual methods, and two ensemble-based methods achieved higher accuracy than individual methods [47,63].…”
Section: Meta-predictionmentioning
confidence: 99%
“…For example, correlation coefficient methods are more reliable than regression methods for feed-forward loops, whereas regression methods perform better for linear cascades. Because of the complementarity of various methods, metaprediction approaches that combine multiple methods may be able to outperform individual methods [47, 50, 61-63].…”
Section: Meta-predictionmentioning
confidence: 99%
“…As the use of random forests for feature selection is not well understood theoretically, the TIGRESS method [20] combines least angle regression (LARS) with stability selection to solve the GRN inference problem. The ENNET method [21] aggregates the features selected by an algorithm based on the Gradient Boosting Machine. However, the ENNET method has a high computational cost when applied to high-dimensional data (i.e., data with thousands of features).…”
Section: Introductionmentioning
confidence: 99%
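The feature-aggregation scheme this excerpt attributes to ENNET can be illustrated with the general GBM-based GRN recipe (a sketch of the overall approach, not ENNET's exact algorithm): regress each target gene's expression on all other genes with gradient boosting, and read the resulting feature importances as putative regulator-to-target scores. The per-target loop below also shows where the quadratic cost in the number of genes comes from:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def infer_grn(expr, n_estimators=100, max_depth=2, random_state=0):
    """expr: (samples, genes) expression matrix.
    Returns a (genes, genes) score matrix W where W[i, j] scores the
    putative regulatory link gene i -> gene j. Hyperparameters here are
    illustrative, not ENNET's published settings."""
    n_genes = expr.shape[1]
    W = np.zeros((n_genes, n_genes))
    for j in range(n_genes):
        # One boosting model per target gene, with all other genes
        # as candidate regulators.
        regulators = [i for i in range(n_genes) if i != j]
        model = GradientBoostingRegressor(
            n_estimators=n_estimators, max_depth=max_depth,
            random_state=random_state)
        model.fit(expr[:, regulators], expr[:, j])
        W[regulators, j] = model.feature_importances_
    return W
```

Fitting one boosted ensemble per gene is what makes this family of methods expensive on expression matrices with thousands of genes, as the excerpt notes.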