2016
DOI: 10.1186/s12859-015-0852-1

Learning a hierarchical representation of the yeast transcriptomic machinery using an autoencoder model

Abstract: Background: A living cell has a complex, hierarchically organized signaling system that encodes and assimilates diverse environmental and intracellular signals, and it further transmits signals that control cellular responses, including a tightly controlled transcriptional program. An important yet challenging task in systems biology is to reconstruct the cellular signaling system in a data-driven manner. In this study, we investigate the utility of deep hierarchical neural networks in learning and representing …

Cited by 94 publications (98 citation statements)
References 24 publications
“…7,8,32 For most features, the distribution of gene weights was similar: many genes had weights near zero, and a few genes had high weights in each tail. To characterize the patterns explained by selected encoded features of interest, we performed overrepresentation pathway analyses (ORA) separately for positive and negative high-weight genes, defined as weights greater than 2.5 standard deviations above or below the mean, respectively.…”
Section: Interpretation of Gene Weights
confidence: 99%
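The ±2.5 standard deviation selection rule quoted above can be sketched in a few lines. The weight vector here is synthetic (drawn at random), standing in for one encoded feature's gene weights; the names `pos_high` and `neg_high` are illustrative, not from the cited work:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical weights of one encoded feature over 5000 genes.
weights = rng.normal(loc=0.0, scale=1.0, size=5000)

mu, sd = weights.mean(), weights.std()
threshold = 2.5

# Tail genes per the +/- 2.5 SD rule: candidates for separate
# positive- and negative-tail overrepresentation analyses (ORA).
pos_high = np.where(weights > mu + threshold * sd)[0]
neg_high = np.where(weights < mu - threshold * sd)[0]
```

Each tail set would then be passed to a pathway overrepresentation test; under a roughly normal weight distribution, each tail holds well under 1% of genes, matching the "many genes near zero, few in each tail" pattern the authors describe.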
“…7,8,32 For instance, only 17 genes were needed to identify patient sex (Figure 3C). These genes were mostly located on sex chromosomes.…”
Section: Features Represent Biological Signal
confidence: 99%
“…It has been shown that expression profiles regenerated by autoencoders improve the performance of gene clustering. In another study, unsupervised models such as restricted Boltzmann machines and sparse autoencoders were applied to yeast transcriptomic data [8]. By integrating knowledge from Gene Ontology and KEGG into the networks, the authors show that these models can learn biologically sensible representations of the data and reveal novel insights into the machinery regulating gene expression.…”
Section: Deep Learning for Gene Expression Analysis
confidence: 99%
“…The latent representation is tuned to have low capacity, so the model is forced to estimate only the most important features of the data. Different auto-encoder models have previously been used to model normalised (or binarised) bulk gene expression levels: denoising auto-encoders (Tan et al., 2014; Gupta et al., 2015; Tan et al., 2016), sparse auto-encoders (Chen et al., 2016), and robust auto-encoders (Cui et al., 2017). A bottleneck auto-encoder (Eraslan et al., 2018) was also recently used to model single-cell transcript counts, and a generative adversarial network (Goodfellow et al., 2014), a related model, was recently used to model normalised single-cell gene expression levels (Ghahramani et al., 2018).…”
Section: Introduction
confidence: 99%
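The low-capacity bottleneck idea in the quote above can be illustrated with a minimal linear autoencoder on synthetic data. This is a sketch of the general principle only, not any of the cited models (which add denoising, sparsity, or nonlinearities); all dimensions and variable names are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_genes, n_latent = 64, 20, 3  # bottleneck: 3 << 20

# Synthetic stand-in for a normalised expression matrix.
X = rng.normal(size=(n_samples, n_genes))

# Encoder and decoder weights of a linear bottleneck autoencoder.
W_enc = rng.normal(scale=0.1, size=(n_genes, n_latent))
W_dec = rng.normal(scale=0.1, size=(n_latent, n_genes))

lr = 0.01
for _ in range(500):
    Z = X @ W_enc          # low-capacity latent representation
    X_hat = Z @ W_dec      # reconstruction from the bottleneck
    err = X_hat - X
    # Gradient descent on mean squared reconstruction error.
    grad_dec = Z.T @ err / n_samples
    grad_enc = X.T @ (err @ W_dec.T) / n_samples
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

loss = np.mean((X @ W_enc @ W_dec - X) ** 2)
```

Because the latent layer has only 3 units for 20 input genes, the model cannot memorise the data and must capture its dominant directions of variation, which is exactly the "forced to estimate only the most important features" behaviour the quote describes.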