2021
DOI: 10.1371/journal.pone.0243915
On transformative adaptive activation functions in neural networks for gene expression inference

Abstract: Gene expression profiling was made more cost-effective by the NIH LINCS program, which profiles only ∼1,000 selected landmark genes and uses them to reconstruct the whole profile. The D–GEX method employs neural networks to infer the entire profile. However, the original D–GEX can be significantly improved. We propose a novel transformative adaptive activation function that further improves gene expression inference and generalizes several existing adaptive activation functions. Our improved neur…


Cited by 10 publications (115 citation statements)
References 50 publications
“…Wang et al [20] use Conditional Generative Adversarial Networks to model the conditional probability of target genes given landmark genes. Kunc and Kléma [21] substitute the hyperbolic tangent function with transformative adaptive activation functions to improve the prediction accuracy. Wang et al [22] use a recurrent neural network called L-GEPM to model the non-linear features of the landmark genes.…”
Section: Related Work
confidence: 99%
“…Goyal et al [80] suggested normalizing polynomial activations to increase the stability of neural networks. Kunc and Kléma [81] proposed a novel transformative adaptive activation function that improves gene expression inference by generalizing existing adaptive activation functions.…”
Section: B Adaptive Activation
confidence: 99%
“…The transformative adaptive activation function (TAAF) [380,381] is a family of AFs that adds four adaptive parameters for scaling and translating any given AF; as such, the TAAFs represent a simple framework with a small set of additional parameters that generalizes many of the AAFs listed in this work. While there are even more general approaches, such as the adaptive blending unit (ABU) (see section 4.48), which allows for a combination of several different activation functions, the TAAFs are conceptually simpler and add only four additional parameters.…”
Section: Transformative Adaptive Activation Function (TAAF)
confidence: 99%
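The quoted statement describes the TAAF as wrapping an arbitrary activation with four adaptive parameters for scaling and translation. A minimal sketch of that idea, assuming the common form α·f(β·(x + γ)) + δ (the exact parameterization should be checked against the cited paper; in a network the four parameters would be trainable rather than fixed):

```python
import numpy as np

def taaf(f, alpha=1.0, beta=1.0, gamma=0.0, delta=0.0):
    """Wrap an inner activation f with four scale/translation parameters.

    Assumed form: alpha * f(beta * (x + gamma)) + delta.
    With alpha=beta=1 and gamma=delta=0, this reduces to f itself,
    which is how the TAAF generalizes the underlying activation.
    """
    return lambda x: alpha * f(beta * (x + gamma)) + delta

# Example: a scaled and shifted tanh (illustrative parameter values).
tanh_taaf = taaf(np.tanh, alpha=2.0, delta=1.0)
```

Here the identity parameterization recovers the plain activation, while non-trivial values let the network learn per-unit rescaling of both the input and output of the inner function.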