2019
DOI: 10.1101/587287
Preprint

On Transformative Adaptive Activation Functions in Neural Networks for Gene Expression Inference

Abstract: Motivation: Gene expression profiling was made cheaper by the NIH LINCS program that profiles only ∼1,000 selected landmark genes and uses them to reconstruct the whole profile. The D-GEX method employs neural networks to infer the whole profile. However, the original D-GEX can be further significantly improved. Results: We have analyzed the D-GEX method and determined that the inference can be improved using a logistic sigmoid activation function instead of the hyperbolic tangent. Moreover, we propose a nove…
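The abstract's first claimed improvement is simply swapping the hidden-layer non-linearity from the hyperbolic tangent to the logistic sigmoid in a D-GEX-style fully connected regressor. The sketch below illustrates that swap only; the single hidden layer, its width, and the gene counts (943 landmark genes, 9520 target genes) are illustrative assumptions, not the exact architecture from the paper.

```python
import torch.nn as nn

def build_dgex_like_mlp(n_landmark: int = 943, n_target: int = 9520,
                        hidden: int = 3000, use_sigmoid: bool = True) -> nn.Sequential:
    """Minimal sketch of a D-GEX-like regressor mapping landmark-gene
    expressions to the remaining target genes. Only the choice of hidden
    activation (sigmoid vs. tanh) is the point being illustrated."""
    act = nn.Sigmoid() if use_sigmoid else nn.Tanh()
    return nn.Sequential(
        nn.Linear(n_landmark, hidden),   # landmark genes -> hidden representation
        act,                             # logistic sigmoid (or tanh for the original)
        nn.Linear(hidden, n_target),     # linear output for regression targets
    )
```

Because the output layer stays linear, the only behavioural change between the two variants is the hidden non-linearity, which isolates the effect the abstract describes.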

Cited by 9 publications (20 citation statements)
References 376 publications
“…Shen et al. [22] used a similar idea of a tunable activation function but with multiple outputs. Recently, Kunc and Kléma proposed transformative adaptive activation functions for gene expression inference, see [14]. One such adaptive activation function was proposed by Jagtap and Karniadakis [6] by introducing a scalable hyper-parameter in the activation function, which can be optimized.…”
Section: Introduction (mentioning)
confidence: 99%
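The statement above describes an activation with a single scalable hyper-parameter learned alongside the network weights. A minimal sketch of that idea follows; the class name, the fixed factor `n`, and the initial value of `a` are illustrative assumptions rather than the exact formulation in [6].

```python
import torch
import torch.nn as nn

class ScalableActivation(nn.Module):
    """Activation with one trainable scaling hyper-parameter:
    f(x) = act(n * a * x), where `a` is optimized jointly with the
    network weights and `n` is a fixed scaling factor."""

    def __init__(self, n: float = 10.0, base_act=torch.tanh):
        super().__init__()
        self.n = n                                 # fixed factor (assumed value)
        self.base_act = base_act                   # inner activation, e.g. tanh or sigmoid
        self.a = nn.Parameter(torch.tensor(0.1))   # trainable slope parameter

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Scaling the pre-activation by n * a lets training adapt the
        # effective slope of the activation function.
        return self.base_act(self.n * self.a * x)
```

Only one extra scalar is added per layer (or per neuron, if `a` is given one entry per unit), so the adaptation is cheap relative to the weight matrices.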
“…The extension of D-GEX [8] significantly improves gene expression inference, firstly by replacing the hyperbolic tangent activation functions with logistic sigmoids and secondly by including novel transformative adaptive activation functions (TAAFs), which add four additional parameters per neuron that control the scale and translation of the inner activation function (i.e., the sigmoid function). These parameters increase the total number of parameters of the network only slightly, as most of an NN's parameters are the weights of the connections; furthermore, D-GEX with TAAFs outperforms the original D-GEX even when the number of neurons in each layer is set such that the total number of parameters is the same for both the original and the modified D-GEX variants.…”
Section: D-GEX (mentioning)
confidence: 99%
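The statement describes a TAAF as four trainable parameters per neuron that scale and translate the inner activation. One plausible parameterisation consistent with that description is y = α·σ(β·x + γ) + δ; the exact placement and initialisation of the parameters below are assumptions for illustration, not necessarily the paper's definition.

```python
import torch
import torch.nn as nn

class TAAF(nn.Module):
    """Sketch of a transformative adaptive activation function layer:
    y = alpha * inner(beta * x + gamma) + delta, with one trainable
    (alpha, beta, gamma, delta) quadruple per neuron."""

    def __init__(self, num_neurons: int, inner=torch.sigmoid):
        super().__init__()
        self.inner = inner  # inner activation, here the logistic sigmoid
        self.alpha = nn.Parameter(torch.ones(num_neurons))   # output scale
        self.beta = nn.Parameter(torch.ones(num_neurons))    # input scale
        self.gamma = nn.Parameter(torch.zeros(num_neurons))  # input shift
        self.delta = nn.Parameter(torch.zeros(num_neurons))  # output shift

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, num_neurons); the per-neuron parameters broadcast.
        return self.alpha * self.inner(self.beta * x + self.gamma) + self.delta
```

Since each neuron contributes only four extra scalars, the overall parameter count remains dominated by the connection weights, which is consistent with the comparison at matched parameter budgets mentioned above.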
“…One of the approaches lowering the costs and allowing larger-scale experiments is represented by the LINCS program, which developed the L1000 platform based on Luminex bead technology. The L1000 platform measures […] further improved the quality of the inference by introducing a novel family of adaptive activation functions called transformative adaptive activation functions (TAAFs) that allowed a significantly lower error [8].…”
Section: Introduction (mentioning)
confidence: 99%