2023
DOI: 10.1101/2023.04.26.538471
Preprint

Evaluation and optimization of sequence-based gene regulatory deep learning models

Abstract: Neural networks have proven to be an immensely powerful tool in predicting functional genomic regions, in particular with many recent successes in deciphering gene regulatory logic. However, how model architecture and training strategy choices affect model performance has not been systematically evaluated for genomics models. To address this gap, we held a DREAM Challenge where competitors trained models on a dataset of millions of random promoter DNA sequences and corresponding experimentally determined expression…
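The abstract does not show the input representation competitors used; as a point of reference only, the sketch below illustrates the one-hot encoding commonly used to feed promoter DNA sequences into such models. The function name and alphabet handling are illustrative assumptions, not taken from the preprint.

```python
# Illustrative sketch (not from the preprint): one-hot encoding of a promoter
# DNA sequence, a common input representation for sequence-based expression models.
import numpy as np

def one_hot(seq: str) -> np.ndarray:
    """Encode a DNA string as a (4, len(seq)) float32 matrix over A/C/G/T."""
    idx = {"A": 0, "C": 1, "G": 2, "T": 3}
    x = np.zeros((4, len(seq)), dtype=np.float32)
    for j, base in enumerate(seq.upper()):
        if base in idx:              # unknown bases (e.g. N) are left all-zero
            x[idx[base], j] = 1.0
    return x

print(one_hot("ACGTTGCA").shape)     # (4, 8)
```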


Cited by 5 publications (2 citation statements)
References 72 publications (172 reference statements)
“…Finally, having standardized datasets and modeling competitions (e.g. DREAM Challenges) will facilitate the continued improvement of model efficiency and accuracy [160][161][162] .…”
Section: Moving Forwards
confidence: 99%
“…Several studies have embedded such models into algorithms for discovery of new variants using optimization methods [8] and techniques from generative models [14,15,16,17]. Although the current literature has a strong focus on improvements to model architectures that can deliver greater predictive power [18], with the size of sequence-to-expression datasets growing into thousands up to millions of variants, it is becoming increasingly clear that off-the-shelf deep learning architectures such as convolutional neural networks, recurrent neural networks or transformers can readily provide high predictive accuracy [19].…”
Section: Introduction
confidence: 99%
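As a rough illustration of the claim above, here is a minimal sketch, assuming PyTorch, of an "off-the-shelf" convolutional sequence-to-expression model; the layer sizes and sequence length are arbitrary assumptions, not the architecture used in the preprint or the citing study.

```python
# Minimal sketch of an off-the-shelf convolutional sequence-to-expression model.
# All hyperparameters here are illustrative, not taken from the cited work.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv1d(4, 128, kernel_size=8, padding=4),   # input: one-hot DNA, 4 channels
    nn.ReLU(),
    nn.Conv1d(128, 128, kernel_size=8, padding=4),
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),                       # pool over sequence length
    nn.Flatten(),
    nn.Linear(128, 1),                             # scalar expression prediction
)

x = torch.randn(16, 4, 110)   # batch of 16 one-hot promoters, length 110
print(model(x).shape)         # torch.Size([16, 1])
```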