2022
DOI: 10.1007/978-3-030-96311-8_4

Residual Neural Network for Predicting Super-Enhancers on Genome Scale

Cited by 4 publications (5 citation statements) | References 28 publications
“…For SkipClip, we demonstrate its applicability on basecalling only, while there are other genome sequencing tasks where deep learning models with skip connections are actively being developed, such as predicting the effect of genetic variations [73, 81], detecting replication dynamics [82], and predicting super-enhancers [83]. In Supplementary S2.1, we show the effect of manual skip removal, where we manually remove all the skip connections at once.…”
Section: Discussion
confidence: 99%
“…For SkipClip, we demonstrate its applicability on basecalling only, while there are other genome sequencing tasks where deep learning models with skip connections are actively being developed, such as predicting the effect of genetic variations [69,82], detecting replication dynamics [83], and predicting super-enhancers [84]. In Supplementary S1, we show the effect of manual skip removal, where we manually remove all the skip connections at once.…”
Section: Discussion
confidence: 96%
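The "manual skip removal" these statements refer to can be pictured with a minimal PyTorch sketch: a toy convolutional block whose identity path can be switched off, standing in for removing all skip connections at once. The block shape, channel counts, and the `use_skip` flag are illustrative assumptions, not code from the cited work.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def block_forward(x, conv, use_skip=True):
    """Toy conv block; use_skip=False emulates manual skip removal,
    i.e. dropping the identity path so only the learned transform remains."""
    out = F.relu(conv(x))
    return out + x if use_skip else out

conv = nn.Conv1d(4, 4, kernel_size=3, padding=1)    # hypothetical sizes
x = torch.randn(2, 4, 100)                          # (batch, channels, length)
y_with = block_forward(x, conv, use_skip=True)      # skip connection active
y_without = block_forward(x, conv, use_skip=False)  # all skips removed
```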
“…Subsequently, aiming to improve the DEEPSEN method, Sabba et al. developed the ResSEN method, which uses a residual neural network (ResNet) to optimize CNN performance and avoid the vanishing-gradient problem (Table 3) [95]. ResNet trains deeper CNNs through residual learning: it introduces residual blocks that avoid the degradation caused by vanishing gradients and other problems that arise as layers are deepened in CNNs [96, 97].…”
Section: Introduction
confidence: 99%
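As a concrete illustration of the residual learning described above, here is a minimal residual block for 1-D sequence input in PyTorch: the block computes y = F(x) + x, so gradients can flow through the identity shortcut even when the convolutional path saturates. Channel counts and kernel size are assumptions for the sketch; this is not ResSEN's published architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    """Residual block for 1-D genomic inputs: y = F(x) + x.

    The identity shortcut lets gradients bypass the convolutions,
    which is how ResNets mitigate vanishing gradients in deep CNNs.
    Layer sizes here are illustrative, not ResSEN's actual ones.
    """
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        padding = kernel_size // 2  # keep sequence length unchanged
        self.conv1 = nn.Conv1d(channels, channels, kernel_size, padding=padding)
        self.bn1 = nn.BatchNorm1d(channels)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size, padding=padding)
        self.bn2 = nn.BatchNorm1d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)  # residual connection: add the input back

block = ResidualBlock(channels=16)
x = torch.randn(8, 16, 200)       # (batch, channels, sequence length)
assert block(x).shape == x.shape  # identity shortcut requires matching shapes
```

Because the shortcut is an identity mapping, each block only has to learn the residual F(x), which is why stacking many such blocks does not degrade training the way plain deep CNNs do.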
“…To validate the performance of the ResSEN method, ResSEN and DEEPSEN were compared on the same data as the DEEPSEN method, with 90% of the data used for training and 10% for testing. Overfitting was observed in the DEEPSEN method but not in the ResSEN method [95]. ResSEN also achieved better accuracy, indicating that switching the algorithm to a ResNet increases the prediction accuracy of SEs [95].…”
Section: Introduction
confidence: 99%
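A minimal sketch of the evaluation protocol described above (90% training, 10% testing, with the train-test accuracy gap serving as the overfitting signal). The synthetic data, `random_state`, and the placeholder classifier are all assumptions; the study itself evaluates ResSEN and DEEPSEN on the DEEPSEN dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical stand-ins for the feature matrix and SE / non-SE labels;
# the actual comparison uses the DEEPSEN dataset, not reproduced here.
X = np.random.rand(1000, 36)
y = np.random.randint(0, 2, size=1000)

# 90% of the data for training, 10% for testing, as in the comparison above.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.1, random_state=42, stratify=y
)

def evaluate(model):
    """Fit on the training split and report train/test accuracy;
    a large gap between the two is the overfitting signal discussed."""
    model.fit(X_train, y_train)
    train_acc = accuracy_score(y_train, model.predict(X_train))
    test_acc = accuracy_score(y_test, model.predict(X_test))
    return train_acc, test_acc

print(evaluate(LogisticRegression(max_iter=1000)))  # placeholder model
```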