2020
DOI: 10.1109/access.2020.2992084
Combining Deep Neural Networks for Protein Secondary Structure Prediction

Abstract: By combining convolutional neural networks (CNN) and long short-term memory networks (LSTM) into one learning structure, this paper presents a supervised learning method called combining deep neural networks (CDNN) for protein secondary structure prediction. First, we use multiple convolutional neural networks with different numbers of layers and different filter sizes to extract protein secondary structure features. Second, we use a bidirectional LSTM to extract features continually based on the raw featu…
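The multi-scale CNN plus bidirectional-recurrence pipeline described in the abstract can be sketched as follows. This is a NumPy toy with random weights: a plain tanh-RNN stands in for the paper's bidirectional LSTM, and all dimensions, filter sizes, and names are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w):
    """'Same'-padded 1-D convolution along the sequence axis.
    x: (L, d_in), w: (k, d_in, d_out) -> (L, d_out)."""
    k = w.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, k - 1 - pad), (0, 0)))
    return np.stack([np.tensordot(xp[i:i + k], w, axes=([0, 1], [0, 1]))
                     for i in range(x.shape[0])])

def bi_rnn(x, wf, wb):
    """Toy bidirectional tanh-RNN (a simplified stand-in for a BLSTM).
    x: (L, d), wf/wb: (d + h, h) -> (L, 2h)."""
    L, h = x.shape[0], wf.shape[1]
    fwd, bwd = np.zeros((L, h)), np.zeros((L, h))
    hf = hb = np.zeros(h)
    for t in range(L):                       # left-to-right pass
        hf = np.tanh(np.concatenate([x[t], hf]) @ wf)
        fwd[t] = hf
    for t in reversed(range(L)):             # right-to-left pass
        hb = np.tanh(np.concatenate([x[t], hb]) @ wb)
        bwd[t] = hb
    return np.concatenate([fwd, bwd], axis=1)

L, d_in, d_out, h = 12, 21, 8, 16            # e.g. 21-dim profile per residue
x = rng.standard_normal((L, d_in))

# Multi-scale CNN branch: parallel filters of sizes 3, 7, 11, concatenated.
feats = np.concatenate(
    [conv1d(x, 0.1 * rng.standard_normal((k, d_in, d_out))) for k in (3, 7, 11)],
    axis=1)                                  # (L, 3 * d_out)

# Recurrent branch over the raw features together with the CNN features.
ctx = bi_rnn(np.concatenate([x, feats], axis=1),
             0.1 * rng.standard_normal((d_in + 3 * d_out + h, h)),
             0.1 * rng.standard_normal((d_in + 3 * d_out + h, h)))

# Per-residue softmax over 8 secondary-structure states (Q8).
logits = ctx @ rng.standard_normal((2 * h, 8))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
print(feats.shape, ctx.shape, probs.shape)   # (12, 24) (12, 32) (12, 8)
```

The key design choice mirrored here is that the recurrent branch consumes both the raw input and the convolutional features, so short-range (CNN) and long-range (RNN) context both reach the per-residue classifier.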

Cited by 10 publications (7 citation statements)
References 28 publications
“…The parameters used for the CD scan were set up as: wavelength: 190.0–260.0 nm, step resolution: 1.0 bandwidth, 1.0 nm, temperature: 25.0°C, and sensitivity: 0.5 s time‐per‐point. The secondary structural elements of hIL‐31 were analyzed by submitting the raw data to software cdnn using the CDNN algorithm with a reference set 20 …”
Section: Methods
confidence: 99%
“…Generative molecular design [19,52,53], synthesis planning [54], protein structure prediction [55,56] and prediction of properties in drug discovery [57,58] Transformers String notation (encoded as a graph)…”
Section: D Molecular Graph and Point Cloud
confidence: 99%
“…a few studies, one-letter amino acid sequences were employed for peptide design [52, 143–146]. RNNs have also been applied to predict ligand-protein interactions and the pharmacokinetic properties of drugs [57,58], protein secondary structure [55,56], and the temporal evolution of molecular trajectories [147]. RNNs have been applied for molecular feature extraction [148,149], showing that the learned features outperformed both traditional molecular descriptors and graph-convolution methods for virtual screening and property prediction [148].…”
Section: Chemical Language Models
confidence: 99%
“…An advantage provided by RMSProp is that it requires less tuning compared to SGD [36]. Besides, RMSProp has been used in different DL architectures, including to optimise the parameters of a combined deep CNN and LSTM for classifying protein structures [42], as well as in the design of a new deep convolutional spiking neural network for time series classification [43].…”
Section: Root Mean Square Propagation (RMSProp)
confidence: 99%
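The per-parameter adaptation that makes RMSProp need less tuning than SGD can be shown in a few lines: keep an exponential moving average of squared gradients and divide each step by its square root. This is a generic sketch of the update rule, not the cited papers' code; the learning rate, decay `rho`, and the toy objective are illustrative choices.

```python
import numpy as np

def rmsprop_step(theta, grad, cache, lr=1e-3, rho=0.9, eps=1e-8):
    """One RMSProp update. `cache` is the running average of squared
    gradients; dividing by its root gives each parameter its own
    effective learning rate."""
    cache = rho * cache + (1 - rho) * grad ** 2
    theta = theta - lr * grad / (np.sqrt(cache) + eps)
    return theta, cache

# Minimise f(theta) = theta^2 (gradient 2*theta) starting from theta = 5.
theta, cache = np.array(5.0), np.array(0.0)
for _ in range(500):
    theta, cache = rmsprop_step(theta, 2 * theta, cache, lr=0.05)
print(float(theta))  # close to the minimum at 0
```

Because the step is normalised by the gradient's recent magnitude, the same `lr` works across parameters whose gradients differ by orders of magnitude, which is the tuning advantage the quoted statement refers to.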
“…where A and C are coefficient vectors, t denotes the t-th iteration, X is the wolf position vector and X_P is the prey position vector. The vectors A and C are obtained by (42), where r1 and r2 are random vectors drawn from [0, 1] and the value of a lies between 0 and 2. The GWO has recently been applied to optimising flight models, especially to identify the flight state using CNNs [58], as well as to modifying the hidden parameters of the SAE architecture [56].…”
Section: Grey Wolf Optimiser (GWO)
confidence: 99%