2021
DOI: 10.18494/sam.2021.3456
Automatic Modulation Recognition Method Based on Hybrid Model of Convolutional Neural Networks and Gated Recurrent Units

Cited by 3 publications (9 citation statements)
References 15 publications (19 reference statements)
“…The architecture of our proposed RLADNN is shown in Figure 1. Our model starts from a CNN with a convolutional kernel size of (2, 8) and consists of two convolutional layers, a residual structure layer, two LSTM layers, and an Attention layer, arranged mainly in tandem. The superiority of CNNs in extracting spatial features and generating high-level information has made them widely used in the field of image recognition.…”
Section: RLADNN Model
confidence: 99%
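The stack described in the quoted passage (two (2, 8) convolutions, a residual connection, two LSTM layers, then attention, arranged in tandem) can be sketched as below. This is a minimal PyTorch illustration, not the authors' implementation: the channel width, hidden size, residual placement, and the additive-attention form are all assumptions, and the input shape follows RadioML2016.10a-style I/Q frames of shape (batch, 1, 2, 128).

```python
import torch
import torch.nn as nn

class RLADNNSketch(nn.Module):
    """Illustrative CNN -> residual -> 2x LSTM -> attention stack (tandem)."""
    def __init__(self, n_classes: int = 11, channels: int = 64):
        super().__init__()
        # Two convolutional layers with kernel size (2, 8); 'same' padding
        # keeps the I/Q and time dimensions so the residual addition lines up.
        self.conv1 = nn.Conv2d(1, channels, kernel_size=(2, 8), padding="same")
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=(2, 8), padding="same")
        self.relu = nn.ReLU()
        # Two stacked LSTM layers over the time axis.
        self.lstm = nn.LSTM(input_size=channels, hidden_size=128,
                            num_layers=2, batch_first=True)
        # Simple additive attention over the LSTM outputs (assumed form).
        self.attn = nn.Linear(128, 1)
        self.fc = nn.Linear(128, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, 2, T) raw I/Q samples
        h = self.relu(self.conv1(x))
        h = h + self.relu(self.conv2(h))           # residual structure layer
        h = h.mean(dim=2).permute(0, 2, 1)         # collapse I/Q axis -> (batch, T, channels)
        out, _ = self.lstm(h)                      # (batch, T, 128)
        w = torch.softmax(self.attn(out), dim=1)   # attention weights over time
        return self.fc((w * out).sum(dim=1))       # weighted context -> class logits
```

A batch of four 128-sample frames, `RLADNNSketch()(torch.randn(4, 1, 2, 128))`, yields logits of shape `(4, 11)`, matching the 11 modulation classes of RadioML2016.10a.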
“…The ratio is 6:2:2. In this letter, the publicly available dataset RadioML2016.10a from [3] is used to evaluate the DL models. We chose the following typical DL-based AMR algorithms as benchmarks, which cover various mainstream DL frameworks, each with its own advantages: CNN-LSTM [9], Complex-CNN [15], the improved convolutional neural network (CNN)-based automatic modulation classification network (IC-AMCNet) [16], the Convolutional Long Short-Term Deep Neural Network (CLDNN) [17], the Convolutional Neural Network and Long Short-Term Memory network (RCTLNet) [18], and the CNN-GRU hybrid network model (CGRNet) [19]. As shown in Figure 2, even compared with the current state-of-the-art models, the recognition rate of the proposed model improves significantly over the entire SNR range; adding Attention yields a further substantial performance gain at [−6:18] dB SNR over the original model, with only a small increase in runtime and parameter count.…”
Section: RLADNN Model
confidence: 99%
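The 6:2:2 train/validation/test ratio mentioned in the quote can be reproduced with a simple index partition. The sketch below assumes a shuffled split over the 220,000 frames of RadioML2016.10a; the exact split granularity (per-class, per-SNR, or global) used by the cited letter is not specified here.

```python
import numpy as np

def split_622(n_samples: int, seed: int = 0):
    """Shuffle sample indices and partition them 6:2:2 (train/val/test)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    n_train = int(0.6 * n_samples)  # 60% for training
    n_val = int(0.2 * n_samples)    # 20% for validation
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

# RadioML2016.10a holds 220,000 frames (11 modulations x 20 SNRs x 1,000).
train, val, test = split_622(220_000)
```

The three index arrays are disjoint and together cover every sample, so no frame leaks between the training and evaluation sets.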
“…S. Chang et al. [14] combined a CNN with BiGRU networks, achieving an accuracy of 84% at high SNR. Q. Duan et al. [18] used a combination of a CNN with BiLSTM and Attention models, achieving an accuracy of 93% at high SNR. J. Xu et al. [11] combined a CNN and LSTM with FC layers, achieving an accuracy of 90% at high SNR.…”
Section: Introduction
confidence: 99%