2021
DOI: 10.1109/access.2021.3091421
Deep Learning-Based Robust Automatic Modulation Classification for Cognitive Radio Networks

Abstract: In this paper, a novel deep learning-based robust automatic modulation classification (AMC) method is proposed for cognitive radio networks. Generally, either images or complex signals, in the time domain or the frequency domain, are used as the network input of AMC convolutional neural networks (CNNs). An image containing RGB (red, green, blue) levels may have a larger input size than a complex signal, which increases the computational complexity. As for the complex signal, it is normally use…

Cited by 25 publications (29 citation statements) · References 31 publications
“…Zheng et al. [19] compared three fusion methods for handling variation in the length of the signal passed as input to the CNN classifier. In [20], a CNN classifier was trained with reduced signal dimensions and was shown to outperform some ML-based models. Lin et al. [21] proposed a combination of a gated recurrent unit and a CNN to detect temporal features.…”
Section: Related Work
confidence: 99%
“…Inspired by ECNN [10], before training the module, the whole dataset S is extended to 2 × 2N by copying the data and concatenating it in reverse order, to improve the recognition accuracy:…”
Section: Problem Statement and Basic
confidence: 99%
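The extension quoted above can be sketched in a few lines. This is a minimal sketch assuming each sample is an I/Q sequence stored as a 2 × N array; the exact data layout is not given in the excerpt:

```python
import numpy as np

# Hypothetical sketch of the dataset extension described in the quote:
# each 2 x N sample (I/Q rows) is extended to 2 x 2N by appending a
# reversed copy. The 2 x N layout is an assumption.
def extend_dataset(samples):
    """Extend each 2 x N sample to 2 x 2N by concatenating in reverse order."""
    extended = []
    for s in samples:
        reversed_copy = s[:, ::-1]                    # data in reverse order
        extended.append(np.concatenate([s, reversed_copy], axis=1))
    return np.stack(extended)

batch = np.random.randn(4, 2, 128)                    # 4 samples, I/Q, N = 128
out = extend_dataset(batch)
print(out.shape)                                      # (4, 2, 256)
```

Reversing along the time axis doubles the effective training data while preserving the per-sample statistics the classifier relies on.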
“…MCBL is trained from scratch with randomly initialized weights using the adaptive moment estimation (Adam) optimizer. It is compared with the model of [37], CLDNN [19], VT_CNN2 [18], LSTM [38], ResNet [39], VGG [40], CNN_LSTM [41], and two recent models, ECNN [10] and CGDNet [9]. All of these algorithms use the same dataset without any preprocessing.…”
Section: Experimental Settings
confidence: 99%
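The Adam update used for training above can be illustrated with a minimal NumPy sketch. The hyperparameters below are the common defaults (lr = 1e-3, β1 = 0.9, β2 = 0.999) and are assumptions, not values reported in this excerpt:

```python
import numpy as np

# Minimal sketch of one Adam update step, as used to train MCBL from
# scratch in the quoted experiment. Hyperparameters are assumed defaults.
def adam_step(w, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grad                  # first-moment estimate
    v = b2 * v + (1 - b2) * grad**2               # second-moment estimate
    m_hat = m / (1 - b1**t)                       # bias-corrected moments
    v_hat = v / (1 - b2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)   # parameter update
    return w, m, v

w = np.zeros(3)                                   # weights are randomly
m = v = np.zeros(3)                               # initialized in practice
w, m, v = adam_step(w, np.array([1.0, -1.0, 0.5]), m, v, t=1)
```

After one step from zero moments, the bias correction makes the update approximately lr · sign(grad), which is why Adam's early steps are well scaled regardless of gradient magnitude.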
“…ERROR-PRONE CSI AT THE PPP'S RECEIVER END: Error-prone channel estimates are prevalent as a result of the stipulated minimum mean-square error requirements when they are projected. Recently, the authors in [33]-[37] have modeled these error-prone CSI as h1 = ĥ1 + h_e, where ĥ1 is the estimated CSI at the receiver side and h_e is the zero-mean Gaussian estimation…”
confidence: 99%
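The error-prone CSI model in the quote can be simulated directly. This is a sketch under the stated model h1 = ĥ1 + h_e; the error standard deviation sigma_e is an assumed parameter, not given in the excerpt:

```python
import numpy as np

# Sketch of the error-prone CSI model: h1 = h1_hat + h_e, where h_e is
# zero-mean complex Gaussian estimation error. sigma_e is an assumption.
rng = np.random.default_rng(0)

def noisy_csi(h1_hat, sigma_e):
    """Add zero-mean complex Gaussian estimation error to the estimated CSI."""
    h_e = (rng.normal(0.0, sigma_e, h1_hat.shape)
           + 1j * rng.normal(0.0, sigma_e, h1_hat.shape)) / np.sqrt(2)
    return h1_hat + h_e

h1_hat = np.array([1.0 + 0.5j, -0.3 + 1.2j])      # receiver-side estimate
h1 = noisy_csi(h1_hat, sigma_e=0.1)               # channel under the model
```

Dividing by √2 keeps the total complex error variance equal to sigma_e², split evenly between the real and imaginary parts.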