2017 IEEE 85th Vehicular Technology Conference (VTC Spring) 2017
DOI: 10.1109/vtcspring.2017.8108670
Graphic Constellations and DBN Based Automatic Modulation Classification

Cited by 43 publications (20 citation statements); References 9 publications.
“…Hidden layers may consist of one layer or many, and each layer consists of several nodes. The node presented in Figure 3 is the basic operational unit, in which the input vector is multiplied by a series of weights and the sum is fed into the activation function. These operational units combine into a powerful network that can realize complex functions such as regression and classification.

Traditional ML classifiers: (i) [26, 37]; (ii) KNN [38, 91]; (iii) SVM [6, 27, 47, 48, 92]; (iv) Naïve Bayes [39]; (v) HMM [46]; (vi) Fuzzy classifier [93]; (vii) Polynomial classifier [40, 94].
Pros: (i) works better on small data; (ii) low implementation cost.
Cons: (i) time demanding; (ii) complex feature engineering; (iii) depends heavily on the representation of the data; (iv) prone to the curse of dimensionality.

DL classifiers: (i) DNN [24, 30, 31, 61]; (ii) DBN [49, 63]; (iii) CNN [17, 19-21, 54, 64, 65, 70, 73-76, 79, 81, 82, 95, 96]; (iv) LSTM [29, 69]; (v) CRBM [53]; (vi) Autoencoder network [50, 62]; (vii) Generative adversarial networks [66, 67]; (viii) HDMF [71, 72]; (ix) NFSC [78].
Pros: (i) simple pre-processing; (ii) high accuracy and efficiency; (iii) adaptive to different applications.
Cons: (i) demanding large amounts of data; (ii) high hardware cost.…”
Section: Definition of DL Problem
confidence: 99%
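The "basic operational unit" the excerpt describes (weighted sum of the input vector passed through an activation) can be sketched as follows. This is a generic illustration, not code from the cited paper; the function names and the choice of tanh as activation are assumptions.

```python
import numpy as np

def node(x, w, b, activation=np.tanh):
    """One operational unit: the input vector is multiplied by a series of
    weights and the sum (plus a bias) is fed into the activation function."""
    return activation(np.dot(w, x) + b)

def layer(x, W, b, activation=np.tanh):
    """A layer applies many such nodes to the same input vector at once:
    each row of W holds the weights of one node."""
    return activation(W @ x + b)
```

Stacking several `layer` calls, each feeding the next, yields the multilayer network the excerpt refers to.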
“…In [61,62], the authors use DL networks combined with the IQ components of constellation points as features. In [63], a simple graphic constellation projection (GCP) scheme for AMC is presented. Unlike FB approaches, the AMC task is turned into an image recognition problem.…”
Section: Constellation Shape
confidence: 99%
“…Considering that the received signal can be separated into two parts, the real part and the imaginary part [17], we often project the raw IQ signal into a Cartesian coordinate system by mapping the I component to the X-axis and the Q component to the Y-axis. The I and Q components denote the real and imaginary parts of the received complex signal, respectively.…”
Section: DMFN Based Modulation Classification, 3.1 Grid Constellation
confidence: 99%
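A grid constellation refines the binary projection above by counting how many samples fall into each cell of the I/Q plane, giving a density matrix rather than an on/off image. A sketch under assumed bin count and range (both illustrative, not from the cited paper):

```python
import numpy as np

def grid_constellation(iq, bins=32, lim=1.5):
    """Count received samples per cell of a bins x bins grid over the I/Q
    plane (I = real part on the X-axis, Q = imaginary part on the Y-axis),
    then normalize so the grid sums to (at most) 1."""
    H, _, _ = np.histogram2d(iq.real, iq.imag,
                             bins=bins, range=[[-lim, lim], [-lim, lim]])
    return H / max(H.sum(), 1)
```

The normalized grid preserves how densely samples cluster around each constellation point, which is information a binary image discards.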
“…A variety of features were extracted and employed in [6]-[18], including amplitude with phase and carrier frequency [6], instantaneous features [7], high-order statistical features [8], [9], cyclic spectrum parameters [10], [11], bispectrum features [12], wavelet features [13], [14], and constellation diagrams [15], [16]. For the training classifier, machine-learning classifiers such as support vector machine (SVM) in [6], [13], [17], [18], decision tree in [7], [8], [14], k-nearest neighbor (KNN) in [10], compressive sensing in [12], genetic algorithm in [15], and neural network (NN) in [9], [11], [16] were widely used due to their robustness, self-adaptation, and nonlinear processing ability [19]. However, the performance of these PR methods depends largely on empirical feature extraction because of the limited capacity of the classifiers [19].…”
Section: Introduction
confidence: 99%
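Among the hand-crafted features this excerpt lists, high-order statistics are a representative example. A sketch of the standard fourth-order cumulant features C40 and C42 often used in feature-based AMC (standard textbook definitions with a common normalization by C21 squared; not taken from any specific cited paper):

```python
import numpy as np

def hoc_features(x):
    """Fourth-order cumulant magnitudes |C40| and |C42|, normalized by
    C21^2, for a 1-D array of complex baseband samples x."""
    c20 = np.mean(x ** 2)                 # second-order moment E[x^2]
    c21 = np.mean(np.abs(x) ** 2)         # signal power E[|x|^2]
    c40 = np.mean(x ** 4) - 3 * c20 ** 2  # fourth-order cumulant C40
    c42 = np.mean(np.abs(x) ** 4) - np.abs(c20) ** 2 - 2 * c21 ** 2
    return np.abs(c40) / c21 ** 2, np.abs(c42) / c21 ** 2
```

These values differ across modulation formats (e.g. |C40| is 2 for ideal BPSK but 1 for ideal QPSK), which is what makes them usable as classifier inputs; the limited discriminative power of such fixed features is exactly the limitation the excerpt attributes to PR methods.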