2016
DOI: 10.48550/arxiv.1602.04105
Preprint
Convolutional Radio Modulation Recognition Networks

Cited by 11 publications (16 citation statements)
References 0 publications
“…We build a model that is not prone to trivial parser-based security vulnerabilities, and one whose development cost and complexity do not scale with the number of specific protocols implemented, since the protocols are derived from datasets using a model that generalizes well. We have previously demonstrated [13] that this class of approach, using deep neural networks to learn a radio discrimination task on low-level modulations, can be highly effective; in this work we show that this potential also extends up the stack to higher-layer traffic types.…”
Section: Introduction
confidence: 79%
“…For the application of non-cooperative communication, the receiver does not know the current modulation type, so it must first perform modulation recognition, as shown in the dashed box in Figure 1. We chose the automatic modulation classification (AMC) module proposed in [34] to perform the modulation recognition.…”
Section: Communication System Model
confidence: 99%
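As an illustration of that receiver pipeline, the sketch below classifies the modulation of raw IQ samples before any demodulation is attempted. This is hypothetical code, not the AMC module of [34]: it uses two classic expert features (normalized amplitude variance and instantaneous-frequency variance) with a nearest-centroid rule, purely to show where AMC sits in the chain.

```python
import numpy as np

def iq_features(x):
    """Two simple expert features over complex baseband samples x."""
    amp = np.abs(x)
    phase = np.unwrap(np.angle(x))
    return np.array([
        np.var(amp) / (np.mean(amp) ** 2),  # normalized amplitude variance
        np.var(np.diff(phase)),             # instantaneous-frequency variance
    ])

def classify(x, centroids):
    """Nearest-centroid decision over the feature space.

    centroids: dict mapping modulation name -> feature vector,
    estimated from labeled training captures.
    """
    f = iq_features(x)
    labels = list(centroids)
    dists = [np.linalg.norm(f - centroids[k]) for k in labels]
    return labels[int(np.argmin(dists))]
```

In a non-cooperative receiver, `classify` would run on each captured burst and its output would select the matching demodulator; a learned classifier such as the CNN of [34] replaces `iq_features` + nearest-centroid with a single trained model.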
“…In the case that we do have some expertly labeled data, we can also generate a sparse representation space using discriminative features learned during supervised training. In prior work [14] we trained a convolutional neural network in a purely supervised way to label examples, but here we leverage this trained network and discard the final softmax layer, keeping only the high-level learned feature maps as sparse representations. The networks used in supervised training and in extraction of the sparse representation from the learned feature maps are shown in Figure 4. Features formed in this way leverage and distill available expertly curated labels, but in many cases they also generalize and provide a capacity to separate additional classes in feature space which do not have class labels.…”
Section: B. Supervised Bootstrapping Of Sparse Representation
confidence: 99%
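The bootstrapping step described above, training a supervised network and then discarding the final softmax layer to expose the learned features, can be sketched as follows. A toy dense network with random weights stands in for the trained CNN of [14]; the layer sizes, class count, and function names here are illustrative only.

```python
import numpy as np

# Hypothetical two-layer network standing in for the trained CNN;
# in practice W1 and W2 would come from supervised training.
rng = np.random.default_rng(42)
W1 = rng.normal(size=(128, 64))  # input dim 128 -> 64 hidden features
W2 = rng.normal(size=(64, 11))   # 64 features -> 11 modulation classes

def relu(z):
    return np.maximum(z, 0.0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict_probs(x):
    """Full supervised path: input -> features -> softmax class probabilities."""
    return softmax(relu(x @ W1) @ W2)

def sparse_representation(x):
    """Same network with the final softmax layer discarded: the
    high-level feature activations serve as the representation."""
    return relu(x @ W1)
```

The key point is that `sparse_representation` reuses exactly the weights learned for classification: examples from classes that were never labeled still land somewhere structured in this 64-dimensional feature space, which is what allows the representation to separate classes beyond the original label set.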
“…We recently demonstrated the viability of naive feature learning for supervised radio classification systems [14], which allows joint feature and classifier learning given labeled datasets and examples. In this case we were able to outperform traditional expert decision-statistic-based classification in sensitivity and accuracy by a significant margin.…”
Section: Introduction
confidence: 99%