2019
DOI: 10.2528/pierc18120305
A Novel Lightweight SARNet with Clockwise Data Amplification for SAR ATR

Abstract: Convolutional neural network (CNN) models applied to synthetic aperture radar automatic target recognition (SAR ATR) universally face two important issues: overfitting caused by a lack of sufficient training data, and independent variations such as poor estimates of the aspect angle. To this end, we developed a lightweight CNN-based method named SARNet to accomplish the classification task. Firstly, a clockwise data amplification approach is presented to generate adequate SAR images without requiring many …
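The abstract's "clockwise data amplification" suggests enlarging the training set with rotated copies of each SAR image chip. The paper's exact rotation scheme is not given in this excerpt, so the sketch below is a minimal, hypothetical reading using fixed 90-degree clockwise steps; the actual method may use finer aspect-angle increments.

```python
import numpy as np

def clockwise_amplify(image: np.ndarray, steps: int = 3) -> list:
    """Expand one SAR image chip into (steps + 1) training samples by
    successive 90-degree clockwise rotations. Hypothetical sketch of
    'clockwise data amplification'; the paper's scheme may differ."""
    samples = [image]
    for k in range(1, steps + 1):
        # np.rot90 with a negative k rotates clockwise
        samples.append(np.rot90(image, k=-k))
    return samples

# Example: one 4x4 chip becomes 4 training samples
chip = np.arange(16, dtype=np.float32).reshape(4, 4)
augmented = clockwise_amplify(chip)
print(len(augmented))  # 4
```

Rotation-based amplification of this kind addresses the overfitting issue named in the abstract by multiplying the effective training-set size without collecting new data.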


Cited by 2 publications (2 citation statements)
References 33 publications
“…AF-CNN [44] is an additional feature-based CNN architecture which does not need any additional preprocessing or pose information, and boosts accuracy by 4.38% compared with SVM. The traditional CNN […]

    Method                                          Accuracy (%)
    SVM [41]                                        90.00
    AlexNet                                         93.55
    AE&LSVM [42]                                    94.14
    DNPP-L1 [43]                                    94.14
    AF-CNN [44]                                     94.38
    Gabor & LPQ & ELM [45]                          94.80
    CNN [46]                                        95.90
    Unsupervised K-means & data amplification [47]  96.67
    LeNet [48]                                      97.29
    ResNet-50 [24]                                  97.66
    SARNet [49]                                     98.30
    JLSND&SRC 1 [50]                                98.30
    Proposed                                        98.93
…”
Section: Performance Comparison With State-of-the-art Algorithms (mentioning, confidence: 99%)
“…There are various methods to reduce the weight of a model, such as adding an attention mechanism, network pruning [20], and separable convolution [21]. For example, in [22], in 2019, Zhao et al. proposed a lightweight CNN model to avoid the overfitting problem caused by the lack of data, and achieved 98.30% accuracy on the MSTAR dataset. In [23], Ying used self-attention and knowledge distillation to achieve weight reduction of the model, and good results were achieved on the MSTAR dataset.…”
Section: Introduction (mentioning, confidence: 99%)
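The citing statement above lists separable convolution as one way to lighten a CNN. A quick parameter-count comparison shows why: a depthwise separable convolution replaces one k×k convolution over all channel pairs with a per-channel k×k depthwise filter plus a 1×1 pointwise convolution. The channel sizes below are illustrative, not taken from SARNet.

```python
def conv_params(c_in: int, c_out: int, k: int) -> int:
    """Parameter count of a standard k x k convolution (no bias)."""
    return c_in * c_out * k * k

def separable_conv_params(c_in: int, c_out: int, k: int) -> int:
    """Depthwise separable convolution: one k x k depthwise filter per
    input channel, followed by a 1x1 pointwise convolution (no bias)."""
    return c_in * k * k + c_in * c_out

# Illustrative layer: 3x3 kernel, 64 input -> 128 output channels
std = conv_params(64, 128, 3)            # 73728
sep = separable_conv_params(64, 128, 3)  # 576 + 8192 = 8768
print(std, sep, round(std / sep, 1))     # roughly 8.4x fewer parameters
```

This parameter reduction is the mechanism behind the "lightweight" CNN designs discussed in the citing papers, independent of which specific trick (pruning, distillation, or separable layers) each one uses.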