2020
DOI: 10.1080/2150704x.2020.1730472
A lossless lightweight CNN design for SAR target recognition

Cited by 34 publications (23 citation statements)
References 13 publications
“…CNN [16] automatically extracts image features; thus, the complex feature extraction and data reconstruction steps of conventional recognition algorithms can be avoided. AlexNet [17], VGGNet [18][19][20][21], GoogLeNet [22,23], ResNet [24][25][26][27], and other networks can be adopted for vehicle type recognition; however, owing to limitations in sample quality and quantity, as well as shortcomings in network feature extraction and classification performance, vehicle recognition accuracy remains relatively low.…”
Section: Related Work
confidence: 99%
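The excerpt above describes adopting standard CNN backbones for vehicle type recognition. As an illustration only (not the setup of the cited works), a minimal fine-tuning sketch assuming PyTorch/torchvision; the class count, input size, and single-channel handling are assumptions:

```python
# Minimal sketch (illustrative, assuming torchvision >= 0.13): adapting an
# ImageNet-pretrained ResNet backbone to a small target-recognition task.
import torch
import torch.nn as nn
from torchvision import models

num_classes = 10  # e.g., a 10-class vehicle set; illustrative only

# Load a pretrained backbone and replace the classifier head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Single-channel SAR chips can be repeated to 3 channels to match the backbone.
chips = torch.rand(8, 1, 128, 128).repeat(1, 3, 1, 1)
logits = model(chips)                      # shape: (8, num_classes)
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, num_classes, (8,)))
loss.backward()                            # a standard fine-tuning step follows
```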
“…Chen et al [28] combined weight-based pruning with adaptive architecture squeezing to obtain a high compression ratio in CNNs, and utilized pruning to find an appropriate squeezing ratio. In [29], a lossless lightweight CNN design strategy is explored for SAR target recognition using structured pruning and knowledge distillation. Pruning methods can reduce the number of parameters as well as the computational complexity, but accuracy may degrade because each network layer has to be retrained iteratively.…”
Section: Related Work
confidence: 99%
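A short sketch of the knowledge-distillation ingredient mentioned in [29], assuming a PyTorch setup; the temperature and loss weighting are illustrative values, not the authors' settings:

```python
# Minimal sketch (assumptions, not the authors' exact method): response-based
# knowledge distillation from a full teacher CNN to a pruned student.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend the soft-target KL term (temperature T) with hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Inside a training loop (teacher frozen, pruned student being fine-tuned):
#   with torch.no_grad():
#       teacher_logits = teacher(batch)
#   loss = distillation_loss(student(batch), teacher_logits, targets)
#   loss.backward()
```

The soft-target term transfers the teacher's class-similarity structure to the pruned student, which is what allows the compressed network to recover accuracy after structured pruning.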
“…Parameter pruning [26], [36] yields an effective reduction in network size; however, it relies on hardware and computing-library support because the parameter kernels become sparse after pruning. Convolutional filter pruning [26], [36], [52], [53] avoids this unfriendliness toward hardware and the Basic Linear Algebra Subprograms (BLAS) library. However, an accuracy drop is commonly encountered with both approaches.…”
Section: B. Network Compression for Image Recognition
confidence: 99%
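To make the unstructured-versus-structured distinction concrete, a minimal sketch using PyTorch's pruning utilities (illustrative, not the methods of [26], [36], [52], [53]): unstructured pruning leaves sparse kernels that only pay off with sparse-aware hardware or BLAS support, while filter pruning zeroes whole output channels and stays friendly to dense libraries:

```python
# Minimal sketch (illustrative) of parameter (unstructured) vs. filter
# (structured) pruning with torch.nn.utils.prune.
import torch.nn as nn
import torch.nn.utils.prune as prune

# Unstructured: zero out 50% of individual weights by L1 magnitude.
conv_a = nn.Conv2d(64, 128, kernel_size=3, padding=1)
prune.l1_unstructured(conv_a, name="weight", amount=0.5)

# Structured: zero out 25% of output filters (dim=0) by L2 norm.
conv_b = nn.Conv2d(64, 128, kernel_size=3, padding=1)
prune.ln_structured(conv_b, name="weight", amount=0.25, n=2, dim=0)

# prune.remove folds the mask into the weight permanently; physically shrinking
# the layer after filter pruning is a separate network-rebuild step.
prune.remove(conv_a, "weight")
prune.remove(conv_b, "weight")
```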