2020
DOI: 10.1002/int.22302
One‐dimensional convolutional neural networks for high‐resolution range profile recognition via adaptively feature recalibrating and automatically channel pruning

Abstract: High-resolution range profile (HRRP) has received intensive attention in radar target recognition, and convolutional neural networks (CNNs) are among the predominant approaches to HRRP recognition problems. However, most CNNs are designed by rule of thumb and suffer from considerable computational complexity. Aiming at enhancing the channels of a one-dimensional CNN (1D-CNN) for extracting efficient structural information of targets from HRRP and reducing the computational complexity, we propose a novel frame…
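The abstract describes adaptive feature recalibration of the channels of a 1D-CNN. The paper's exact mechanism is truncated here, so the following is only a minimal sketch of a generic squeeze-and-excitation-style channel recalibration on a 1D feature map; the function name, gating-MLP shapes, and reduction size are hypothetical choices for illustration, not the authors' design.

```python
import numpy as np

def channel_recalibrate(x, w1, w2):
    """Squeeze-and-excitation-style recalibration of a 1D feature map.
    x: (channels, length) feature map; w1, w2: weights of a small
    gating MLP (shapes here are illustrative assumptions)."""
    z = x.mean(axis=1)                    # squeeze: global average pool per channel
    s = np.maximum(w1 @ z, 0.0)           # excitation: ReLU bottleneck
    g = 1.0 / (1.0 + np.exp(-(w2 @ s)))   # sigmoid gates in (0, 1)
    return x * g[:, None]                 # rescale each channel by its gate

rng = np.random.default_rng(0)
c, l, r = 8, 32, 2                        # channels, length, reduction size
x = rng.normal(size=(c, l))
w1 = rng.normal(size=(r, c))
w2 = rng.normal(size=(c, r))
y = channel_recalibrate(x, w1, w2)
print(y.shape)  # (8, 32)
```

Because the gates lie in (0, 1), each channel is attenuated rather than amplified; a trained network learns which channels to suppress.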

Cited by 27 publications (25 citation statements) · References 39 publications
“…They defined the embedding table search space by blocks and adjusted the controller with the validation set, making the model automatically determine the embedding size by maximizing accuracy under the constraint of the embedding table memory. On top of that, other neural search approaches such as Neural Architecture Search (NAS), 17 Efficient Neural Architecture Search (ENAS), 18 and Differentiable Architecture Search (DAS) 19 are also used to automatically determine embedding sizes, and some of them have been deployed in industrial media recommender systems and achieved measurable benefits. Model Compression: 20,21 Common model compression technologies mainly include pruning, 22 quantization, 23 and distillation. 24 For example, Liu et al 10 compressed the embedding table by pruning the embedded vectors of each feature domain of the data.…”
Section: Related Work
confidence: 99%
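The excerpt above lists pruning among the common compression techniques and cites Liu et al's pruning of embedding vectors. As a generic illustration of the idea (not Liu et al's actual method), the sketch below applies simple magnitude pruning to an embedding table; the function name and `keep_ratio` value are hypothetical.

```python
import numpy as np

def prune_embeddings(table, keep_ratio=0.5):
    """Zero out the smallest-magnitude weights of an embedding table,
    keeping roughly `keep_ratio` of them. A generic magnitude-pruning
    sketch, not any specific paper's algorithm."""
    flat = np.abs(table).ravel()
    k = int(flat.size * keep_ratio)
    thresh = np.partition(flat, -k)[-k]   # k-th largest magnitude
    mask = np.abs(table) >= thresh
    return table * mask, mask

rng = np.random.default_rng(1)
table = rng.normal(size=(100, 16))        # 100 tokens, 16-dim embeddings
pruned, mask = prune_embeddings(table, keep_ratio=0.25)
print(round(mask.mean(), 2))              # fraction of weights kept
```

In practice the surviving weights are usually fine-tuned afterward, and the zeroed entries can be stored in a sparse format to realize the memory savings.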
“…Model Compression. 20,21 Common model compression technologies mainly include pruning, 22 quantization, 23 and distillation. 24 For example, Liu et al 10 compressed the embedding table by pruning the embedded vectors of each feature domain of the data.…”
Section: Lightweight Embedding
confidence: 99%
“…2D CNNs have been successfully used to learn and reconstruct features from raw data and have become the dominant approach for recognition and detection tasks in image and speech analysis [33]. Owing to the favorable learning characteristics of CNNs, 1D CNNs have been proposed, building on 2D CNNs, to handle 1D signals, and have achieved superior performance with high efficiency [34,35]. To adapt to the characteristics of 1D signals, the hierarchical architecture of 1D CNNs is simplified compared with that of 2D CNNs [36].…”
Section: Introduction
confidence: 99%
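The citing passage contrasts 1D CNNs, which slide a filter along a single axis, with 2D CNNs. A minimal sketch of the core operation of a 1D convolutional layer (a "valid" cross-correlation, as deep-learning libraries implement convolution) may make the distinction concrete; the function name and the edge-detecting filter are illustrative only.

```python
import numpy as np

def conv1d(signal, kernel):
    """'Valid' 1D cross-correlation, the core op of a 1D CNN layer:
    the kernel slides along a single axis of the input signal."""
    n, k = len(signal), len(kernel)
    return np.array([signal[i:i + k] @ kernel for i in range(n - k + 1)])

x = np.array([1., 2., 3., 4., 5.])   # a toy 1D signal (e.g. one HRRP sample)
w = np.array([1., 0., -1.])          # simple difference (edge-detecting) filter
print(conv1d(x, w))                  # [-2. -2. -2.]
```

A 2D layer would instead slide the kernel over two spatial axes; restricting to one axis is what makes the 1D architecture cheaper and structurally simpler, as the excerpt notes.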