2021
DOI: 10.1109/tip.2020.3036770

A Lightweight Spatial and Temporal Multi-Feature Fusion Network for Defect Detection

Abstract: This paper proposes a hybrid multi-dimensional feature-fusion structure, a spatial and temporal segmentation model for automated thermography defect detection. In addition, the newly designed attention block encourages local interaction among neighboring pixels to recalibrate the feature maps adaptively. A Sequence-PCA layer is embedded in the network to provide enhanced semantic information. The final model results in a lightweight structure with a smaller number of parameters and yet yields uncompromisin…
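The attention block described in the abstract recalibrates each feature map using information from neighboring pixels. The following PyTorch sketch is only a rough illustration of that general idea of neighborhood-based recalibration, gating each pixel with a signal aggregated by a depthwise convolution; the module name, kernel size, and sigmoid gating are assumptions for illustration, not the authors' published design.

```python
import torch
import torch.nn as nn

class LocalRecalibration(nn.Module):
    """Illustrative attention block: each pixel is re-weighted by a gate
    computed from its local neighborhood (kernel_size is an assumption)."""
    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        # Depthwise convolution aggregates information from neighboring pixels
        # without mixing channels; the sigmoid turns it into a per-pixel gate.
        self.local_mix = nn.Conv2d(channels, channels, kernel_size,
                                   padding=kernel_size // 2, groups=channels)
        self.gate = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn = self.gate(self.local_mix(x))   # (N, C, H, W) gate in [0, 1]
        return x * attn                       # adaptive recalibration

# Usage on a dummy feature map
feat = torch.randn(1, 64, 32, 32)
print(LocalRecalibration(64)(feat).shape)  # torch.Size([1, 64, 32, 32])
```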

Cited by 64 publications (17 citation statements)
References 31 publications
“…Experiments were conducted to identify the optimal batch size, number of epochs, and number of nodes for each machine learning model, i.e., LR, ANN with a single hidden layer (ANN1), ANN with two hidden layers (ANN2), ANN with three hidden layers (ANN3), GRU, LSTM, and Bi-LSTM. The optimal parameterization was carried out over the following sets [55]: batch size [8, 16, 32, 64, 128], epochs [10, 20, 30, 40, 50], and number of neurons per hidden layer [8, 16, 32, 64, 128, 256, 512]. For all experiments, the ANN parameters were configured as follows: a uniform weight-initialization function, ReLU activation in the hidden layers, and sigmoid activation in the output layer.…”
Section: Experimental Results and Analysis
confidence: 99%
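The excerpt above describes a grid search over batch size, epochs, and hidden-layer width. The sketch below shows how such a search might look for the single-hidden-layer ANN case, assuming a Keras binary-classification setup; the random placeholder data, model builder, and validation metric are illustrative assumptions, not details from the cited study.

```python
import itertools
import numpy as np
from tensorflow import keras

# Placeholder data standing in for the cited study's dataset (assumption).
X = np.random.rand(500, 20)
y = np.random.randint(0, 2, 500)

def build_ann(n_neurons: int, n_features: int) -> keras.Model:
    """Single-hidden-layer ANN: uniform initializer, ReLU hidden, sigmoid output."""
    model = keras.Sequential([
        keras.layers.Input(shape=(n_features,)),
        keras.layers.Dense(n_neurons, activation="relu",
                           kernel_initializer="random_uniform"),
        keras.layers.Dense(1, activation="sigmoid",
                           kernel_initializer="random_uniform"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Search sets taken from the excerpt above.
batch_sizes = [8, 16, 32, 64, 128]
epoch_counts = [10, 20, 30, 40, 50]
neuron_counts = [8, 16, 32, 64, 128, 256, 512]

best_config, best_acc = None, -1.0
for bs, ep, units in itertools.product(batch_sizes, epoch_counts, neuron_counts):
    model = build_ann(units, X.shape[1])
    hist = model.fit(X, y, batch_size=bs, epochs=ep,
                     validation_split=0.2, verbose=0)
    val_acc = hist.history["val_accuracy"][-1]
    if val_acc > best_acc:
        best_config, best_acc = (bs, ep, units), val_acc

print("best (batch, epochs, neurons):", best_config, "val_accuracy:", best_acc)
```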
“…Sun et al. [24] proposed an adaptive saliency-biased loss (ASBL) to train RetinaNet and dramatically improved detection performance in ORSIs. In addition, the works in [25,26] proposed advanced object detection architectures that incorporate both spatial- and temporal-domain information in the decision. However, these axis-aligned bounding-box object detectors are still confronted with the challenge of arbitrary orientations in ORSIs.…”
Section: One-stage Object Detection Methods
confidence: 99%
“…By modifying the GAN loss and the penalty loss, the detection rate during training is significantly improved. Hu et al. [170] embedded a Sequence-PCA (Principal Component Analysis) layer in a deep learning network and designed a new attention block for automated thermography defect detection. The test results verify that the proposed model captures semantic information better and improves the detection rate in an end-to-end procedure.…”
Section: Data Management
confidence: 99%
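The Sequence-PCA idea mentioned in the excerpt above can be illustrated with plain NumPy: applying PCA along the frame axis of a thermographic sequence yields a few principal-component images that can serve as compact, information-rich inputs to the network. The function name, array shapes, and component count below are assumptions for illustration, not the exact layer from Hu et al.

```python
import numpy as np

def sequence_pca(frames: np.ndarray, n_components: int = 3) -> np.ndarray:
    """Compress a thermographic sequence (T, H, W) into n_components
    principal-component images (n_components, H, W) via PCA over time."""
    T, H, W = frames.shape
    X = frames.reshape(T, H * W).astype(np.float64)
    X -= X.mean(axis=0, keepdims=True)          # center each pixel's time series
    # SVD of the (T x H*W) matrix; rows of Vt are spatial principal components.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:n_components].reshape(n_components, H, W)

# Usage on a dummy 50-frame sequence
seq = np.random.rand(50, 64, 64)
print(sequence_pca(seq).shape)  # (3, 64, 64)
```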