2022
DOI: 10.3390/e24091186
Working Condition Recognition Based on Transfer Learning and Attention Mechanism for a Rotary Kiln

Abstract: It is difficult to identify the working conditions of rotary kilns due to the harsh environment inside them. The flame images of the firing zone contain a great deal of working-condition information, but the flame-image sample size is too small to fully extract the key features. To solve this problem, a method combining transfer learning and an attention mechanism is proposed to extract key features of flame images, in which a deep residual network is used as the backbone net…

Cited by 3 publications (2 citation statements)
References 32 publications
“…Numerous studies have been conducted to address the issue of small sample learning. Transfer learning has been widely promoted and applied as a solution, with pretrained models based on ImageNet being utilized to adapt the fully connected structure for classification problems. These studies have shown that pretrained model parameters exhibit high accuracy in the target domain. Similar studies have confirmed this phenomenon, indicating that pretraining on large data sets enables the model to summarize the key features of image data through multiple layers of neural network mapping, resulting in strong generalization performance.…”
Section: Introduction (supporting)
confidence: 53%
“…These studies have shown that pretrained model parameters exhibit high accuracy in the target domain.[23] Similar studies have confirmed this phenomenon,[24] indicating that pretraining on large data sets enables the model to summarize the key features of image data through multiple layers of neural network mapping, resulting in strong generalization performance. This insight is particularly valuable for small sample data sets.…”
Section: Introduction (mentioning)
confidence: 76%