2023
DOI: 10.3390/electronics12122706
Attention Mechanisms in Convolutional Neural Networks for Nitrogen Treatment Detection in Tomato Leaves Using Hyperspectral Images

Abstract: Nitrogen is an essential macronutrient for the growth and development of tomatoes. However, excess nitrogen fertilization can affect the quality of tomato fruit, making it unattractive to consumers. Consequently, the aim of this study is to develop a method for the early detection of excessive nitrogen fertilizer use in Royal tomato by visible and near-infrared spectroscopy. Spectral reflectance values of tomato leaves were captured at wavelengths between 400 and 1100 nm, collected from several treatments afte…

Cited by 3 publications (4 citation statements)
References 41 publications
“…Attention modules have been shown to enhance the feature extraction and generalization performance of neural networks [39,40]. CBAM can be directly embedded into current mainstream CNN network structures, enhancing the feature extraction capabilities without significantly increasing computational and parameter requirements.…”
Section: Deep Residual Neural Network
confidence: 99%
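To make the quoted point concrete, the sketch below shows one common way CBAM is embedded into a residual CNN block in PyTorch: a shared-MLP channel gate followed by a pooled spatial gate, applied before the skip addition. This is a minimal illustration under assumed layer sizes, not the implementation used in the cited paper; all class names and parameters are hypothetical.

```python
# Minimal PyTorch sketch of CBAM dropped into a residual block (illustrative only).
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Shared MLP applied to both the average-pooled and max-pooled descriptors.
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))            # global average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))             # global max pooling
        w = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * w                                  # re-weight channels


class SpatialAttention(nn.Module):
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = x.mean(dim=1, keepdim=True)             # channel-wise average map
        mx = x.amax(dim=1, keepdim=True)              # channel-wise max map
        w = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * w                                  # re-weight spatial positions


class ResidualCBAMBlock(nn.Module):
    """A plain residual block with CBAM inserted before the skip addition."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.body(x)
        y = self.sa(self.ca(y))                       # CBAM: channel gate, then spatial gate
        return torch.relu(x + y)
```

Because the two gates add only a small MLP and a single convolution, the extra parameter and compute cost stays low, which is the property the citing authors highlight.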
“…SK modules and FCA modules enhanced the effectiveness of spectral attention with different scales of features [156][157][158] and frequency features [166][167][168], respectively. By introducing adaptive average pooling [160] and global max-pooling [161][162][163][164], PA modules and spe-CBAMs perceived different scales of contextual information and global salient responses, separately. These attention modules have become powerful means to capture salient bands for discriminating spectral features.…”
Section: Spectral Attention
confidence: 99%
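The statement above pairs two pooling strategies; the sketch below illustrates how a spectral (band-wise) attention layer might combine them, with adaptive average pooling supplying coarse contextual information and global max pooling supplying salient responses. It is a hedged sketch assuming a 2-D feature cube with bands on the channel axis; the module name, pooling grid, and reduction ratio are illustrative assumptions, not taken from the cited works.

```python
# Illustrative spectral (band-wise) attention combining adaptive average pooling
# with global max pooling; names and sizes are assumptions, not the cited designs.
import torch
import torch.nn as nn


class SpectralAttention(nn.Module):
    def __init__(self, num_bands: int, reduction: int = 8, pool_size: int = 2):
        super().__init__()
        self.ctx_pool = nn.AdaptiveAvgPool2d(pool_size)   # coarse spatial context
        self.max_pool = nn.AdaptiveMaxPool2d(1)           # global salient response
        self.mlp = nn.Sequential(
            nn.Linear(num_bands * (pool_size ** 2 + 1), num_bands // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(num_bands // reduction, num_bands),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, bands, height, width) hyperspectral feature cube
        b, bands, _, _ = x.shape
        ctx = self.ctx_pool(x).flatten(1)                 # (b, bands * pool_size^2)
        sal = self.max_pool(x).flatten(1)                 # (b, bands)
        w = torch.sigmoid(self.mlp(torch.cat([ctx, sal], dim=1)))
        return x * w.view(b, bands, 1, 1)                 # re-weight spectral bands


# Example with hypothetical sizes: 64 band-features over a 32 x 32 leaf patch.
cube = torch.randn(4, 64, 32, 32)
attended = SpectralAttention(num_bands=64)(cube)
print(attended.shape)                                     # torch.Size([4, 64, 32, 32])
```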
“…GE modules [169] utilize depth-wise convolution to gather and assess the correlations between spectral features in small regions and resize aggregated weights for adjustment [170][171][172]. To consider more useful information of input, spa-CBAMs [147] introduce global average pooling and max-pooling layers before convolution, which improve spatial attention without increasing the number of parameters [161][162][163][164][173][174][175]. Different from GE modules and CBAMs, 1 × 1 × 1 convolutional layers were exploited in BAMs [176] to compress and transform the information in spectral and channel dimensions [177][178][179], which enhanced the adaptation of spatial attention.…”
Section: Convolution-based Spatial Attention
confidence: 99%
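Of the three spatial-attention variants the statement contrasts, the sketch below illustrates the GE-style one: a depth-wise convolution gathers correlations over small regions, and the aggregated weights are resized back to the input resolution before gating. This is a minimal, assumed rendering; the extent, channel count, and names are illustrative.

```python
# Illustrative GE-style ("gather-excite") spatial gate; parameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatherExciteSpatial(nn.Module):
    def __init__(self, channels: int, extent: int = 4):
        super().__init__()
        # Depth-wise convolution: one filter per channel, stride = extent,
        # so each output location summarises an extent x extent region.
        self.gather = nn.Conv2d(
            channels, channels,
            kernel_size=extent, stride=extent,
            groups=channels, bias=False,
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gathered = self.gather(x)                           # coarse per-region summaries
        # Resize the aggregated weights back to the feature-map resolution.
        weights = F.interpolate(gathered, size=x.shape[-2:], mode="nearest")
        return x * torch.sigmoid(weights)                   # spatial re-weighting


# Example with hypothetical sizes: gate a 64-channel, 32 x 32 feature map.
feats = torch.randn(2, 64, 32, 32)
print(GatherExciteSpatial(64)(feats).shape)                 # torch.Size([2, 64, 32, 32])
```

By contrast, a spa-CBAM-style gate builds its map from channel-wise average and max pooling followed by a single convolution (as in the earlier CBAM sketch), while a BAM-style gate compresses and transforms the channel dimension with 1 × 1 convolutions before producing the map.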