2023
DOI: 10.1109/jstars.2023.3288521
Expansion Spectral–Spatial Attention Network for Hyperspectral Image Classification

Cited by 6 publications (2 citation statements)
References 48 publications
“…To address the issues raised regarding CNNs, such as parameter redundancy and a limited ability to handle spatial and scale variations, researchers have introduced various CNN variants to enhance the performance and adaptability of the models. Wang et al. [34] proposed a model known as the expansion spectral-spatial attention network. It introduces expansion convolution as a dual-branch feature extraction unit, which expands the receptive field of input patches and enhances the perception of large-scale features.…”
Section: Introduction
Confidence: 99%
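
The quoted statement describes expansion (dilated) convolution used as a dual-branch feature extraction unit to enlarge the receptive field. Below is a minimal PyTorch-style sketch of that idea, assuming one standard 3x3 branch alongside one dilated 3x3 branch; the class name, channel widths, and dilation rate are illustrative assumptions, not the exact architecture of Wang et al. [34].

```python
import torch
import torch.nn as nn

class DualBranchExpansionUnit(nn.Module):
    """Illustrative dual-branch unit: a standard 3x3 convolution branch
    plus a dilated ("expansion") 3x3 branch whose larger receptive field
    captures larger-scale spatial context. Channel counts and the
    dilation rate are assumptions, not values from Wang et al. [34]."""

    def __init__(self, in_channels: int, out_channels: int, dilation: int = 2):
        super().__init__()
        # Local branch: plain 3x3 conv, receptive field 3x3.
        self.local_branch = nn.Sequential(
            nn.Conv2d(in_channels, out_channels // 2, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_channels // 2),
            nn.ReLU(inplace=True),
        )
        # Expansion branch: dilated 3x3 conv; with dilation d the
        # effective receptive field grows to (2d + 1) x (2d + 1)
        # without adding parameters.
        self.expansion_branch = nn.Sequential(
            nn.Conv2d(in_channels, out_channels // 2, kernel_size=3,
                      padding=dilation, dilation=dilation),
            nn.BatchNorm2d(out_channels // 2),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Concatenate both branches along the channel axis so local and
        # large-scale features are both available downstream.
        return torch.cat([self.local_branch(x), self.expansion_branch(x)], dim=1)

# Usage on a hyperspectral patch of shape (batch, bands, height, width):
patch = torch.randn(4, 103, 11, 11)   # e.g., 103 bands as in Pavia University
unit = DualBranchExpansionUnit(in_channels=103, out_channels=64)
features = unit(patch)                # -> torch.Size([4, 64, 11, 11])
```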
“…However, they treat all features equally without considering their differing significance. Moreover, their inherent structure limits their efficiency in capturing global contextual information and establishing long-range dependencies [36]. (2) Transformers are good at global interaction and capturing salient features.…”
Section: Introduction
Confidence: 99%