Self-Supervised Convolutional Neural Network via Spectral Attention Module for Hyperspectral Image Classification
2022 · DOI: 10.1109/lgrs.2022.3141870

Cited by 4 publications (2 citation statements)
References 23 publications

“…In this technique, only unclassified data are applied to develop pretext learning assignment (e.g., image rotation, context prediction, etc.), whereby the target may be computed unsupervised [68–71]. An example of this learning type refers to autoencoders; an NN that develops compact input sample representation [72,73].…”
Section: Self-Supervised Learning (mentioning)
Confidence: 99%
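The quoted statement describes autoencoders as networks trained without labels to learn a compact representation of their input. A minimal sketch of that idea is shown below, assuming PyTorch; the band count, bottleneck size, and layer widths are hypothetical values chosen for illustration and are not taken from the cited paper.

```python
import torch
import torch.nn as nn

class SpectralAutoencoder(nn.Module):
    """Toy autoencoder for per-pixel spectra (hypothetical sizes)."""
    def __init__(self, n_bands: int = 200, latent_dim: int = 32):
        super().__init__()
        # Encoder compresses a spectrum into a low-dimensional code.
        self.encoder = nn.Sequential(
            nn.Linear(n_bands, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # Decoder reconstructs the spectrum from that code.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, n_bands),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# Unsupervised training loop: the reconstruction target is the input itself,
# so no class labels are needed (reconstruction acts as the pretext task).
model = SpectralAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

spectra = torch.rand(64, 200)          # stand-in batch of 64 pixel spectra
for _ in range(10):
    recon = model(spectra)
    loss = loss_fn(recon, spectra)     # target computed from the data itself
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

codes = model.encoder(spectra)         # compact 32-D representation per pixel
```

After training, the encoder output can serve as a compact feature for a downstream classifier, which is the sense in which the citing papers group autoencoders under self-supervised learning.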
“…The drawbacks of CNN are intense human interference, local minima, and slow convergence rate. The great success achieved by ImageNet models led CNNs to improve their efficacy in several domains [71,163–167].…”
Section: Convolutional Neural Network (CNN) (mentioning)
Confidence: 99%