2021
DOI: 10.1016/j.patcog.2021.107967
Nonlocal graph theory based transductive learning for hyperspectral image classification

Cited by 33 publications (13 citation statements)
References 38 publications
“…The Jasper dataset has four endmember contributions, corresponding to tree, soil, water, and road. A graph-based architecture is proposed in [33] to classify the endmember contributions, and the authors compare its classification performance with machine learning techniques such as SVM, KNN, LDAKNN, PCAKNN, KPCAKNN, LDASVM, PCASVM, KPCASVM, and a Convolutional Neural Network (CNN); Linear Discriminant Analysis (LDA), PCA, and Kernel PCA (KPCA) in these names denote the preprocessing step applied before the classifier. The per-class accuracies for the Jasper image are as follows: the soil class obtained 100% accuracy using the PCAKNN method and 99.905% with SVM; for water, the TLM-2 classifier obtained 98.959%; finally, for the tree class, TLM-2 obtained 97.622%.…”
Section: Discussion Of Ensemble Model Results For Jasper HSI
confidence: 99%
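As context for the comparison above, the sketch below shows what a PCA-plus-KNN pixel-classification baseline of the PCAKNN kind looks like in practice. The number of principal components, the neighbour count, the train/test split, and the synthetic stand-in data are illustrative assumptions, not settings taken from the cited papers.

```python
# Minimal sketch of a PCA + KNN pixel-classification baseline ("PCAKNN"-style).
# All hyperparameters and the synthetic data are assumptions for illustration.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# X: (n_pixels, n_bands) hyperspectral pixels, y: per-pixel class labels.
# Synthetic stand-in data; a real run would use the Jasper Ridge cube.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 198))
y = rng.integers(0, 4, size=2000)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.8, stratify=y, random_state=0
)

pca_knn = make_pipeline(
    StandardScaler(),                     # band-wise standardization
    PCA(n_components=30),                 # spectral dimensionality reduction (assumed)
    KNeighborsClassifier(n_neighbors=5),  # KNN classifier on reduced features
)
pca_knn.fit(X_tr, y_tr)
print("overall accuracy:", accuracy_score(y_te, pca_knn.predict(X_te)))
```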
“…On the other hand, for two classes using the label sub-set of non-belong (NB) and belong (B), we obtained 91.67%, 100.00%, and 100.00% for the three classes of trees, water, and soil, respectively. With this two-sub-label approach we improve on the results of the method proposed in [33]. The best scaling method for the Jasper dataset was min-max scaling.…”
Section: Discussion Of Ensemble Model Results For Jasper HSI
confidence: 99%
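A minimal sketch of the belong / non-belong (B/NB) sub-labelling with min-max scaling described above, assuming a generic SVM as the per-class classifier; the chosen target class and the synthetic data are illustrative stand-ins, not the cited authors' ensemble model.

```python
# Sketch of min-max scaling plus "belong / non-belong" binary sub-labelling.
# Target class, classifier, and data are assumptions for illustration only.
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 198))             # stand-in hyperspectral pixels
y = rng.integers(0, 4, size=1000)            # original multi-class labels

target_class = 2                             # e.g. "water"; assumed index
y_binary = (y == target_class).astype(int)   # 1 = belong (B), 0 = non-belong (NB)

X_scaled = MinMaxScaler().fit_transform(X)   # per-band min-max scaling
clf = SVC(kernel="rbf").fit(X_scaled, y_binary)
print("training accuracy:", clf.score(X_scaled, y_binary))
```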
“…For heterogeneous areas in an HSI, a fixed small region may contain pixels from different categories, so using only a fixed-size square window is not feasible [43]. The nonlocal neighborhood idea [44] allows us to adopt many diverse regions to integrate spectral and spatial information; the detailed pipeline of nonlocal diverse regions is shown in Figure 2. Spectral-spatial classification techniques based on neural networks can then exploit the spatial correlation of adjacent pixels [45].…”
Section: A Nonlocal Spectral Sequence Data
confidence: 99%
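To make the nonlocal-neighborhood idea concrete, the sketch below searches the whole image for the patches most spectrally similar to a target patch, instead of relying on a single fixed square window. The patch size, the number of retrieved patches, the Euclidean similarity measure, and the brute-force search are simplifying assumptions, not the pipeline of [44] or Figure 2.

```python
# Hedged sketch of a nonlocal patch search: for one pixel, collect the k patches
# anywhere in the image whose spectra are most similar to the target patch.
import numpy as np

def nonlocal_patches(cube, row, col, patch=5, k=8):
    """Return the (row, col) centers of the k patches most similar to the patch at (row, col)."""
    h, w, _ = cube.shape
    r = patch // 2
    ref = cube[row - r:row + r + 1, col - r:col + r + 1, :].ravel()
    centers, dists = [], []
    for i in range(r, h - r):
        for j in range(r, w - r):
            if (i, j) == (row, col):
                continue
            cand = cube[i - r:i + r + 1, j - r:j + r + 1, :].ravel()
            centers.append((i, j))
            dists.append(np.linalg.norm(ref - cand))   # spectral-spatial distance
    order = np.argsort(dists)[:k]                      # keep the k most similar patches
    return [centers[t] for t in order]

cube = np.random.default_rng(2).normal(size=(40, 40, 20))  # tiny stand-in HSI cube
print(nonlocal_patches(cube, row=20, col=20))
```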
“…Therefore, SSL is gaining researchers' interest. 41 Self-training, 42 Active Learning (AL), 43 graph-based learning 44 and transductive learning 45 are some of the semi-supervised learning methodologies used to improve learning performance. The primary goal of all these investigations is to increase the effective training sample size.…”
Section: Semi-supervised Approaches For Classification
confidence: 99%
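As one concrete instance of the self-training methodology listed above, the hedged sketch below retrains a base classifier on its own most confident pseudo-labels for unlabelled pixels; the base SVM, the confidence threshold, and the synthetic data are assumptions for illustration only.

```python
# Sketch of self-training SSL: unlabelled samples are marked with -1 and the
# base classifier pseudo-labels the ones it predicts with high confidence.
import numpy as np
from sklearn.semi_supervised import SelfTrainingClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = rng.normal(size=(600, 30))               # stand-in spectral features
y_true = rng.integers(0, 4, size=600)
y = y_true.copy()
y[rng.random(600) > 0.05] = -1               # keep ~5% of labels; -1 = unlabelled

base = SVC(probability=True, gamma="auto")   # base learner must expose predict_proba
self_trained = SelfTrainingClassifier(base, threshold=0.8).fit(X, y)
print("samples labelled after self-training:",
      int(np.sum(self_trained.transduction_ != -1)))
```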
“…The transductive methodology requires fewer resources because it is primarily concerned with lowering the inference error on a given set of unlabelled data, rather than with enhancing the overall quality of the learned hypothesis. Hence, Huang et al. 45 proposed a transductive learning strategy that overcomes the difficulties of limited labelled training samples and high dimensionality by extending nonlocal graph theory to the classification label space. Furthermore, Appice et al. 53 used the S2CoTraC algorithm to implement a transductive learning technique based on a co-training schema.…”
Section: Semi-supervised Approaches For Classification
confidence: 99%
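For context on the transductive setting described above, the sketch below runs a generic graph-based transductive classifier that propagates a few pixel labels over a similarity graph to the remaining unlabelled pixels. It uses scikit-learn's LabelSpreading as a stand-in and does not reproduce Huang et al.'s nonlocal graph construction or label-space extension; the kernel parameters and synthetic data are assumptions.

```python
# Generic graph-based transductive classification: labels from a few annotated
# pixels are spread over an RBF similarity graph to all unlabelled pixels.
import numpy as np
from sklearn.semi_supervised import LabelSpreading

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 30))            # stand-in spectral features
y_true = rng.integers(0, 4, size=500)

y = y_true.copy()
unlabelled = rng.random(500) > 0.05       # keep ~5% of labels
y[unlabelled] = -1                        # -1 marks unlabelled samples

model = LabelSpreading(kernel="rbf", gamma=0.5, alpha=0.2)
model.fit(X, y)                           # transduction: labels only this unlabelled set
acc = (model.transduction_[unlabelled] == y_true[unlabelled]).mean()
print("transductive accuracy on unlabelled pixels:", acc)
```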