2019
DOI: 10.3390/rs11242892

Multiple Kernel Feature Line Embedding for Hyperspectral Image Classification

Abstract: In this study, a novel multiple kernel FLE (MKFLE) based on general nearest feature line embedding (FLE) transformation is proposed and applied to hyperspectral image (HSI) classification, in which the advantage of multiple kernel learning is considered. FLE has shown its discriminative capability in many applications. However, since the conventional linear principal component analysis (PCA) pre-processing method in FLE cannot effectively extract nonlinear information, the multiple kernel PCA…
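As a rough illustration of the multiple-kernel pre-processing idea described in the abstract, the sketch below combines several RBF kernel matrices into one weighted composite kernel and runs kernel PCA on it with scikit-learn. The gamma values, kernel weights, and random data are assumptions chosen for illustration; they are not the paper's settings or code.

```python
# Sketch: weighted combination of base RBF kernels followed by kernel PCA,
# as a stand-in for the multiple kernel PCA pre-processing step mentioned
# in the abstract. Weights and gammas are illustrative assumptions; a full
# MKL approach would learn the weights instead of fixing them.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 200))          # 100 pixels, 200 spectral bands (toy data)

gammas = [1e-3, 1e-2, 1e-1]              # candidate kernel widths (assumed)
weights = [0.2, 0.5, 0.3]                # fixed kernel weights (assumed)

# Weighted sum of base kernel matrices -> one composite kernel matrix.
K = sum(w * rbf_kernel(X, gamma=g) for w, g in zip(weights, gammas))

# Kernel PCA on the precomputed composite kernel.
kpca = KernelPCA(n_components=30, kernel="precomputed")
Z = kpca.fit_transform(K)                # nonlinear features that FLE could then use
print(Z.shape)                           # (100, 30)
```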

Cited by 7 publications (5 citation statements) · References 22 publications

“…Imani et al. used a combination of median-mean line (MML) and feature line (FL) metrics to eliminate the negative impact of outliers on the class mean and improve the efficiency of the algorithm's dimensionality reduction [25]. In addition, scholars have also proposed extended versions of feature space-based algorithms, such as orthogonal nearest neighbor feature line embedding (ONNFLE) [26], fuzzy kernel NFLE (FKN-FLE) [27], multiple kernel feature line embedding (MKFLE) [28], space-to-space (S2S)-based metric learning (FSDML) [29], support vector machine-based feature line embedding (SVMFLE) [30], and other methods.…”
Section: Introduction
confidence: 99%
“…Algorithms that rely on sample point-to-point (P2P) metric learning have restricted generalization capabilities and are incapable of extracting more discriminative information for subsequent fault classification. Hence, nearest feature space embedding (NFSE) [6], weighted feature line embedding (WFLE) [1], multiple kernel feature line embedding (MKFLE) [4], and others based on point-to-space (P2S) metrics have been introduced. These algorithms utilize the P2S metric, which not only compresses the feature space dimension but also enables the extraction of additional information for fault identification.…”
Section: Introduction
confidence: 99%
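A concrete way to read the point-to-space (P2S) idea shared by these feature-line methods is the distance from a query point to the line spanned by two same-class prototypes, rather than to the prototypes themselves. The NumPy sketch below is a minimal, assumed illustration of that nearest-feature-line distance; the function name and sample vectors are hypothetical and not taken from the cited papers.

```python
# Sketch of a point-to-line (feature line) distance, the simplest
# point-to-space (P2S) metric underlying NFL/FLE-style methods.
# Names and data are illustrative assumptions, not code from the cited papers.
import numpy as np

def feature_line_distance(x, p1, p2, eps=1e-12):
    """Distance from query x to the line through prototypes p1 and p2."""
    d = p2 - p1
    t = np.dot(x - p1, d) / (np.dot(d, d) + eps)   # projection parameter along the line
    proj = p1 + t * d                              # foot of the projection on the line
    return np.linalg.norm(x - proj)

x  = np.array([1.0, 2.0, 0.5])
p1 = np.array([0.0, 0.0, 0.0])
p2 = np.array([2.0, 2.0, 0.0])

print(feature_line_distance(x, p1, p2))            # P2S distance to the feature line
print(min(np.linalg.norm(x - p1),                  # compare with the nearest P2P distance
          np.linalg.norm(x - p2)))
```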
“…To perceive the relationships between a graph node and its neighbors, graph embedding is employed, which learns to represent graph nodes as n-dimensional vectors. Graph embedding is closely connected with representation-based classification methods (e.g., recent works such as LMRKNN [24], TPCRC [25], the novel DCRC method via l2 regularization [26], and MKFLE [27]). Inspired by representation learning, graph embedding methods in the graph domain (e.g., DeepWalk [28], node2vec [29], LINE [30], and SDNE [31]) and more specific methods such as those in [32] and [15] have been proposed, which accomplish information aggregation based on the adjacency relationships among nodes on graphs.…”
Section: Introduction
confidence: 99%
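To make the "aggregation over adjacent nodes" point in the last statement concrete, the sketch below performs one symmetrically normalized propagation step over a tiny toy graph. The adjacency matrix, features, and normalization choice are illustrative assumptions and are not tied to any particular method cited above.

```python
# Sketch: one step of neighborhood aggregation on a graph, using the
# symmetric normalization D^{-1/2} (A + I) D^{-1/2} H.
# The toy adjacency matrix and node features are illustrative assumptions.
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)      # 3-node path graph
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])                   # toy node features

A_hat = A + np.eye(A.shape[0])               # add self-loops
d = A_hat.sum(axis=1)                        # node degrees (with self-loops)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))

H_next = D_inv_sqrt @ A_hat @ D_inv_sqrt @ H # aggregated node representations
print(H_next)
```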