2019 IEEE International Conference on Data Mining (ICDM)
DOI: 10.1109/icdm.2019.00094

Learning Structured Twin-Incoherent Twin-Projective Latent Dictionary Pairs for Classification

Abstract: In this paper, we extend the popular dictionary pair learning (DPL) into the scenario of twin-projective latent flexible DPL under a structured twin-incoherence. Technically, a novel framework called Twin-Projective Latent Flexible DPL (TP-DPL) is proposed, which minimizes the twin-incoherence constrained flexibly-relaxed reconstruction error to avoid the possible over-fitting issue and produce accurate reconstruction. In this setting, our TP-DPL integrates the twin-incoherence based latent flexible DPL and th…
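For orientation, the standard DPL objective (Gu et al.) that TP-DPL builds on can be sketched as follows; the twin-incoherence, latent and flexible-relaxation terms added by TP-DPL are not reproduced in the truncated abstract, so only the base formulation is shown:

\min_{\{D_k, P_k\}} \sum_{k=1}^{K} \left( \| X_k - D_k P_k X_k \|_F^2 + \lambda \| P_k \bar{X}_k \|_F^2 \right) \quad \text{s.t. } \| d_i \|_2^2 \le 1,

where X_k collects the training samples of class k, \bar{X}_k denotes its complementary data, D_k is the synthesis sub-dictionary with atoms d_i, and P_k is the analysis projection that replaces costly sparse coding.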

Cited by 14 publications (7 citation statements) | References 39 publications | Citing statements published 2020–2023

Citation statements (ordered by relevance):
“…Note that IRPCA, LatLRR, rLRR and I-LSPFC are unsupervised models and cannot classify new data directly, so for a fair comparison we apply to them the same classifier training and classification process of Eqs. (38)-(39) as used for our J-RFDL. For KSVD, we similarly compute a classifier in a separate classifier training step, as in the LC-KSVD1 algorithm.…”
Section: A. Baseline and Setting
Citation type: mentioning
confidence: 99%
“…With the increasing complexity of contents, diversity of distributions and high dimensionality of real data, how to represent data efficiently for subsequent classification or clustering remains an important research topic [1]-[3], [9], [50]. To represent data, several feasible methods can be used, such as sparse representation (SR) by dictionary learning (DL) [4]-[8], low-rank coding [9], [10], [15], [38], [39] and matrix factorization [11], [12]. These are inspired by the fact that high-dimensional data can usually be characterized in a low-dimensional or compressed space, in which possible noise and redundant information are removed while the useful information and important structures are preserved.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
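The sparse-representation-by-dictionary-learning idea mentioned in the statement above can be illustrated in a few lines. This is a generic scikit-learn sketch, not any of the cited methods; the data, sizes and parameters are all illustrative:

import numpy as np
from sklearn.decomposition import DictionaryLearning

# Toy high-dimensional data: 200 samples with 64 features each
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 64))

# Learn a 32-atom dictionary; sparse codes are computed with OMP,
# keeping at most 5 active atoms per sample
dl = DictionaryLearning(n_components=32,
                        transform_algorithm="omp",
                        transform_n_nonzero_coefs=5,
                        random_state=0)
codes = dl.fit_transform(X)      # sparse codes, shape (200, 32)
X_hat = codes @ dl.components_   # reconstruction from the compressed space
print(np.count_nonzero(codes, axis=1).max())  # at most 5 nonzeros per row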
“…Eqn. (10) shows the objective function of our proposed class-specific sparse PCA algorithm for visual classification.…”
Section: A. Class-Specific Sparse PCA Algorithm
Citation type: mentioning
confidence: 99%
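The Eqn. (10) referenced above is not reproduced on this page. Purely for orientation, a generic ℓ1-penalized sparse PCA objective (not necessarily the authors' class-specific variant) has the form

\max_{w} \; w^{\top} S w - \rho \, \| w \|_1 \quad \text{s.t. } \| w \|_2 \le 1,

where S is the sample covariance matrix, w is a loading vector, and \rho trades off explained variance against sparsity; a class-specific variant would build S, or add constraints, per class.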
“…Different from FDDL, Gu et al. [8] proposed class-specific dictionary pair learning (DPL), which learns a synthesis dictionary and an analysis dictionary jointly. Several follow-up works, such as robust adaptive dictionary pair learning (RA-DPL) [9] and twin-projective latent dictionary pair learning (TP-DPL) [10], extend regular dictionary learning to dictionary pair learning. This kind of approach achieves both signal representation and discrimination by jointly learning a synthesis dictionary and an analysis dictionary.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
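With a learned dictionary pair, classification is typically done by the class-specific reconstruction residual: a test sample is assigned to the class whose pair reconstructs it best. A minimal numpy sketch, assuming the per-class pairs (D_k, P_k) have already been learned (the random pairs below are placeholders, not trained dictionaries):

import numpy as np

def dpl_classify(x, D_list, P_list):
    # label = argmin_k || x - D_k @ P_k @ x ||_2
    residuals = [np.linalg.norm(x - D @ (P @ x))
                 for D, P in zip(D_list, P_list)]
    return int(np.argmin(residuals))

# Illustrative usage with random stand-ins for learned pairs (3 classes)
rng = np.random.default_rng(0)
d, m, K = 64, 16, 3
D_list = [rng.standard_normal((d, m)) for _ in range(K)]
P_list = [rng.standard_normal((m, d)) for _ in range(K)]
x = rng.standard_normal(d)
print(dpl_classify(x, D_list, P_list))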
“…However, CS theory states that if x_j is sufficiently sparse in a transform (Ψ) domain, exact recovery is possible. Several recovery methods have been developed for this purpose [34], [35], [36], [37]. Specifically, if the transform coefficients v_j = Ψ x_j are sufficiently sparse, the solution of the recovery procedure can be found with several ℓ0 optimization procedures or their ℓ1-based convex relaxations that use pursuit-based methods [15], [16].…”
Section: Introduction
Citation type: mentioning
confidence: 99%
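The pursuit-based recovery step described above can be sketched with orthogonal matching pursuit, one standard greedy ℓ0-style solver (scikit-learn's implementation; the random sensing matrix and problem sizes are illustrative, not from the cited works):

import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 50, 200, 5   # measurements, signal length, sparsity level

# A k-sparse coefficient vector v observed through a random sensing matrix A
v = np.zeros(m)
support = rng.choice(m, size=k, replace=False)
v[support] = rng.standard_normal(k)
A = rng.standard_normal((n, m)) / np.sqrt(n)
y = A @ v

# OMP recovers v from n << m measurements when v is sparse enough
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(A, y)
print(np.linalg.norm(v - omp.coef_))  # near-zero recovery error w.h.p.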