Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/617

Latent Distribution Preserving Deep Subspace Clustering

Abstract: Subspace clustering is a useful technique for many computer vision applications in which the intrinsic dimension of high-dimensional data is smaller than the ambient dimension. Traditional subspace clustering methods often rely on the self-expressiveness property, which has proven effective for linear subspace clustering. However, they perform unsatisfactorily on real data with complex nonlinear subspaces. More recently, deep autoencoder based subspace clustering methods have achieved success owing to the mor…
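The self-expressiveness property the abstract refers to can be made concrete with a small sketch. The following is an illustrative linear version, not the paper's model: each sample is written as a combination of the other samples under a ridge penalty, which admits a closed-form solution, and the coefficient magnitudes serve as an affinity for spectral clustering. The function name and the lam value are placeholders.

```python
# Illustrative sketch of linear self-expressive subspace clustering (not the
# paper's model): solve min_C ||X - C X||_F^2 + lam ||C||_F^2, then run
# spectral clustering on the induced affinity.
import numpy as np
from sklearn.cluster import SpectralClustering

def self_expressive_clustering(X, n_clusters, lam=1e-2):
    # X: (N, D) array, one sample per row; lam is an illustrative ridge weight.
    G = X @ X.T                                       # Gram matrix, (N, N)
    C = np.linalg.solve(G + lam * np.eye(len(X)), G)  # C = (G + lam I)^{-1} G
    np.fill_diagonal(C, 0.0)                          # drop trivial self-loops
    W = np.abs(C) + np.abs(C).T                       # symmetric affinity
    sc = SpectralClustering(n_clusters=n_clusters, affinity="precomputed")
    return sc.fit_predict(W)
```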

Cited by 50 publications (36 citation statements) · References 15 publications
“…It is also noteworthy that the performance of DCFSC in this experiment was reported to be comparable with the 26.6% reported in the study [37], which combined more sophisticated methodologies such as self-supervised learning with DSC. Since DCFSC is easy to combine with the more advanced DSC models [41,37,40,39], there is the possibility of further enhancing these modified models with a deeper neural architecture.…”
Section: Results
confidence: 99%
“…The biggest contribution of [12] was that it was the first to design a self-expressive layer and a corresponding loss function that model the self-expressiveness property of data inside a deep auto-encoder. Since DSC showed strong performance on various benchmarks, many subsequent studies [41,37,40,39] have tried to improve DSC in several respects. Deep adversarial subspace clustering [41] exploited a GAN-like adversarial learning framework to supervise both sample representation learning and subspace clustering.…”
Section: Related Work
confidence: 99%
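For context, the self-expressive layer described in the quoted passage can be sketched as follows. This is a minimal, hedged reconstruction of the idea in [12], not the authors' code: a learnable N×N coefficient matrix C with zero diagonal re-expresses the latent codes as Z ≈ CZ, and the loss combines reconstruction, self-expression, and a regularizer on C. The whole dataset of N samples is one batch, as in the original formulation; layer sizes and the loss weights lam1, lam2 are illustrative.

```python
import torch
import torch.nn as nn

class DeepSubspaceClusteringNet(nn.Module):
    def __init__(self, input_dim, latent_dim, n_samples):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, latent_dim), nn.ReLU())
        self.decoder = nn.Linear(latent_dim, input_dim)
        # One learnable coefficient per sample pair: the self-expressive layer.
        self.C = nn.Parameter(1e-4 * torch.randn(n_samples, n_samples))

    def forward(self, x):
        z = self.encoder(x)                          # latent codes, (N, d)
        c = self.C - torch.diag(torch.diag(self.C))  # enforce diag(C) = 0
        z_se = c @ z                                 # self-expression Z ≈ CZ
        return self.decoder(z_se), z, z_se, c

def dsc_loss(x, x_rec, z, z_se, c, lam1=1.0, lam2=1.0):
    rec = ((x - x_rec) ** 2).sum()   # auto-encoder reconstruction term
    se = ((z - z_se) ** 2).sum()     # self-expression residual ||Z - CZ||^2
    reg = (c ** 2).sum()             # Frobenius regularizer on C
    return rec + lam1 * se + lam2 * reg
```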
“…2) Subspace Clustering in Feature Space: Methods of this type [17]–[19], [35] map the raw samples into a feature space to better capture the non-linear characteristics of the sample distribution, and then construct the affinity matrix in that feature space. For instance, kernel subspace clustering [17] implicitly maps the HSI samples from the original space into a kernelized space.…”
Section: B. Subspace Clustering
confidence: 99%
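The implicit feature-space mapping mentioned in the quote admits a compact closed form under a ridge-regularized self-expressive objective. The sketch below is a hedged illustration in the spirit of kernel subspace clustering [17], not the cited paper's exact algorithm; the RBF kernel choice and the gamma and lam values are assumptions.

```python
# Minimizing tr(K) - 2 tr(KC) + tr(C^T K C) + lam ||C||_F^2 over C, with
# K = Phi(X) Phi(X)^T, gives the closed form C = (K + lam I)^{-1} K.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def kernel_self_expression(X, gamma=1.0, lam=1e-2):
    K = rbf_kernel(X, gamma=gamma)                 # implicit non-linear map
    C = np.linalg.solve(K + lam * np.eye(K.shape[0]), K)
    np.fill_diagonal(C, 0.0)                       # forbid self-representation
    return np.abs(C) + np.abs(C).T                 # symmetric affinity matrix
```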
“…SSC [8], [9] or LRR [21]). Deep learning methods have been proposed to extend this principle to the case of non-linear subspaces [22], [14], [27].…”
Section: Reminder on the Key Notions
confidence: 99%
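Of the two classical methods named in the quote, LRR [21] has a particularly clean noiseless solution that complements the ridge sketch given earlier. The following is a hedged sketch of that known closed form, with rows of X as samples; the tolerance value is a placeholder.

```python
# Noiseless LRR closed form: for min ||C||_* s.t. X = C X, the solution is
# C = U_r U_r^T (the shape interaction matrix), where U_r holds the left
# singular vectors of X associated with its nonzero singular values.
import numpy as np

def lrr_closed_form(X, tol=1e-10):
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    U_r = U[:, s > tol]                 # keep the effective rank of X
    C = U_r @ U_r.T                     # shape interaction matrix
    return np.abs(C) + np.abs(C).T      # affinity for the spectral step
```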