2018
DOI: 10.1109/tip.2018.2848470
Structured AutoEncoders for Subspace Clustering

Abstract: Existing subspace clustering methods typically employ shallow models to estimate the underlying subspaces of unlabeled data points and cluster them into corresponding groups. However, due to the limited representative capacity of the employed shallow models, these methods may fail on realistic data that lacks a linear subspace structure. To address this issue, we propose a novel subspace clustering approach by introducing a new deep model, the Structured AutoEncoder (StructAE). The StructAE learns a set of expl…

Cited by 323 publications (136 citation statements)
References 39 publications
“…Several works [26,25,4] proposed a type of the methodology that representations learned by auto-encoder were forced to follow a specific conventional prior structure related with self-expression, e.g., Sparse Subspace Clustering (SSC) [7] and Low-rank Representation (LRR) [16]. [15] proposed deep-encoder based row space recovery methodology to make conventional low-rank subspace clustering scalable and fast.…”
Section: Related Work
confidence: 99%
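The self-expression prior quoted above assumes each data point can be reconstructed as a linear combination of the other points in its own subspace, so the coefficient matrix becomes approximately block-diagonal per cluster. A minimal sketch of this idea, using a ridge-regularized variant for brevity (SSC [7] and LRR [16] use l1 and nuclear-norm penalties instead; the function names are illustrative, not from the paper):

```python
import numpy as np

def self_expressive_coefficients(X, lam=0.1):
    """X: (d, n) data matrix, one point per column.
    Solves min_C ||X - X C||_F^2 + lam ||C||_F^2, then zeroes
    diag(C) as a common heuristic for the diag(C) = 0 constraint."""
    _, n = X.shape
    G = X.T @ X                                  # Gram matrix, (n, n)
    C = np.linalg.solve(G + lam * np.eye(n), G)  # ridge closed form
    np.fill_diagonal(C, 0.0)
    return C

def affinity(C):
    # Symmetrized affinity; spectral clustering on this matrix
    # yields the final subspace labels.
    return np.abs(C) + np.abs(C).T

# Toy demo: columns 0-1 lie on one 1-D subspace, columns 2-3 on an
# orthogonal one, so the affinity is block-diagonal.
X = np.array([[1., 2., 0., 0.],
              [0., 0., 1., 3.],
              [0., 0., 0., 0.]])
A = affinity(self_expressive_coefficients(X, lam=0.1))
```

Because the two toy subspaces are orthogonal, cross-subspace entries of `A` are exactly zero while within-subspace entries are positive; on real data one would feed `A` to spectral clustering.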
“…It provides an explicit non-linear mapping for the complex input data that is well-adapted to the subspace clustering, which yields significant improvement over the state-of-the-art subspace clustering solutions. A structured autoencoder in [34] introduces a global structure prior into the non-linear mapping. These deep subspace approaches mainly focus on the clustering or recognition problems, in which the network weights are well trained to exploit the similar information among the input data, instead of the differential information used for change detection task.…”
Section: Deep Subspace Clustering
confidence: 99%
“…As the popularity of neural networks in recent years [6], [9], [35], [47], we equip the LtDaHP scheme with an FNN-instance to show its outperformance. Taking inner weights as minimal Riesz energy points on a sphere and thresholds as equally spaced points (ESPs) in an interval, we can define an FNNrealization of LtDaHP.…”
Section: Introduction
confidence: 99%