Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/510
Spectral Perturbation Meets Incomplete Multi-view Data

Abstract: Beyond existing multi-view clustering, this paper studies a more realistic clustering scenario, referred to as incomplete multi-view clustering, where a number of data instances are missing in certain views. To tackle this problem, we explore spectral perturbation theory. In this work, we show a strong link between perturbation risk bounds and incomplete multi-view clustering. That is, as the similarity matrix fed into spectral clustering is a quantity bounded in magnitude O(1), we transfer the missing problem…
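The abstract's link between bounded similarity matrices and perturbation risk can be made concrete with a small numeric illustration. The sketch below is not the paper's algorithm; it only computes the standard Davis-Kahan-style quantity ||L - L~||_2 / eigengap that spectral perturbation arguments rely on: because similarity entries are bounded, a bounded change to the similarity matrix gives a bounded change to the normalized Laplacian, and hence a bounded rotation of the spectral embedding used for clustering. The toy data and all variable names are illustrative assumptions.

```python
import numpy as np

def normalized_laplacian(S):
    """Symmetric normalized Laplacian L = I - D^{-1/2} S D^{-1/2}."""
    d = np.maximum(S.sum(axis=1), 1e-12)
    d_is = 1.0 / np.sqrt(d)
    return np.eye(len(S)) - d_is[:, None] * S * d_is[None, :]

rng = np.random.default_rng(0)
n, k = 60, 3
S = rng.uniform(size=(n, n)); S = (S + S.T) / 2       # similarity entries bounded in [0, 1]
E = 0.05 * rng.standard_normal((n, n)); E = (E + E.T) / 2
S_pert = np.clip(S + E, 0.0, 1.0)                     # a small, still-bounded perturbation

L, L_pert = normalized_laplacian(S), normalized_laplacian(S_pert)
evals = np.linalg.eigvalsh(L)                         # eigenvalues in ascending order
eigengap = evals[k] - evals[k - 1]                    # gap after the k smallest eigenvalues
ratio = np.linalg.norm(L - L_pert, 2) / eigengap      # Davis-Kahan-style proxy for embedding rotation
print(f"||L - L~||_2 = {np.linalg.norm(L - L_pert, 2):.3f}, "
      f"eigengap = {eigengap:.3f}, ratio = {ratio:.3f}")
```

The printed ratio is only an upper-bound proxy; the paper's actual risk bound and consensus-learning procedure are not reproduced here.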

Cited by 82 publications (36 citation statements)
References 4 publications
“…We compare IMVC‐SBD with recently published incomplete multi‐view clustering baselines. Incomplete Multi‐view Clustering via Graph Regularised Matrix Factorisation (IMC‐GRMF) [54] learns a common representation through matrix factorisation and exploits the locality structure via a nearest‐neighbour graph. One‐Pass Incomplete Multi‐view Clustering (OPIMC) [53] is also a matrix factorisation method, which reduces the computation and storage complexity for large‐scale incomplete multi‐view data. Unified Embedding Alignment with Missing Views (UEAF) [55] introduces a locality‐preserved reconstruction term to infer the missing views and develops a reverse graph to guarantee the consensus of the local structure across views. Spectral Perturbation Meets Incomplete Multi‐view Data (PIC) [58] learns a consensus Laplacian matrix from the incomplete multi‐view data according to perturbation theory. …”
Section: Methods
Mentioning confidence: 99%
“…In the experiments, we compare the proposed MSCIG with six state-of-the-art methods: Multiple Incomplete views Clustering (MIC) [14], Multi-view Learning with Incomplete Views (MVL-IV) [16], Incomplete Multi-modality Grouping (IMG) [9], Doubly Aligned Incomplete Multi-view Clustering (DAIMC) [15], Incomplete Multiple Kernel K-means Algorithm with Mutual Kernel Completion (IMKK-MKC) [19], and Perturbation-oriented Incomplete multi-view Clustering (PIC) [23]. Since IMG is originally designed for incomplete two-view data, we extend it following [22], so that the extended version can deal with data with arbitrarily incomplete views.…”
Section: Baselines and Experimental Environment
Mentioning confidence: 99%
“…The work in [22] performs representation learning and clustering simultaneously with multiple incomplete similarity matrices. The approach proposed in [23] first fills in the missing entries of the similarity matrices with the average of the corresponding observed entries, and then learns view weights to combine them into a common graph matrix.…”
Section: Introduction
Mentioning confidence: 99%
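The fill-and-combine procedure that this statement attributes to PIC [23] can be sketched in a few lines. The snippet below is one reading of that description, not the authors' code: per-view similarity matrices use NaN for entries involving an instance missing in that view, missing entries are filled with the average of the corresponding entries observed in the other views, and the filled matrices are averaged into a common graph with fixed (rather than learned) weights; the weight-learning step of the actual method is omitted.

```python
import numpy as np

def fill_missing_with_view_average(similarities):
    """similarities: list of (n, n) arrays, NaN where an instance is missing in that view."""
    stack = np.stack(similarities)                   # shape (V, n, n)
    observed_mean = np.nanmean(stack, axis=0)        # entry-wise average over views that observe it
    return [np.where(np.isnan(S), observed_mean, S) for S in stack]

def combine_views(filled, weights=None):
    """Weighted average of the filled similarity matrices into one common graph."""
    V = len(filled)
    weights = np.full(V, 1.0 / V) if weights is None else np.asarray(weights, float)
    return sum(w * S for w, S in zip(weights, filled))

# Toy usage: two views over 4 instances, instance 3 missing in view 2.
rng = np.random.default_rng(1)
S1 = rng.uniform(size=(4, 4)); S1 = (S1 + S1.T) / 2
S2 = rng.uniform(size=(4, 4)); S2 = (S2 + S2.T) / 2
S2[3, :] = S2[:, 3] = np.nan
common = combine_views(fill_missing_with_view_average([S1, S2]))
```

An entry missing in every view stays NaN under this scheme and would need separate handling, which the quoted description does not cover.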
“…These methods commonly impose some preconstructed view-specific diagonal matrices on the matrix factorization terms of all views to reduce the negative influence of missing views. In addition to these matrix factorization based IMC methods, multiple kernels based clustering methods [15-17, 21, 22, 37] and graph learning based methods [24,26,27,41] are also extended to the IMC cases. Multiple kernels based IMC methods commonly seek to recover the rows and columns of kernels corresponding to those absent views and then learn the consensus representation of all views from these recovered kernels.…”
Section: Introduction
Mentioning confidence: 99%
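As a schematic illustration of the diagonal-matrix device described above (a generic sketch, not the formulation of any particular paper cited here): each view gets a preconstructed diagonal indicator matrix whose diagonal is 1 for instances present in that view and 0 for missing ones, and multiplying it into the view's factorisation residual removes the contribution of missing instances from the loss.

```python
import numpy as np

def masked_factorization_loss(X_views, present_masks, U_views, V):
    """
    X_views:       list of (d_v, n) data matrices, arbitrary values in missing columns
    present_masks: list of length-n 0/1 vectors, 1 if the instance exists in that view
    U_views:       list of (d_v, k) view-specific basis matrices
    V:             (k, n) common representation shared across views
    """
    loss = 0.0
    for X, m, U in zip(X_views, present_masks, U_views):
        W = np.diag(m.astype(float))      # preconstructed view-specific diagonal indicator matrix
        R = (X - U @ V) @ W               # zero out residual columns of missing instances
        loss += np.linalg.norm(R, "fro") ** 2
    return loss
```

In the cited methods this masking term is combined with regularisers and optimised alternately over the view-specific bases and the shared representation; the sketch only isolates how the diagonal matrices suppress missing views.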