CVPR 2011
DOI: 10.1109/cvpr.2011.5995740

Heterogeneous image feature integration via multi-modal spectral clustering

Cited by 208 publications (139 citation statements)
References 16 publications
“…Many previous works have studied how to design a high-quality similarity matrix (SM), such as [Cai et al., 2005]. Later, [Nie et al., 2014; Chen and Dy, 2016] proposed an ideal SM S ∈ R^{n×n} and introduced the CLR method.…”
Section: Graph-based Clustering Description
confidence: 99%
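The "SM" in the excerpt above is the similarity matrix that graph-based clustering operates on. As a rough illustration of that pipeline only, and not of the CLR method cited there, here is a minimal spectral-clustering sketch assuming a symmetric, non-negative similarity matrix S; the feature matrix, Gaussian bandwidth, and cluster count are all hypothetical.

```python
# Minimal sketch: spectral clustering from a precomputed similarity matrix S.
# This is standard normalized spectral clustering, not the CLR method cited above.
import numpy as np
from sklearn.cluster import KMeans

def spectral_clustering_from_similarity(S, n_clusters):
    """Cluster n samples given a symmetric, non-negative n x n similarity matrix S."""
    # Symmetric normalized Laplacian: L = I - D^{-1/2} S D^{-1/2}
    d = S.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L = np.eye(S.shape[0]) - d_inv_sqrt[:, None] * S * d_inv_sqrt[None, :]

    # The eigenvectors of L with the smallest eigenvalues carry the cluster structure.
    _, eigvecs = np.linalg.eigh(L)
    U = eigvecs[:, :n_clusters]

    # Row-normalize the spectral embedding, then run k-means on it.
    U = U / np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(U)

# Usage: S could come from a Gaussian kernel on image features, for example.
X = np.random.rand(100, 20)                        # hypothetical feature matrix
dists = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
S = np.exp(-dists**2 / (2 * dists.mean()**2))      # Gaussian similarity (assumed bandwidth)
labels = spectral_clustering_from_similarity(S, n_clusters=3)
```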
“…A summary of these datasets is shown in Table 1. The proposed RAMC is compared with the following methods: Co-regularized spectral clustering [Kumar et al., 2011] (Co-reg), Multi-View Spectral Clustering [Cai et al., 2011] (MVSC), Robust Multi-view Spectral Clustering [Xia et al., 2014] (RMSC), Parameter-weighted Multi-view Clustering (PwMC), and Self-weighted Multi-view Clustering (SwMC). The result of CLR is also compared as a baseline in the experiment.…”
Section: Clustering Effects On Different Real Datasets
confidence: 99%
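The methods listed in the excerpt above (Co-reg, MVSC, RMSC, PwMC, SwMC) all fuse information across views before or during spectral clustering. The sketch below shows only the simplest fusion baseline, not any of those specific algorithms: build one affinity matrix per view, average them with uniform weights, and run off-the-shelf spectral clustering on the fused graph. The RBF kernel, uniform weights, and toy modalities are assumptions.

```python
# Minimal sketch of a naive multi-view baseline, assuming each view provides its
# own affinity matrix: average the per-view graphs, then run spectral clustering
# on the fused graph. Learned or adaptive weights (as in PwMC/SwMC) are omitted.
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics.pairwise import rbf_kernel

def multi_view_average_clustering(views, n_clusters):
    """views: list of (n_samples, d_v) feature matrices, one per modality."""
    # One RBF affinity matrix per view, fused by uniform averaging.
    affinities = [rbf_kernel(X_v) for X_v in views]
    S_fused = np.mean(affinities, axis=0)
    model = SpectralClustering(n_clusters=n_clusters, affinity="precomputed",
                               assign_labels="kmeans", n_init=10)
    return model.fit_predict(S_fused)

# Usage with two hypothetical image modalities (e.g., color and texture features).
n = 150
view_color = np.random.rand(n, 32)
view_texture = np.random.rand(n, 64)
labels = multi_view_average_clustering([view_color, view_texture], n_clusters=5)
```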
“…MVL seeks to employ multiple “independent” clues (e.g., bi-lingual information, different modalities); MKL, however, combines multiple base kernels to create a “unified” kernel for learning, where these kernels are not necessarily independent like the “views” in MVL. Similarly, both were originally studied for (semi-)supervised learning and have recently been extended to the unsupervised setting [33,6,14,15,17,16]. Multi-view clustering [20,34,15,16], as an extension of MVL, inherently assumes that the views are uncorrelated.…”
Section: Related Work
confidence: 99%
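The excerpt above contrasts MVL, which keeps views separate, with MKL, which combines base kernels into one “unified” kernel. The following sketch illustrates only the fixed-weight kernel-combination idea, not a full MKL solver that learns the weights jointly with the classifier; the choice of base kernels, weights, and toy data are assumptions.

```python
# Minimal sketch of the kernel-combination idea behind MKL: a weighted sum of
# base kernels is fed to a kernel method as a single precomputed Gram matrix.
# Real MKL learns the weights; here they are fixed by hand for illustration.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel, polynomial_kernel
from sklearn.svm import SVC

X_train = np.random.rand(80, 10)                 # hypothetical training features
y_train = np.random.randint(0, 2, size=80)       # hypothetical binary labels
X_test = np.random.rand(20, 10)

def combined_kernel(A, B, weights=(0.5, 0.3, 0.2)):
    """Weighted sum of base kernels evaluated between sample sets A and B."""
    bases = [rbf_kernel(A, B), linear_kernel(A, B), polynomial_kernel(A, B, degree=2)]
    return sum(w * K for w, K in zip(weights, bases))

# SVC with a precomputed kernel expects Gram matrices instead of raw features.
clf = SVC(kernel="precomputed")
clf.fit(combined_kernel(X_train, X_train), y_train)
preds = clf.predict(combined_kernel(X_test, X_train))
```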
“…Finally, they showed that their proposed semi-supervised multi-view learning achieves a substantial improvement in classification performance over existing methods. In transfer multi-view learning, the literature (e.g., [4,38]) leveraged the consistency of the views and considered the domain differences among the views to learn from heterogeneous data.…”
Section: Multi-view Learning
confidence: 99%
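The excerpt above refers to semi-supervised multi-view learning that exploits consistency between views. A classic, simplified way to use that consistency is co-training, sketched below purely as an illustration and not as the specific method the citation refers to; the classifier choice, per-round pseudo-label budget, and toy data are assumptions.

```python
# Minimal co-training sketch, assuming two views of the same samples: in each
# round, a classifier trained on one view adds its most confident pseudo-labels
# to the shared labeled pool, which the other view's classifier then reuses.
import numpy as np
from sklearn.linear_model import LogisticRegression

def co_training(view_a, view_b, y, n_rounds=5, per_round=10):
    """y uses -1 for unlabeled samples; view_a/view_b are (n_samples, d) matrices."""
    y = y.copy()
    for _ in range(n_rounds):
        for X_view in (view_a, view_b):
            labeled = y != -1
            unlabeled = np.where(~labeled)[0]
            if unlabeled.size == 0:
                return y
            clf = LogisticRegression(max_iter=1000).fit(X_view[labeled], y[labeled])
            proba = clf.predict_proba(X_view[unlabeled])
            most_confident = unlabeled[np.argsort(-proba.max(axis=1))[:per_round]]
            # Pseudo-labels from this view enter the pool used by the other view.
            y[most_confident] = clf.predict(X_view[most_confident])
    return y

# Usage with two hypothetical views and 20 labeled out of 200 samples.
rng = np.random.default_rng(0)
Xa, Xb = rng.random((200, 16)), rng.random((200, 8))
y = np.full(200, -1)
y[:10], y[10:20] = 0, 1        # a few labeled samples from each class
y_expanded = co_training(Xa, Xb, y)
```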