DOI: 10.1007/978-3-540-87479-9_27

Semi-supervised Laplacian Regularization of Kernel Canonical Correlation Analysis

Abstract: Kernel canonical correlation analysis (KCCA) is a dimensionality reduction technique for paired data. By finding directions that maximize correlation, KCCA learns representations that are tied more closely to the underlying semantics of the data than to noise. However, meaningful directions are not only those that have high correlation with another modality, but also those that capture the manifold structure of the data. We propose a method that is simultaneously able to find highly correlated directions tha…
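The abstract's idea can be sketched as a generalized eigenproblem: standard KCCA maximizes correlation between two kernelized views, and a graph-Laplacian penalty is added so that the learned directions also respect the manifold structure of each view. The sketch below is a minimal illustration under assumptions: the regularizer form `Kk Kk + eps*Kk + gam*Kk Lk Kk` and the helper names (`rbf_kernel`, `knn_laplacian`, `laplacian_kcca`) are illustrative, not the paper's exact formulation.

```python
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(X, gamma=1.0):
    """Centered RBF (Gaussian) kernel matrix."""
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = len(X)
    H = np.eye(n) - np.full((n, n), 1.0 / n)
    return H @ K @ H  # center in feature space

def knn_laplacian(K, k=5):
    """Unnormalized graph Laplacian of a kNN graph built from kernel similarities."""
    n = K.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(-K[i])[1:k + 1]  # k most similar points, skipping self
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)                 # symmetrize the adjacency
    return np.diag(W.sum(axis=1)) - W

def laplacian_kcca(Kx, Ky, Lx, Ly, eps=1e-2, gam=1e-2):
    """First canonical pair of a Laplacian-regularized KCCA (sketch).

    Solves  [0, Kx Ky; Ky Kx, 0] v = rho * blkdiag(Rx, Ry) v
    with Rk = Kk Kk + eps*Kk + gam*Kk Lk Kk — an assumed regularizer
    form combining the usual KCCA ridge term with a manifold penalty.
    """
    n = Kx.shape[0]
    Z = np.zeros((n, n))
    Rx = Kx @ Kx + eps * Kx + gam * Kx @ Lx @ Kx
    Ry = Ky @ Ky + eps * Ky + gam * Ky @ Ly @ Ky
    A = np.block([[Z, Kx @ Ky], [Ky @ Kx, Z]])
    B = np.block([[Rx, Z], [Z, Ry]]) + 1e-6 * np.eye(2 * n)  # jitter for definiteness
    vals, vecs = eigh(A, B)  # generalized symmetric eigenproblem
    return vals[-1], vecs[:n, -1], vecs[n:, -1]
```

Setting `gam=0` recovers plain regularized KCCA; increasing `gam` trades raw cross-view correlation for smoothness of the projections along each view's neighborhood graph, which is what lets unlabeled (unpaired) structure influence the solution.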

Cited by 50 publications (41 citation statements)
References 14 publications (29 reference statements)
“…The above problem is a kind of temporally regularized orthogonal CCA. Temporal regularisation is probably the reason that the proposed approach outperforms CTW (which does not employ any temporal regularisation). Even though Laplacian regularization of component analysis techniques has recently been significantly studied [7], Laplacian regularization for CCA models has not received much attention [5]. To the best of our knowledge, this is the first component analysis methodology which can …”
Section: Theoretical Interpretation 2.5.1 Relationship To Canonical C…
confidence: 99%
“…3) Graph-based: some recent advances adopt Gaussian Processes [21,17] or Markov Random Walks [2]. Transduction by Laplacian graph [4,10] has also been shown to solve multi-class semi-supervised problems; although these algorithms make use of the relationship between unlabeled and labeled data, their computational complexity is demanding, e.g. O(n³).…”
Section: Related Work
confidence: 99%
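The O(n³) cost noted in the quote above comes from the dense linear solve at the heart of graph-Laplacian transduction. A minimal label-propagation sketch of that family of methods (the function name and parameters are illustrative, not taken from the cited papers):

```python
import numpy as np

def propagate_labels(W, y, labeled, alpha=0.9):
    """Graph-Laplacian transduction sketch: spread labels over an affinity graph.

    W       : symmetric (n, n) affinity/adjacency matrix
    y       : integer class labels (only entries at `labeled` indices are used)
    labeled : indices of the labeled points
    The dense solve below is the O(n^3) step mentioned in the quoted passage.
    """
    n = W.shape[0]
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ W @ D_inv_sqrt          # symmetrically normalized affinity
    Y = np.zeros((n, int(y.max()) + 1))
    Y[labeled, y[labeled]] = 1.0             # clamp the known labels
    F = np.linalg.solve(np.eye(n) - alpha * S, Y)  # O(n^3) dense solve
    return F.argmax(axis=1)                  # predicted class per node
```

The closed-form solve makes the cubic complexity explicit; in practice large-scale variants replace it with sparse iterative solvers, which is precisely the scalability concern the quoted passage raises.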
“…In another work, Blaschko et al [Blaschko and Lampert 2008] use CCA for clustering images using the associated text as a second view. Both of these works assume that the views are complete, unlike the setting we considered in this paper.…”
Section: Related Work
confidence: 99%
“…For example, one can take a probabilistic approach to CCA [Rai and Daumé III 2009] and treat the missing tags for non-tagged webpages as latent variables. In the non-probabilistic setting, one can use the semi-supervised variants of CCA [Blaschko et al 2008; Kim and Pavlovic 2009] which do not require full information from both the views. Alternatively, a somewhat similar way of accomplishing this would be to write a combined eigenvalue problem with one part of it being CCA on the tagged webpages, and the other being LSA on the non-tagged webpages.…”
Section: Other Ways To Deal With Partially Tagged Corpus
confidence: 99%