2014 IEEE International Conference on Data Mining
DOI: 10.1109/icdm.2014.29

Low-Rank Common Subspace for Multi-view Learning

Abstract: Multi-view data are common in real-world applications, as different viewpoints and various types of sensors help to better represent data when fused across views or modalities. Samples from different views of the same class are often less similar than samples from the same view but different classes. We consider a more general setting in which the view information of the testing data is inaccessible in multi-view learning. Traditional multi-view learning algorithms were designed to obtain multiple view-specific linear projec…

Cited by 117 publications (90 citation statements)
References 32 publications (66 reference statements)

“…By choosing an appropriate dictionary A, LRR can recover the underlying row space so as to reveal the true segmentation of the data. Recently, low-rank representation has been incorporated with subspace learning, e.g., Low-rank Transfer Subspace Learning (LTSL) [22], Supervised Regularization based Robust Subspace (SRRS) [18], Low-Rank Common Subspace (LRCS) [23] and Low-Rank Discriminate Embedding (LRDE) [19], all aiming to find a more robust subspace under a low-rank constraint.…”
Section: Low-rank Representation (mentioning)
confidence: 99%
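
For context, the LRR program that these subspace methods build on can be sketched in its standard form below; the notation (X for the data matrix, A for the dictionary, Z for the low-rank representation, E for noise) follows the general LRR literature and is assumed here rather than quoted from any of the cited papers:

\min_{Z,E}\ \|Z\|_{*} + \lambda\|E\|_{2,1} \quad \text{s.t.}\quad X = AZ + E

In the noiseless case with A = X, the optimal Z recovers the row space of the data (the shape interaction matrix), which is why choosing an appropriate dictionary reveals the true segmentation.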
“…To optimize the variables in Eq. (15), we use the Alternating Direction Method of Multipliers (ADMM) [29], since previous work [18], [23] has demonstrated that ADMM works well on similar problems. Using ADMM, we can alternately update the variables one by one in an iterative fashion.…”
Section: Problem Optimization (mentioning)
confidence: 99%
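
Eq. (15) is not reproduced in this excerpt, so as a stand-in the sketch below applies ADMM to a standard LRR-style problem, min ||J||_* + lam*||E||_1 s.t. X = XZ + E, Z = J. It is a minimal illustrative NumPy implementation under those assumptions, not the cited authors' code, and all names (lrr_admm, lam, mu, rho) are hypothetical:

import numpy as np

def svt(M, tau):
    # Singular value thresholding: proximal operator of tau * nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ (np.maximum(s - tau, 0.0)[:, None] * Vt)

def soft(M, tau):
    # Entrywise soft thresholding: proximal operator of tau * l1 norm.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def lrr_admm(X, lam=0.1, mu=1e-2, rho=1.5, mu_max=1e6, n_iter=200):
    # Sketch of  min ||J||_* + lam*||E||_1  s.t.  X = X Z + E,  Z = J
    # solved by inexact ALM / ADMM with an auxiliary variable J.
    d, n = X.shape
    Z = np.zeros((n, n)); J = np.zeros((n, n)); E = np.zeros((d, n))
    Y1 = np.zeros((d, n)); Y2 = np.zeros((n, n))  # Lagrange multipliers
    XtX, I = X.T @ X, np.eye(n)
    for _ in range(n_iter):
        J = svt(Z + Y2 / mu, 1.0 / mu)             # nuclear-norm prox
        Z = np.linalg.solve(XtX + I,               # closed-form least-squares step
                            X.T @ (X - E) + J + (X.T @ Y1 - Y2) / mu)
        E = soft(X - X @ Z + Y1 / mu, lam / mu)    # l1 prox
        Y1 += mu * (X - X @ Z - E)                 # dual ascent
        Y2 += mu * (Z - J)
        mu = min(rho * mu, mu_max)                 # penalty schedule
    return Z, E

Each step is the exact minimizer of the augmented Lagrangian in one block of variables with the others fixed, which is the "alternately update variables one by one" pattern the quote describes.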
“…We remove the other parts irrelevant to D, as in Eq. (5.24), where the rank minimization problem is replaced by nuclear norm minimization [66,135,78]. Problem (5.24) can then be effectively solved by singular value thresholding (SVT)…”
Section: Updating D (mentioning)
confidence: 99%
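
For reference, SVT is the proximal operator of the nuclear norm: compute the SVD M = U Σ V^T, shrink the singular values, and reconstruct. In the usual notation (assumed here, following the standard SVT literature rather than the cited chapter):

\mathcal{D}_{\tau}(M) = U\,\mathcal{S}_{\tau}(\Sigma)\,V^{\top}, \qquad \mathcal{S}_{\tau}(\Sigma)_{ii} = \max(\Sigma_{ii} - \tau,\ 0)

and this operator solves \mathcal{D}_{\tau}(M) = \arg\min_{D}\ \tau\|D\|_{*} + \tfrac{1}{2}\|D - M\|_{F}^{2}, which is exactly the subproblem left after the nuclear-norm relaxation.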
“…To effectively capture the common information shared by them, we propose to adopt low-rank sparse decomposition to further decompose B and B̃ [66,134,135,78]. As shown in Figure 5.1, we assume the two rating patterns B and B̃ share a common pattern D while preserving domain-specific patterns E and Ẽ [136].…”
Section: Low-rank Sparse Collective Factorization (LSCF) (mentioning)
confidence: 99%
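
The LSCF objective itself is not shown in this excerpt; as an illustration of the low-rank plus sparse decomposition it invokes, the sketch below extracts a shared low-rank pattern D and domain-specific sparse parts E and Ẽ from two equally-sized matrices via ADMM-style proximal steps. This is a hypothetical sketch of the generic technique (function and parameter names are assumptions), not the LSCF algorithm:

import numpy as np

def svt(M, tau):
    # Proximal operator of tau * nuclear norm (shrink singular values).
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ (np.maximum(s - tau, 0.0)[:, None] * Vt)

def soft(M, tau):
    # Proximal operator of tau * l1 norm (entrywise shrinkage).
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def shared_lowrank_sparse(B, B2, lam=0.1, mu=1e-2, rho=1.5, n_iter=200):
    # Sketch of  min ||D||_* + lam*(||E||_1 + ||E2||_1)
    #            s.t. B = D + E,  B2 = D + E2   (B and B2 must share a shape).
    D, E, E2 = np.zeros_like(B), np.zeros_like(B), np.zeros_like(B2)
    Y1, Y2 = np.zeros_like(B), np.zeros_like(B2)  # Lagrange multipliers
    for _ in range(n_iter):
        # D appears in both constraints: average the two targets, then
        # shrink singular values (prox of (1/(2*mu)) * nuclear norm).
        M = 0.5 * ((B - E + Y1 / mu) + (B2 - E2 + Y2 / mu))
        D = svt(M, 1.0 / (2.0 * mu))
        E = soft(B - D + Y1 / mu, lam / mu)    # domain-specific sparse part
        E2 = soft(B2 - D + Y2 / mu, lam / mu)
        Y1 += mu * (B - D - E)                 # dual ascent
        Y2 += mu * (B2 - D - E2)
        mu = min(rho * mu, 1e6)
    return D, E, E2

Because D enters both equality constraints, its update couples the two domains, which is the mechanism by which a shared pattern is captured while E and Ẽ absorb what is domain-specific.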