2015
DOI: 10.1016/j.neucom.2014.08.106

Multi-view multi-sparsity kernel reconstruction for multi-class image classification

Abstract: This paper addresses the problem of multi-class image classification by proposing a novel multi-view multi-sparsity kernel reconstruction (MMKR for short) model. Given images (including test images and training images) represented by multiple visual features, the MMKR first maps them into a high-dimensional space, e.g., a reproducing kernel Hilbert space (RKHS), where test images are then linearly reconstructed from some representative training images, rather than all of them. Furthermore, a classification rul…
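The reconstruction idea in the abstract can be illustrated with a minimal, single-view sketch of a kernel sparse-representation classifier: map samples into an RKHS via an RBF kernel, reconstruct a test sample from a sparse subset of training samples (an l1 penalty, solved here with ISTA), and assign the class whose coefficients yield the smallest reconstruction residual. This is an assumption-laden illustration, not the paper's MMKR model, which additionally fuses multiple views under a mixed-sparsity penalty; all function names and parameters below are hypothetical.

```python
# Minimal single-view sketch of classification by sparse kernel
# reconstruction. Assumes an RBF kernel and an l1 penalty solved
# by ISTA; the actual MMKR combines several views. Names are illustrative.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # K[i, j] = exp(-gamma * ||A_i - B_j||^2)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_sparse_code(K, k_x, lam=0.1, n_iter=500):
    """Minimize c^T K c - 2 k_x^T c + lam * ||c||_1 via ISTA."""
    L = 2.0 * np.linalg.eigvalsh(K)[-1] + 1e-12  # Lipschitz constant of the gradient
    c = np.zeros(len(k_x))
    for _ in range(n_iter):
        grad = 2.0 * (K @ c - k_x)
        z = c - grad / L
        c = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return c

def classify(X_train, y_train, x_test, gamma=1.0, lam=0.1):
    K = rbf_kernel(X_train, X_train, gamma)
    k_x = rbf_kernel(X_train, x_test[None, :], gamma).ravel()
    c = kernel_sparse_code(K, k_x, lam)
    # Residual of reconstructing phi(x) with one class's coefficients only:
    # ||phi(x) - Phi c_j||^2 = k(x, x) - 2 k_x^T c_j + c_j^T K c_j
    best, best_res = None, np.inf
    for label in np.unique(y_train):
        cj = np.where(y_train == label, c, 0.0)
        res = 1.0 - 2.0 * k_x @ cj + cj @ K @ cj  # k(x, x) = 1 for RBF
        if res < best_res:
            best, best_res = label, res
    return best
```

The class-wise residual test is what makes the reconstruction a classifier: the sparse code concentrates on training images of the correct class, so zeroing out the other classes' coefficients barely degrades the reconstruction for that class alone.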

Cited by 16 publications (2 citation statements)
References 36 publications
“…To process multi-view multi-label data sets, corresponding learning methods have been developed, including the multi-view multi-sparsity kernel reconstruction (MMKR for short) model [13] and multi-feature fusion based on supervised multi-view multi-label canonical correlation projection (sM2CP) [14]. However, they share a common issue: how to process data sets with incomplete information and multi-granularity label correlation when label-specific features and complementary information are provided.…”
Section: Common Issue (mentioning)
confidence: 99%
“…To enable the coefficients of data in the same space to be highly correlated, we apply the low-rank constraint to capture the global structure of the whole data. In addition, the low-rank structure can mitigate the impact of noise and make the regression more accurate [37,38]. In order to account for the low-rank structure of W, we need to make:…”
Section: Low-Rank Hypergraph Hashing (mentioning)
confidence: 99%
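For context on the quoted low-rank constraint on W: a standard way to impose it is nuclear-norm regularization, whose proximal operator is singular value thresholding. The sketch below is a generic proximal-gradient illustration under that assumption, not code from the cited paper; X, Y, and all parameters are hypothetical.

```python
# Hypothetical sketch: nuclear-norm proximal operator (singular value
# thresholding) used inside proximal gradient descent to obtain a
# low-rank regression matrix W, as the quote above describes.
import numpy as np

def svt(W, tau):
    """Prox of tau * ||.||_* : shrink the singular values of W by tau."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

# Example: min_W ||X W - Y||_F^2 + tau * ||W||_* on synthetic data.
rng = np.random.default_rng(0)
X, Y = rng.normal(size=(50, 10)), rng.normal(size=(50, 4))
W, tau = np.zeros((10, 4)), 0.5
step = 0.5 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
for _ in range(200):
    W = svt(W - step * 2.0 * X.T @ (X @ W - Y), step * tau)
print(np.linalg.matrix_rank(W, tol=1e-6))  # rank after nuclear-norm shrinkage
```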