2013 IEEE Conference on Computer Vision and Pattern Recognition
DOI: 10.1109/cvpr.2013.233
Kernel Learning for Extrinsic Classification of Manifold Features

Cited by 85 publications (78 citation statements). References: 24 publications.
“…For example, the set of all reflectance functions produced by Lambertian objects lies in a linear subspace (Basri and Jacobs 2003; Ramamoorthi 2002). Several state-of-the-art methods for matching videos or image sets model the given data by subspaces (Hamm and Lee 2008; Harandi et al 2011; Turaga et al 2011; Vemulapalli et al 2013; Sanderson et al 2012; Chen et al 2013). Autoregressive and moving average models, which are typically employed to model dynamics in spatio-temporal processing, can also be expressed by linear subspaces (Turaga et al 2011).…”
Section: Introduction
confidence: 98%
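The subspace modelling mentioned in the quote above can be sketched concretely. The snippet below is not from the cited paper; the set size, image dimension, and subspace rank are illustrative assumptions. It maps an image set to an orthonormal basis via SVD, i.e., to a point on a Grassmann manifold.

```python
import numpy as np

def image_set_to_subspace(images, rank=5):
    """Orthonormal basis of the dominant subspace of an image set.

    images : (n_images, n_pixels) array, one vectorized image per row.
    rank   : subspace dimension (a modelling choice, assumed here).
    """
    X = np.asarray(images, dtype=float).T            # (n_pixels, n_images)
    U, _, _ = np.linalg.svd(X, full_matrices=False)  # left singular vectors
    return U[:, :rank]                               # point on Grassmann(rank, n_pixels)

# Two toy "image sets" (random data standing in for vectorized frames).
rng = np.random.default_rng(0)
Y1 = image_set_to_subspace(rng.normal(size=(20, 400)))
Y2 = image_set_to_subspace(rng.normal(size=(30, 400)))
```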
“…Nevertheless, choosing an appropriate kernel can be challenging; in fact, several works have been devoted to addressing this issue [76,148]. When the kernel is poorly chosen, it leads to inferior performance [148].…”
Section: Classification on Riemannian Manifolds
confidence: 99%
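To make the kernel-choice issue concrete, the sketch below lists two positive-definite kernels commonly defined on the Grassmann manifold (following Hamm and Lee 2008, cited above); deciding which one suits a given task is exactly the model-selection question the quote raises. The function names are illustrative, not taken from the paper.

```python
import numpy as np

def projection_kernel(Y1, Y2):
    """Projection kernel k(Y1, Y2) = ||Y1^T Y2||_F^2 for orthonormal bases."""
    return np.linalg.norm(Y1.T @ Y2, 'fro') ** 2

def binet_cauchy_kernel(Y1, Y2):
    """Binet-Cauchy kernel k(Y1, Y2) = det(Y1^T Y2)^2 for orthonormal bases."""
    return float(np.linalg.det(Y1.T @ Y2)) ** 2
```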
“…Another popular school of thought is to map manifold points onto a Reproducing Kernel Hilbert Space (RKHS) and apply kernel-based classifiers [61,70,76,148]. As Euclidean geometry applies in the RKHS, this can be thought of as decoupling machine learning from data representation [81].…”
Section: Classification on Riemannian Manifolds
confidence: 99%
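The RKHS route described in the quote amounts to computing a Gram matrix with a manifold kernel and handing it to any off-the-shelf kernel classifier. Below is a minimal, self-contained sketch using the projection kernel from the previous snippet; the data, sizes, and labels are synthetic placeholders, not the paper's setup.

```python
import numpy as np
from sklearn.svm import SVC

def projection_kernel_matrix(bases_a, bases_b):
    """Gram matrix of the projection kernel between two lists of orthonormal bases."""
    return np.array([[np.linalg.norm(Ya.T @ Yb, 'fro') ** 2 for Yb in bases_b]
                     for Ya in bases_a])

rng = np.random.default_rng(0)
random_basis = lambda: np.linalg.qr(rng.normal(size=(400, 5)))[0]  # orthonormal 400x5

train = [random_basis() for _ in range(20)]
y_train = np.array([0] * 10 + [1] * 10)          # arbitrary placeholder labels
test = [random_basis() for _ in range(4)]

# Kernel-based classification carried out entirely in the RKHS induced by the kernel.
clf = SVC(kernel='precomputed').fit(projection_kernel_matrix(train, train), y_train)
pred = clf.predict(projection_kernel_matrix(test, train))  # rows: test, cols: train
```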