2018
DOI: 10.1016/j.patcog.2018.05.003
Data-independent Random Projections from the feature-space of the homogeneous polynomial kernel

Cited by 8 publications (23 citation statements)
References 29 publications
“…However, they discarded this idea because directly applying RP to the bilinear descriptors would involve storing a large projection matrix and explicitly computing the bilinear descriptors in the first place. Recent advances at the intersection of kernel methods and Random Projection [20,[29][30][31] have since made it possible to perform Random Projections efficiently from the feature spaces of different kernel functions. In particular, an efficient method to approximate a Random Projection for polynomial kernels was introduced in [20].…”
Section: Related Work
confidence: 99%
“…Recent advances at the intersection of kernel methods and Random Projection [20,[29][30][31] have made it possible to perform Random Projections efficiently from the feature spaces of different kernel functions. In particular, an efficient method to approximate a Random Projection for polynomial kernels was introduced in [20]. This paper adapts the ideas presented in [20] to make bilinear CNNs less computationally demanding by approximating a Random Projection of the bilinear descriptor.…”
Section: Related Work
confidence: 99%
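The statements above describe approximating a Random Projection from the feature space of a polynomial kernel without ever materializing that feature space. A minimal sketch of this idea (an illustration in the spirit of the cited work, not the paper's exact algorithm): for the homogeneous polynomial kernel k(x, y) = (x·y)^p, each output coordinate can be formed as the product of p independent Gaussian projections of the input, so the expected inner product of two projected vectors matches the kernel value. The function name and scaling choices here are assumptions for illustration.

```python
import numpy as np

def poly_kernel_random_projection(X, out_dim, degree=2, seed=0):
    """Sketch: project rows of X (n, d) to (n, out_dim) so that
    Z_x . Z_y approximates (x . y / d)^degree in expectation,
    without building the d^degree-dimensional polynomial feature map."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Z = np.ones((n, out_dim))
    for _ in range(degree):
        # Fresh Gaussian projection per factor; entrywise product across
        # the `degree` factors realizes the polynomial feature space.
        R = rng.standard_normal((d, out_dim)) / np.sqrt(d)
        Z *= X @ R
    return Z / np.sqrt(out_dim)
```

Because each coordinate is a product of independent projections, the estimator is unbiased but higher-variance than a direct Gaussian projection, so `out_dim` typically needs to be moderately large in practice.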
“…Existing kernels are either handcrafted (such as Gaussian, histogram intersection, etc. [4,47,76,22,70]) or trained using multiple kernels [41,42,43,44,45] and explicit kernel maps [34,35,36,38,39,40] as well as their deep variants [48,49,50,51,53,54,55,37,17].…”
Section: Introduction
confidence: 99%
“…In this proposed framework, the learned support vectors act as kernel parameters and make it possible to map input data to multiple kernel features prior to their classification (as also achieved in [34,35,36,38,39,40]). Nevertheless, the proposed framework is conceptually different from these related methods: the latter consider the support vectors fixed, or taken from the training data, in order to design explicit kernel maps prior to learning parametric SVMs, whereas in our method the support vectors are optimized to better fit the task at hand.…”
Section: Introduction
confidence: 99%
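The statement above contrasts explicit kernel maps built from fixed support vectors with a framework where the support vectors are learned. A minimal sketch of the underlying explicit-kernel-map construction (an illustration under assumed names, not the cited paper's implementation): mapping an input x to z(x) = [k(x, s_1), …, k(x, s_m)] for a set of support vectors S turns any kernel classifier into a linear model on z(x); in the learned variant, S would be treated as a trainable parameter.

```python
import numpy as np

def explicit_kernel_map(X, S, gamma=1.0):
    """Sketch: map rows of X (n, d) to kernel features (n, m),
    z(x)_j = exp(-gamma * ||x - s_j||^2), for support vectors S (m, d).
    A linear classifier on these features emulates an RBF-kernel SVM."""
    # Pairwise squared Euclidean distances between inputs and support vectors.
    d2 = ((X[:, None, :] - S[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)
```

With S fixed (e.g. sampled from the training set) this is the classical explicit-map setup; making S a parameter optimized with the downstream classifier is what distinguishes the framework described in the quoted statement.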