Proceedings of the 6th International Conference on Pattern Recognition Applications and Methods 2017
DOI: 10.5220/0006205002170224
Compression Techniques for Deep Fisher Vectors

Cited by 5 publications (4 citation statements). References: 0 publications.
“…Fisher kernels derived from DBMs [14] to achieve large-scale visual classification tasks. We have observed that the standard practice of utilising Fisher kernels derived from deep neural models [14, 16, 19] has the following limitations: (i) it results in high-dimensional FVs that require more memory and disk space for classifier training, and (ii) it is prone to overfitting in applications with limited data, owing to the large number of gradient parameters derived from deep neural models. Besides these shortcomings of deep FVs, there is also a growing need to deploy computationally efficient and compact features in resource-constrained environments and applications such as mobile devices, smart wearables, autonomous vehicles and so on.…”
Section: Introduction and Related Work (mentioning, confidence: 99%)
“…FVs for image classification have been improved in several ways, by introducing class-relevant information through foreground Fisher kernels [20] and by utilising the diagonal Fisher information matrix to approximate Fisher kernels [21, 22]. To address the high dimensionality of FVs and obtain compact representations, several compression and feature selection methods have been evaluated previously [16, 23]. Both families of techniques, selection as well as compression, seek to reduce the dimensionality of the data descriptors; however, compression methods do so by projecting the data into a different parameter space, whereas feature selection methods carefully choose a subset of the data attributes without changing them and hence retain the original data space.…”
Section: Introduction and Related Work (mentioning, confidence: 99%)
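The distinction drawn in the statement above, compression projects descriptors into a new space while feature selection keeps a subset of the original dimensions, can be illustrated with a minimal sketch. This is not code from the cited paper: PCA stands in here for a generic compression method and variance-based ranking for a generic feature selection method, and the Fisher-vector matrix `fvs` is synthetic.

```python
# Minimal sketch (illustrative, not the paper's method) contrasting
# compression vs. feature selection on hypothetical deep Fisher vectors.
import numpy as np

rng = np.random.default_rng(0)
fvs = rng.standard_normal((200, 4096))   # hypothetical FVs: 200 samples x 4096 dims
k = 64                                   # target dimensionality

# Compression: project onto the top-k principal components.
# The resulting coordinates live in a new parameter space.
centered = fvs - fvs.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
compressed = centered @ vt[:k].T         # shape (200, k), transformed values

# Feature selection: keep the k original dimensions with highest variance.
# Original attribute values are retained, only a subset is chosen.
top_dims = np.argsort(fvs.var(axis=0))[::-1][:k]
selected = fvs[:, top_dims]              # shape (200, k), untouched values

print(compressed.shape, selected.shape)
```

Both outputs have the same reduced dimensionality, but only `selected` preserves the original data space, which is the property the quoted statement attributes to feature selection.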