2020
DOI: 10.48550/arxiv.2003.00134
Preprint
Image Hashing by Minimizing Discrete Component-wise Wasserstein Distance

Abstract: Image hashing is a fundamental problem in the computer vision domain with various challenges, primarily, in terms of efficiency and effectiveness. Existing hashing methods lack a principled characterization of the goodness of the hash codes and a principled approach to learn the discrete hash functions that are being optimized in the continuous space. Adversarial autoencoders are shown to be able to implicitly learn a robust hash function that generates hash codes which are balanced and have low-quantization e…

Cited by 1 publication (2 citation statements)
References 1 publication
“…As shown in Figure 1, the models generate CIFAR-10 images (higher-dimensional) with significant mode collapse or white noise than the generated MNIST images (low-dimensional). Minimizing OT on the original input space only works in domains where the input data is already low-dimensional and the Frobenius distance is suitable [20,9]. Another challenge of using OT is its high computational cost.…”
Section: Primal Wasserstein Distances
confidence: 99%
“…Solving the linear-sum assignment program has a complexity of O(N^2.5 log N) where N is the number of samples [5]. Reducing OT's computational cost is necessary to utilize it in practice [9].…”
Section: Primal Wasserstein Distances
confidence: 99%
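The citation statements above concern solving the primal optimal-transport problem as a linear-sum assignment. A minimal sketch of that computation, assuming two equal-size empirical samples and a squared-Euclidean (Frobenius) ground cost; the solver used here is SciPy's `linear_sum_assignment` (a Jonker–Volgenant variant, O(N^3)), standing in for the O(N^2.5 log N) algorithm cited as [5]:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Two equal-size empirical distributions P and Q (illustrative data).
rng = np.random.default_rng(0)
N, d = 64, 2
x = rng.normal(size=(N, d))           # N samples from P
y = rng.normal(loc=1.0, size=(N, d))  # N samples from Q

# Pairwise squared-Euclidean cost matrix: cost[i, j] = ||x_i - y_j||^2.
cost = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)

# The exact primal OT plan between two uniform empirical measures of equal
# size is a permutation, so the problem reduces to linear-sum assignment.
row, col = linear_sum_assignment(cost)

# Empirical squared 2-Wasserstein distance: mean cost of the optimal matching.
w2_squared = cost[row, col].mean()
print(w2_squared)
```

Because the plan is restricted to permutations, the optimal matching can be no worse than any fixed pairing (e.g. matching sample i of P to sample i of Q), which gives a quick sanity check on the result.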