2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2019.00306

DistillHash: Unsupervised Deep Hashing by Distilling Data Pairs

Abstract: Due to its high storage and search efficiency, hashing has become prevalent for large-scale similarity search. In particular, deep hashing methods have greatly improved search performance under supervised scenarios. In contrast, unsupervised deep hashing models can hardly achieve satisfactory performance due to the lack of reliable supervisory similarity signals. To address this issue, we propose a novel deep unsupervised hashing model, dubbed DistillHash, which can learn a distilled data set consisting of …
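The abstract is truncated, but the stated core idea of distilling reliable data pairs from noisy similarity signals can be illustrated. Below is a minimal sketch, assuming cosine similarity over pretrained deep features as the initial (noisy) signal; the function name distill_pairs and the two thresholds are illustrative assumptions, not values from the paper.

import numpy as np

def distill_pairs(features, pos_thresh=0.9, neg_thresh=0.2):
    # Hypothetical helper: select confident pairs by thresholding the
    # cosine similarity of pretrained deep features; pairs in the
    # ambiguous middle band are dropped as potentially noisy signals.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T  # entries in [-1, 1]
    distilled = []
    n = len(f)
    for i in range(n):
        for j in range(i + 1, n):
            if sim[i, j] >= pos_thresh:
                distilled.append((i, j, 1))    # confident similar pair
            elif sim[i, j] <= neg_thresh:
                distilled.append((i, j, -1))   # confident dissimilar pair
    return distilled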

Cited by 127 publications (71 citation statements)
References 55 publications
“…Then, SSDH constructed a similarity matrix and designed a pairwise loss function to preserve this semantic information. DistillHash [24] treated the initial similarity relationship as noisy labels and learned a distilled data set to perform unsupervised hash codes learning.…”
Section: Related Work, A. Deep Hashing Methods
confidence: 99%
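As a rough illustration of the pairwise loss the quote above attributes to SSDH-style methods: a common formulation relaxes binary codes with tanh and pushes the normalized inner product of each code pair toward a target similarity matrix S with entries in {-1, +1}. The sketch below uses that common MSE form as an assumption, not the exact loss of either cited paper.

import torch

def pairwise_hash_loss(codes, S):
    # Relax binary codes to (-1, 1) with tanh, then push the normalized
    # inner product of every code pair toward its target similarity
    # S[i, j] in {-1, +1} with a mean-squared-error penalty.
    K = codes.shape[1]               # code length
    b = torch.tanh(codes)
    inner = (b @ b.t()) / K          # normalized similarity in (-1, 1)
    return ((inner - S) ** 2).mean()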
“…Similar to [22], [24], we aim to decrease the noisy relationship in unsupervised deep hashing for better retrieval accuracy. However, different from existing unsupervised methods which use the overall information of the images to estimate the similarity matrix, we propose multi-part corresponding …”
Section: Related Work, A. Deep Hashing Methods
confidence: 99%
“…In [72], the proposed unsupervised hashing framework unifies quantization error minimization, likelihood maximization and mutual information maximization to preserve the feature distribution for better code quality. In DistillHash [66], a Bayesian learning framework is integrated into the hash code learning, where a distilled data set is investigated automatically and further utilized to learn compact binary codes. In [51], the proposed DVB adopts conditional auto-encoding variational Bayesian networks to estimate the training data structure under the probabilistic inference process with hashing objectives, thus improving the code quality.…”
Section: B. Learning-based Feature Descriptors
confidence: 99%
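The quantization error minimization mentioned in [72] above is typically realized as an L2 penalty between a network's continuous code outputs and their binarized sign values. The sketch below shows that common formulation as an assumption; it is not claimed to be the specific objective of [72].

import torch

def quantization_loss(codes):
    # Penalize the gap between continuous network outputs and their
    # binarized targets sign(codes), keeping outputs near {-1, +1}.
    b = torch.sign(codes).detach()  # binary targets, no gradient flows
    return ((codes - b) ** 2).mean()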
“…Cross-modal similarity retrieval has been a popular research topic [14,18,20,25,28,32,33,37,40], with the objective of searching for semantically similar instances across different modalities. In a typical scenario, instances in one modality, e.g., images, are retrieved given a query from another modality, e.g., text.…”
Section: Introduction
confidence: 99%