2019
DOI: 10.1109/access.2019.2920712

Asymmetric Deep Semantic Quantization for Image Retrieval

Abstract: Due to its fast retrieval and storage efficiency, hashing has been widely used in nearest neighbor retrieval tasks. By using deep-learning-based techniques, hashing can outperform non-learning-based hashing techniques in many applications. However, we argue that current deep-learning-based hashing methods ignore some critical problems (e.g., the learned hash codes are not discriminative because the hashing methods are unable to discover rich semantic information, and the training strategy havin…

Cited by 7 publications (5 citation statements)
References 35 publications
“…To address this limitation, some methods have been proposed to further enrich the semantic information of hash codes beyond direct semantic supervision. DSEH and AD-SQ [38,39] utilize self-supervised networks to capture rich pairwise semantic information and use it to guide the feature learning network at both the semantic level and the hash-code level, thus enriching the semantic information and pairwise correlation in the hash codes. In comparison to [38,39], our work improves the guidance mechanism of the self-supervised network over the feature learning network, making it less time-consuming and more semantically targeted through an asymmetric learning strategy. Moreover, the generated semantic codes are further exploited to identify scalable margins for a pairwise contrastive-form constraint, yielding higher-level pairwise correlations and highly discriminative hash code representations.…”
Section: Deep Hashing Methods
confidence: 99%
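The guidance scheme this excerpt attributes to DSEH and AD-SQ can be pictured with a short sketch. The following is a minimal illustration, not the authors' code: the network names (LabNet, ImgNet), the layer sizes, and the plain L2 alignment terms are assumptions; the papers themselves use richer pairwise losses at both the semantic and hash-code levels.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabNet(nn.Module):
    """Self-supervised network over label vectors (hypothetical sizes)."""
    def __init__(self, num_classes=10, code_len=32):
        super().__init__()
        self.feat = nn.Sequential(nn.Linear(num_classes, 512), nn.ReLU())
        self.hash = nn.Linear(512, code_len)

    def forward(self, labels):
        f = self.feat(labels)
        return f, torch.tanh(self.hash(f))   # semantic feature, hash-like code

class ImgNet(nn.Module):
    """Feature-learning network over image features (backbone stubbed out)."""
    def __init__(self, in_dim=2048, code_len=32):
        super().__init__()
        self.feat = nn.Sequential(nn.Linear(in_dim, 512), nn.ReLU())
        self.hash = nn.Linear(512, code_len)

    def forward(self, x):
        f = self.feat(x)
        return f, torch.tanh(self.hash(f))

def guidance_loss(img_feat, img_code, lab_feat, lab_code):
    # Guide ImgNet with LabNet at both levels; plain L2 alignment stands in
    # for the pairwise semantic losses used in the papers.
    return F.mse_loss(img_feat, lab_feat) + F.mse_loss(img_code, lab_code)
```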
“…Asymmetric hashing methods [3,19,39,40] have recently become an eye-catching research focus. In ADSH [19], query points and database points are treated asymmetrically: only query points are involved in updating the deep network parameters, while the hash codes for the database are learned directly as independent parameters. The hash codes generated for the queries and the database are correlated through asymmetric pairwise constraints, so that the database points can be utilized efficiently during the hash-function learning procedure.…”
Section: Asymmetric Hashing Methods
confidence: 99%
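The asymmetric strategy summarized above is easy to state as an objective: continuous query codes from the network are matched against binary database codes, which are learned as free parameters. A minimal sketch follows, assuming c-bit codes and a {-1, +1} similarity matrix; the function and variable names are illustrative, not taken from the ADSH codebase.

```python
import torch

def asymmetric_pairwise_loss(u_query, B_db, S, code_len):
    """u_query: (m, c) tanh outputs of the network for sampled query points.
    B_db:    (n, c) binary database codes in {-1, +1}, learned directly.
    S:       (m, n) pairwise similarity in {-1, +1}.
    Each query/database inner product is pushed toward code_len * S_ij."""
    inner = u_query @ B_db.t()                    # (m, n) inner products
    return ((inner - code_len * S) ** 2).mean()

# toy usage: 4 queries, 6 database points, 8-bit codes
m, n, c = 4, 6, 8
u = torch.tanh(torch.randn(m, c, requires_grad=True))
B = torch.sign(torch.randn(n, c))                 # held fixed in this step
S = torch.sign(torch.randn(m, n))
asymmetric_pairwise_loss(u, B, S, c).backward()   # grads reach the query side only
```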
“…x < 0.5 (7), where S_i is the hash-like code obtained by passing input image i through the hash layer. To minimize the quantization error that arises when the real-valued feature representation is converted into a binary hash code [42]-[44], we introduce a quantization loss function to bring the binary code closer to the desired hash code:…”
Section: B. Hash Learning
confidence: 99%
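The thresholding and quantization-loss idea in this excerpt can be made concrete. Below is a minimal sketch, assuming the hash-like codes live in (0, 1) (e.g., sigmoid outputs) and a simple L2 form for the loss; the exact loss in the cited paper may differ.

```python
import torch

def binarize(s):
    # Eq. (7)-style thresholding: entries at or above 0.5 map to 1, else 0.
    return (s >= 0.5).float()

def quantization_loss(s):
    # Penalize the gap between the hash-like code and its binarized version,
    # shrinking the error introduced when the continuous code is thresholded.
    return torch.mean((s - binarize(s)) ** 2)

s = torch.sigmoid(torch.randn(4, 16))   # a batch of hash-like codes S_i
print(quantization_loss(s).item())
```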
“…Asymmetric Deep Semantic Quantization [127] increases performance by utilizing two ImgNets to minimize the gap between the real-valued continuous features and the discrete binary codes; a difference loss is also added.…”
Section: =
confidence: 99%
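The one-line summary above can be illustrated, with heavy caveats: the sketch below assumes two image networks whose tanh outputs are each pulled toward shared binary codes (the continuous/discrete gap), plus a term penalizing disagreement between the two networks standing in for the "difference loss". This is a guess at the structure, not AD-SQ's actual objective.

```python
import torch
import torch.nn.functional as F

def two_net_losses(u1, u2, b):
    """u1, u2: (n, c) tanh outputs of the two ImgNets; b: (n, c) in {-1, +1}."""
    gap = F.mse_loss(u1, b) + F.mse_loss(u2, b)   # continuous-vs-binary gap
    diff = F.mse_loss(u1, u2)                     # stand-in "difference loss"
    return gap + diff

u1 = torch.tanh(torch.randn(4, 16))
u2 = torch.tanh(torch.randn(4, 16))
b = torch.sign(u1.detach() + u2.detach())          # shared binary codes
print(two_net_losses(u1, u2, b).item())
```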