Scalable Discrete Supervised Multimedia Hash Learning With Clustering
2018
DOI: 10.1109/tcsvt.2017.2710345

Cited by 23 publications (12 citation statements)
References 35 publications
“…User Relevance Feedback has been used since the 1960s to improve queries for information retrieval [24] and saw a boom in the 1990s and 2000s for multimedia retrieval [11,12,29,[33][34][35]. Later, it started fading as hash based approaches [20], product quantization [13], and deep learning models [7,32,40] were proven more efficient for retrieval on large-scale collections. However, in recent years the issue of scalability has largely been resolved and the state-of-the-art URF systems for large scale multimedia retrieval are competitive with other approaches and require fewer examples to train their models than the supervised approaches [16,18,36].…”
Section: Related Work
confidence: 99%
“…We compare our SCDH method with recent state-of-the-art deep hashing methods, including pairwise based methods such as DSH [31], DHN [64], DPSH [28], DQN [4], DISH [63], DSDH [27], triplet based methods like NINH [22], FTDE [65], BOH [6], DRLIH [61], and unary loss based methods like CNNBH [13], SSDH [59]. They follow similar experimental settings, but different methods may use different deep networks, thus we train on several types of network (AlexNet, VGGNet, ResNet, etc.)…”
Section: B. Comparison on Supervised Hashing
confidence: 99%
“…Hashing methods can map similar data to similar binary codes that have small Hamming distance; owing to their low storage cost and fast retrieval speed, hashing methods have been receiving broad attention [1]. Hashing methods have also been used in many applications, such as image retrieval [2]-[10] and video retrieval [11]-[13]. Traditional hashing methods [6], [7], [14], [15] take pre-extracted image features as input, and then learn hash functions by exploiting the data structures or applying similarity-preserving regularizations.…”
Section: Introduction
confidence: 99%
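The statement above hinges on the core property of hashing-based retrieval: similar items get binary codes with a small Hamming distance, which is cheap to compute. As a minimal sketch (not the paper's method; the toy 8-bit codes and function name are illustrative assumptions), the distance is just the count of differing bits:

```python
import numpy as np

def hamming_distance(a: np.ndarray, b: np.ndarray) -> int:
    """Number of bit positions where two binary hash codes differ."""
    return int(np.count_nonzero(a != b))

# Toy 8-bit hash codes; a good hash function gives similar items
# codes that differ in few bits.
query = np.array([1, 0, 1, 1, 0, 0, 1, 0])
near  = np.array([1, 0, 1, 1, 0, 1, 1, 0])  # differs in 1 bit
far   = np.array([0, 1, 0, 0, 1, 1, 0, 1])  # differs in all 8 bits

print(hamming_distance(query, near))  # 1
print(hamming_distance(query, far))   # 8
```

In practice codes are packed into machine words and compared with XOR plus a popcount instruction, which is why retrieval over millions of items stays fast and storage stays small.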