2020
DOI: 10.1109/tcsvt.2019.2911359
SCRATCH: A Scalable Discrete Matrix Factorization Hashing Framework for Cross-Modal Retrieval

Cited by 96 publications
(41 citation statements)
References 52 publications
“…Using the two datasets described above, we compared DTCH with eight cross-modal hashing methods that have been proposed in recent years: cross-view hashing (CVH) [13], intermedia hashing (IMH) [14], latent semantic sparse hashing (LSSH) [15], semantic correlation maximization (SCM) [16], discrete cross-modal hashing (DCH) [4], fast discrete cross-modal hashing (FDCH) [26], scalable discrete matrix factorization hashing (SCRATCH) [17], and two-step cross-modal hashing (TECH) [18]. Among these, CVH, IMH, and LSSH are unsupervised methods; the others are supervised.…”
Section: Methods
confidence: 99%
“…However, SDH adopts a bitwise learning strategy to generate binary codes, which makes it time-consuming. Chen et al. [17] proposed a scalable cross-modal hashing method that applies matrix factorization to the cross-modal setting. Generally speaking, the retrieval accuracy of supervised methods is significantly higher than that of unsupervised methods owing to their exploitation of label information.…”
Section: Related Work
confidence: 99%
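The excerpt above mentions applying matrix factorization to cross-modal hashing. The idea can be sketched as a simplified collective factorization: two modality feature matrices share one latent representation, which is then binarized into hash codes. This is an illustrative toy, not SCRATCH's actual objective (which adds kernelization, label supervision, and a discrete solver); the function and parameter names here (`learn_hash_codes`, `lam`) are ours.

```python
import numpy as np

def learn_hash_codes(X1, X2, n_bits=16, n_iters=20, lam=1e-2, seed=0):
    """Toy collective matrix factorization hashing (not the exact
    SCRATCH objective): factor X1 ~ V @ U1.T and X2 ~ V @ U2.T with a
    shared latent V, alternating ridge-regularized least squares, then
    binarize B = sign(V)."""
    rng = np.random.default_rng(seed)
    n = X1.shape[0]
    U1 = rng.standard_normal((X1.shape[1], n_bits))
    U2 = rng.standard_normal((X2.shape[1], n_bits))
    V = rng.standard_normal((n, n_bits))
    I = np.eye(n_bits)
    for _ in range(n_iters):
        # Update each modality's projection with V fixed.
        U1 = np.linalg.solve(V.T @ V + lam * I, V.T @ X1).T
        U2 = np.linalg.solve(V.T @ V + lam * I, V.T @ X2).T
        # Update the shared latent factors from both modalities jointly.
        A = U1.T @ U1 + U2.T @ U2 + lam * I
        V = np.linalg.solve(A, U1.T @ X1.T + U2.T @ X2.T).T
    return np.where(V >= 0, 1.0, -1.0)  # binary codes in {-1, +1}
```

Because both modalities are reconstructed from the same V, semantically paired image/text items end up with identical codes, which is the core trick that makes cross-modal Hamming search possible.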
“…To evaluate the efficiency and effectiveness of SDDH, we compare SDDH with seven cross-modal hashing methods, including unsupervised methods such as CMFH [2] and FSH [8], and supervised methods such as DCH [3], ADCH [12], GSPH [10], LCMFH [13], and SCRATCH [14]. For the proposed method, we empirically set λ = 5e−4, γ = 1000, ξ = 0.001, and T = 6 for all three databases.…”
Section: Databases and Experimental Settings
confidence: 99%
“…Wang et al [13] directly utilized the semantic labels for hash code classification. Moreover, recent researches [3,14] demonstrated that reducing the quantization errors by generating the hash codes under the discrete constraints is an effective way to facilitate the retrieval performance of supervised methods. For example, Shen et al [15] proposed a discrete cyclic coordinate descent method to learn binary codes bit by bit, which requires excessive computation in learning process.…”
Section: Introductionmentioning
confidence: 99%
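The bit-by-bit learning mentioned above (discrete cyclic coordinate descent, as in SDH) can be sketched on a simplified subproblem: minimize ||Y − BW||² over binary B, updating one bit-column at a time. With the other columns fixed, each column has a closed-form sign solution, so the objective never increases. This is a minimal sketch of the general technique under that simplified objective, not a reproduction of any cited paper's full algorithm; `dcc_update` and `n_sweeps` are our names.

```python
import numpy as np

def dcc_update(Y, W, B, n_sweeps=3):
    """Discrete cyclic coordinate descent on min ||Y - B W||_F^2 with
    B in {-1, +1}^(n x k). For column l with the rest fixed, the
    objective is const - 2 b^T (R w) + n||w||^2, so b = sign(R w)."""
    n, k = B.shape
    for _ in range(n_sweeps):
        for l in range(k):
            w = W[l]  # l-th row of W
            # Residual with column l's contribution removed.
            R = Y - B @ W + np.outer(B[:, l], w)
            z = R @ w
            B[:, l] = np.where(z >= 0, 1.0, -1.0)
    return B
```

Each sweep costs a handful of matrix-vector products per bit, which is exactly why the excerpt calls bitwise learning time-consuming at scale: the cost grows with both the code length and the training-set size.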
“…Therefore, hashing methods can accelerate ANN search and save storage. Recently, hashing methods have been applied in computer vision and machine learning [3][4][5][6].…”
Section: Introduction
confidence: 99%
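To illustrate the speed-up the excerpt refers to: once items are mapped to short binary codes, ranking a database against a query reduces to Hamming distance, i.e. counting mismatched bits (XOR plus popcount on packed codes), which is far cheaper than float-vector distances. A minimal sketch with unpacked {0, 1} codes (the function name `hamming_rank` is ours):

```python
import numpy as np

def hamming_rank(query_code, db_codes):
    """Rank database items by Hamming distance to a query.
    Codes are equal-length arrays over {0, 1}; a production system
    would pack bits into integers and use XOR + popcount instead."""
    dists = np.count_nonzero(db_codes != query_code, axis=1)
    order = np.argsort(dists, kind="stable")
    return order, dists

# Usage: 4-bit codes for three database items and one query.
db = np.array([[0, 0, 0, 0], [1, 1, 1, 1], [0, 0, 1, 1]])
q = np.array([0, 0, 0, 0])
order, dists = hamming_rank(q, db)  # order: [0, 2, 1], dists: [0, 4, 2]
```

Storage savings follow the same logic: a 32-bit code replaces a dense float descriptor that may occupy kilobytes, which is what makes hashing attractive for large-scale retrieval.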