2023
DOI: 10.1109/tmm.2023.3245400

Adaptive Marginalized Semantic Hashing for Unpaired Cross-Modal Retrieval

Cited by 22 publications (4 citation statements)
References 51 publications

“…Considering large-scale retrieval applications and unequal hash-length encoding scenarios, discrete asymmetric hashing (DAH) (Zhang et al. 2023b) proposes a flexible framework. To enhance the discrimination of hash codes, adaptive marginalized semantic hashing (AMSH) (Luo et al. 2023) introduces adaptive margin matrices to alleviate the rigid zero-one linear regression. However, these methods overlook the influence of noise and outliers in the hash learning process.…”
Section: Related Work
confidence: 99%
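As a rough sketch of the margin idea mentioned in the statement above (this is the generic marginalized least-squares formulation, not necessarily AMSH's exact objective; all symbols are introduced here purely for illustration), the rigid zero-one regression target Y can be relaxed with a learnable non-negative margin matrix M:

```latex
% Illustrative only: X is the feature matrix, Y the zero-one label matrix,
% W the projection, E_{ij} = +1 if Y_{ij} = 1 and -1 otherwise, and M >= 0
% a learnable margin matrix that adaptively enlarges the class separation.
\min_{W,\; M \ge 0} \; \bigl\| X W - \bigl( Y + E \odot M \bigr) \bigr\|_F^2
  + \lambda \, \| W \|_F^2
```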
“…To verify the performance of our method, we compare DSCMH with thirteen state-of-the-art cross-modal hashing methods, including RFDH (Wang, Wang, and Gao 2017), LCMFH (Wang et al. 2018), DLFH (Jiang and Li 2019), MTFH (Liu et al. 2019), FCMH (Wang et al. 2021b), FDDH (Liu, Wang, and Cheung 2021), BATCH (Wang et al. 2021c), EDMH (Chen et al. 2022), DAH (Zhang et al. 2023b), ALECH (Li et al. 2023), WASH (Zhang et al. 2023a), and AMSH (Luo et al. 2023).…”
Section: Baselines and Implementation
confidence: 99%
“…1) Incomplete Cross-modal Retrieval: Efforts to address imbalanced or incomplete cross-modal retrieval focus on mitigating unbalanced or unpaired multi-modal samples in the training data through either shallow or deep learning techniques. Among the shallow learning methods [166]-[168], Robust Unsupervised Cross-Modal Hashing (RUCMH) [167] maps multiple heterogeneous spaces to a common semantic space for object reconstruction without relying on paired data. Advances in deep learning led to Triplet Fusion Network Hashing (TFNH) [220], which uses a triplet network with zero padding and data classifiers to ensure effective learning from unpaired data.…”
Section: E. Cross-modal Retrieval Under Special Retrieval Scenarios
confidence: 99%
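As a minimal sketch of the zero-padding idea for unpaired samples described above (the feature dimensions, function name, and concatenation scheme are assumptions for illustration, not TFNH's actual architecture), a sample missing one modality simply has that modality's slot filled with zeros before fusion:

```python
import numpy as np

IMG_DIM, TXT_DIM = 4096, 1386  # hypothetical feature dimensions

def fuse(img_feat=None, txt_feat=None):
    """Concatenate image and text features, zero-padding a missing modality.

    Illustrative sketch of the general zero-padding trick for unpaired data.
    """
    img = np.zeros(IMG_DIM) if img_feat is None else img_feat
    txt = np.zeros(TXT_DIM) if txt_feat is None else txt_feat
    return np.concatenate([img, txt])

# Paired sample: both modalities present.
paired = fuse(np.random.rand(IMG_DIM), np.random.rand(TXT_DIM))
# Unpaired image-only sample: the text slot is zero-padded.
image_only = fuse(img_feat=np.random.rand(IMG_DIM))
print(paired.shape, image_only.shape)  # both fused vectors share one shape
```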
“…Representation learning can effectively deal with this problem. Such methods aim to learn a function that transforms different modalities into a common feature space [4,5], where they can be compared directly. As the data scale expands rapidly and retrieval efficiency declines, hashing codes are applied to cross-modal retrieval tasks [6-8].…”
Section: Introduction
confidence: 99%
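To make the "common space plus hash codes" idea concrete, here is a minimal sketch assuming random linear projections (real cross-modal hashing methods learn these projections from data; all names and dimensions below are illustrative): each modality is projected into a shared space, binarized into hash codes, and retrieval ranks the database by Hamming distance.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 32                        # hash code length
IMG_DIM, TXT_DIM = 512, 300   # hypothetical feature dimensions

# Stand-in projections into the common space (learned in real methods).
W_img = rng.standard_normal((IMG_DIM, K))
W_txt = rng.standard_normal((TXT_DIM, K))

def hash_codes(feats, W):
    """Project features into the common space and binarize to {-1, +1}."""
    return np.sign(feats @ W)

def hamming_rank(query_code, db_codes):
    """Return database indices sorted by Hamming distance to the query."""
    dists = (K - db_codes @ query_code) / 2  # valid for codes in {-1, +1}
    return np.argsort(dists)

# Toy data: a text query retrieving from an image database.
img_db = hash_codes(rng.standard_normal((1000, IMG_DIM)), W_img)
txt_query = hash_codes(rng.standard_normal((1, TXT_DIM)), W_txt)[0]
print(hamming_rank(txt_query, img_db)[:5])  # top-5 retrieved image indices
```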