2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr52688.2022.01878
Forward Compatible Training for Large-Scale Embedding Retrieval Systems

Cited by 9 publications (6 citation statements) · References 15 publications
“…(1) for the new model, as they suffer from the inherent trade-off between maximizing the new model's performance and its compatibility with the weaker old model. Other works have therefore proposed learning a separate mapping from the old model's embedding space to the new model's embedding space to align their features (Wang et al., 2020; Ramanujan et al., 2022). These approaches still modify the training of the new model, which is undesirable because it can limit the new model's performance gain and is intrusive and complicated for practitioners.…”
Section: Methods
confidence: 99%
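The mapping that this excerpt describes — aligning old-model features to the new model's embedding space — can be sketched with a least-squares fit on paired embeddings. All data and dimensions below are synthetic, and the closed-form solve is a stand-in for whatever learned transformation such methods actually train:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: paired embeddings of the same images from both models.
d_old, d_new, n = 64, 128, 1000
old_emb = rng.normal(size=(n, d_old))          # features from the old model
W_true = rng.normal(size=(d_old, d_new))
new_emb = old_emb @ W_true                     # features from the new model

# Fit the mapping W by least squares: min_W ||old_emb @ W - new_emb||^2.
W, *_ = np.linalg.lstsq(old_emb, new_emb, rcond=None)

# Old gallery features can now be projected into the new embedding space
# without retraining the new model or re-extracting the gallery.
mapped = old_emb @ W
err = np.linalg.norm(mapped - new_emb) / np.linalg.norm(new_emb)
```

The appeal the citing authors note is that only the mapping is trained, leaving the new model's own training untouched.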
“…This has given rise to the study of compatible representation learning. Shen et al. (2020); Budnik & Avrithis (2021); Ramanujan et al. (2022); Hu et al. (2022); Zhao et al. (2022); and Duggal et al. (2021) all proposed methods to update the embedding model so that it achieves better performance while remaining compatible with features generated by the old model (see Figure 1-left-top). Despite relative success, compatibility learning is not perfect: performing retrieval with a mixture of old and new features achieves lower accuracy than replacing all the old features with new ones.…”
Section: Introduction
confidence: 99%
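The mixed-gallery gap this excerpt describes can be illustrated with a small synthetic retrieval experiment (all class counts, dimensions, and noise scales here are invented for illustration): queries come from the new model, and top-1 accuracy is compared between an all-new gallery and one where half the entries are noisier old-model features:

```python
import numpy as np

def top1_accuracy(queries, gallery, query_labels, gallery_labels):
    """Cosine-similarity top-1 retrieval accuracy."""
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    nearest = (q @ g.T).argmax(axis=1)
    return float((gallery_labels[nearest] == query_labels).mean())

rng = np.random.default_rng(1)
n_classes, d = 10, 32
centers = rng.normal(size=(n_classes, d)) * 5.0

labels = np.repeat(np.arange(n_classes), 20)
new_feats = centers[labels] + rng.normal(size=(len(labels), d))      # new model: tight
old_feats = centers[labels] + rng.normal(size=(len(labels), d)) * 3  # old model: noisy

queries, q_labels = new_feats[::2], labels[::2]
g_labels = labels[1::2]

# All-new gallery vs. a mixed gallery where every other entry is an old feature.
gallery_new = new_feats[1::2]
use_old = (np.arange(len(g_labels)) % 2 == 0)[:, None]
gallery_mixed = np.where(use_old, old_feats[1::2], new_feats[1::2])

acc_new = top1_accuracy(queries, gallery_new, q_labels, g_labels)
acc_mixed = top1_accuracy(queries, gallery_mixed, q_labels, g_labels)
```

With the noise scales chosen here, the mixed gallery is at best as accurate as the all-new one, mirroring the excerpt's observation.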
“…Cross-model Compatible Transformation. Compatible feature transformation (Chen et al. 2019; Wang et al. 2020; Ramanujan et al. 2022) also targets compatible retrieval across different embedding models. However, it concerns training transformation modules that map the features from different models into a common space.…”
Section: Related Work
confidence: 99%
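The common-space idea this excerpt mentions — one transformation module per model, trained so both land in a shared space — can be sketched as follows. The paired features and shared latent are synthetic, and the closed-form least-squares projections stand in for the learned heads such methods actually train:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical paired features of the same items from two unrelated models,
# generated from a shared latent so a common space exists by construction.
d_a, d_b, d_common, n = 48, 96, 32, 500
z = rng.normal(size=(n, d_common))             # shared latent per item
feat_a = z @ rng.normal(size=(d_common, d_a))  # model A's embedding of each item
feat_b = z @ rng.normal(size=(d_common, d_b))  # model B's embedding of each item

# One transformation module per model, fit so both map into the same space:
# min_P ||feat @ P - z||^2 for each model independently.
P_a, *_ = np.linalg.lstsq(feat_a, z, rcond=None)
P_b, *_ = np.linalg.lstsq(feat_b, z, rcond=None)

# After projection, the two models' features of the same item coincide.
gap = np.linalg.norm(feat_a @ P_a - feat_b @ P_b) / np.linalg.norm(z)
```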
“…Without the task identity, classifiers for different tasks fail to choose the corresponding features because these classifiers are not trained together. FACT [22] keeps an extra embedding to capture information that is irrelevant to the old task yet may be useful for future tasks. It utilizes a linear layer to transform the extra embedding and the old feature into the new feature.…”
Section: Introduction
confidence: 99%
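The structure this excerpt attributes to FACT — a linear layer over the concatenation of the old feature and the extra embedding producing the new feature — can be sketched in a few lines. All dimensions are hypothetical, and the random weights stand in for the parameters FACT would learn:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical dimensions: the old feature plus a small extra embedding that
# stores information the old task ignored but future tasks may need.
d_old, d_extra, d_new, n = 64, 16, 80, 200
old_feat = rng.normal(size=(n, d_old))
extra = rng.normal(size=(n, d_extra))          # kept alongside each old feature

# The linear layer maps the concatenation [old_feat, extra] to the new feature;
# W and b here are random stand-ins for the learned weights.
W = rng.normal(size=(d_old + d_extra, d_new))
b = rng.normal(size=d_new)

new_feat = np.concatenate([old_feat, extra], axis=1) @ W + b
```

Because the transform is linear in the stored quantities, old galleries can be upgraded to the new feature space without re-running the backbone on the original images.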