2021
DOI: 10.48550/arxiv.2110.02501
Preprint

On the Surrogate Gap between Contrastive and Supervised Losses

Abstract: Contrastive unsupervised representation learning (CURL) encourages data representations to place semantically similar pairs closer together than randomly drawn negative samples, an approach that has been successful in various domains such as vision, language, and graphs. Although recent theoretical studies have attempted to explain its success via upper bounds on a downstream classification loss in terms of the contrastive loss, these bounds are still not sharp enough to explain an experimental fact: larger negative sample sizes improve the classification…
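
To make the loss described in the abstract concrete, here is a minimal, illustrative NumPy sketch of an InfoNCE-style contrastive loss with K negative samples, assuming the common dot-product-similarity setup; the function name info_nce_loss and the toy data are hypothetical and are not taken from the paper, which concerns how such losses bound the downstream classification loss as the negative sample size grows.

    import numpy as np

    def info_nce_loss(z: np.ndarray, z_pos: np.ndarray, z_negs: np.ndarray) -> float:
        """z: (d,) anchor; z_pos: (d,) positive; z_negs: (K, d) negatives.

        Returns -log softmax of the positive's similarity among 1 + K candidates.
        """
        pos_logit = z @ z_pos                           # similarity to the positive pair
        neg_logits = z_negs @ z                         # similarities to the K negatives
        logits = np.concatenate(([pos_logit], neg_logits))
        logits = logits - logits.max()                  # shift for numerical stability
        return float(np.log(np.exp(logits).sum()) - logits[0])

    # Toy usage (assumed setup): unit-norm representations, two negative sample sizes.
    rng = np.random.default_rng(0)
    d = 8
    z = rng.normal(size=d); z /= np.linalg.norm(z)
    z_pos = z + 0.1 * rng.normal(size=d); z_pos /= np.linalg.norm(z_pos)
    for K in (4, 64):
        z_negs = rng.normal(size=(K, d))
        z_negs /= np.linalg.norm(z_negs, axis=1, keepdims=True)
        print(f"K={K}: loss={info_nce_loss(z, z_pos, z_negs):.3f}")

In the toy run, raising K from 4 to 64 adds more terms to the log-sum-exp denominator, which illustrates why the behavior of surrogate bounds in the negative sample size is the quantity of interest.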

Cited by 0 publications
References 12 publications