2020
DOI: 10.48550/arxiv.2007.05840
Preprint

Representation Learning via Adversarially-Contrastive Optimal Transport

Abstract: In this paper, we study the problem of learning compact (low-dimensional) representations for sequential data that capture its implicit spatiotemporal cues. To maximize extraction of such informative cues from the data, we set the problem within the context of contrastive representation learning and, to that end, propose a novel objective via optimal transport. Specifically, our formulation seeks a low-dimensional subspace representation of the data that jointly (i) maximizes the distance of the data (embedded …

Cited by 1 publication (1 citation statement) · References 29 publications
“…Tian et al (2019; consider the image views in different modalities and minimize the irrelevant mutual information between them. Most of the works on negative selection observe the merits of using "hard" negative samples, motivating the introduction of additional techniques, such as Mixup and adversarial noise (Bose et al, 2018;Cherian & Aeron, 2020;Li et al, 2020). In a view that not all negative pairs are "true" negatives (Saunshi et al, 2019b), Chuang et al (2020) propose a decomposition of the data distribution to approximate the true negative distribution.…”
Section: Related Work
confidence: 99%
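The citation statement above refers to contrastive learning with "hard" negative samples — negatives chosen to be similar to the anchor, which make the contrastive objective more informative. As a rough illustration (not the paper's optimal-transport formulation), here is a minimal NumPy sketch of an InfoNCE-style loss combined with hardest-negative selection; all function names and the temperature value are illustrative assumptions:

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, tau=0.1):
    """InfoNCE contrastive loss for a single anchor (illustrative sketch).

    anchor, positive : unit-normalised vectors of shape (d,)
    negatives        : unit-normalised array of shape (k, d)
    tau              : temperature (assumed value, not from the paper)
    """
    pos_sim = np.dot(anchor, positive) / tau      # similarity to the positive
    neg_sim = negatives @ anchor / tau            # similarities to negatives
    logits = np.concatenate(([pos_sim], neg_sim))
    # Cross-entropy with the positive as the target class:
    # -log( exp(pos) / (exp(pos) + sum exp(neg)) )
    return -pos_sim + np.log(np.sum(np.exp(logits)))

def hardest_negatives(anchor, candidates, k):
    """Select the k candidates most similar to the anchor ("hard" negatives)."""
    sims = candidates @ anchor
    idx = np.argsort(sims)[-k:]                   # top-k by similarity
    return candidates[idx]
```

Techniques such as Mixup-based or adversarial negatives (referenced in the statement) go further, synthesizing or perturbing negatives rather than merely selecting the hardest existing ones.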