ReMixMatch: Semi-Supervised Learning with Distribution Alignment and Augmentation Anchoring
Preprint, 2019. DOI: 10.48550/arxiv.1911.09785
Cited by 138 publications (291 citation statements). References 0 publications.
“…(c) Other methods include adversarial training with GANs [74], [75], [76], [77], [78], [79], [80], entropy minimization [11], and hybrid methods [81], [82], [83], which use a combination of the above methods along with additional regularization terms.…”
Section: Related Work
confidence: 99%
“…Semi-supervised learning (SSL) [29], [31], [33], [42]- [44] leverages unlabeled data to improve a model's performance when only limited labeled data is available, which reduces the expensive labeling effort. Some recently proposed semi-supervised learning methods, such as MixMatch [43], FixMatch [31], and ReMixMatch [42], are built on data augmentation. MixMatch [43] uses low-entropy labels for data-augmented unlabeled instances and mixes labeled and unlabeled data for semi-supervised learning.…”
Section: Semi-supervised Learning
confidence: 99%
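The two MixMatch ingredients named in the quote above — low-entropy (sharpened) guessed labels and mixing of labeled/unlabeled pairs — can be sketched as follows. This is a minimal illustration, not the paper's full training loop; the function names and the NumPy setting are our own.

```python
import numpy as np

def sharpen(p, T=0.5):
    """Lower the entropy of a predicted class distribution by raising each
    probability to the power 1/T and renormalizing; small T pushes mass
    toward the argmax, yielding the 'low-entropy label' used for unlabeled data."""
    p = p ** (1.0 / T)
    return p / p.sum(axis=-1, keepdims=True)

def mixup(x1, y1, x2, y2, alpha=0.75, rng=None):
    """Convexly combine two (input, label) pairs. Taking lam = max(lam, 1-lam)
    keeps the mixed example closer to the first pair, as MixMatch does when
    mixing labeled with unlabeled data."""
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)
    lam = max(lam, 1 - lam)  # bias toward the first argument
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2

# A model's averaged guess over augmentations, then sharpened:
guess = np.array([0.5, 0.3, 0.2])
print(sharpen(guess, T=0.5))  # argmax probability grows, entropy drops
```

Sharpening a guess of [0.5, 0.3, 0.2] at T=0.5 squares each entry and renormalizes, so the leading class rises to roughly 0.66 while the distribution still sums to one.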
“…FixMatch [31] generates pseudo-labels from the model's predictions on weakly augmented unlabeled images. ReMixMatch [42] uses a weakly augmented example to generate an artificial label and enforces consistency against strongly augmented examples. Semi-supervised domain adaptation has access to some target labels, unlike UDA, and several related works [45]- [49] have been proposed that leverage such semi-supervised signals.…”
Section: Semi-supervised Learning
confidence: 99%
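The weak-to-strong scheme described above — pseudo-label from the weakly augmented view, consistency loss on the strongly augmented view — can be sketched as a FixMatch-style masked cross-entropy. This is an illustrative sketch assuming softmax probabilities are already computed for both views; the function name and threshold value are our own choices (FixMatch uses a confidence threshold of 0.95 by default).

```python
import numpy as np

def pseudo_label_loss(weak_probs, strong_probs, threshold=0.95):
    """Take the argmax of the weak-view prediction as a hard pseudo-label,
    then penalize the strong-view prediction with cross-entropy, but only
    for examples whose weak-view confidence exceeds the threshold."""
    conf = weak_probs.max(axis=1)                 # per-example confidence
    labels = weak_probs.argmax(axis=1)            # hard pseudo-labels
    mask = conf >= threshold                      # drop uncertain examples
    ce = -np.log(strong_probs[np.arange(len(labels)), labels] + 1e-12)
    return (ce * mask).sum() / max(mask.sum(), 1)

weak = np.array([[0.97, 0.02, 0.01],    # confident -> contributes
                 [0.50, 0.30, 0.20]])   # below threshold -> masked out
strong = np.array([[0.90, 0.05, 0.05],
                   [0.10, 0.80, 0.10]])
print(pseudo_label_loss(weak, strong))  # only the first row contributes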
“…Its performance is comparable to that of supervised methods. Moreover, the augmentation idea in self-supervised learning is also adopted in some semi-supervised learning methods [24], [54], where augmentation anchoring is applied to replace the consistency regularization as it shows a stronger generalization ability.…”
Section: Self-supervised Learning
confidence: 99%