2023
DOI: 10.48550/arxiv.2302.01384
Preprint

Energy-Inspired Self-Supervised Pretraining for Vision Models

Abstract: Motivated by the fact that the forward and backward passes of a deep network naturally form symmetric mappings between input and output representations, we introduce a simple yet effective self-supervised vision model pretraining framework inspired by energy-based models (EBMs). In the proposed framework, we model energy estimation and data restoration as the forward and backward passes of a single network without any auxiliary components, e.g., an extra decoder. For the forward pass, we fit a network to an energy…
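Since the abstract is truncated, the following is only a minimal PyTorch sketch of the general pattern it describes: one network whose forward pass maps an input to a scalar energy and whose backward pass (the gradient of that energy with respect to the input) restores corrupted data, with no auxiliary decoder. The network architecture, the restore helper, the corruption scheme, and all hyperparameters are illustrative assumptions, not the paper's actual method.

```python
# Illustrative sketch only; not the paper's implementation.
import torch
import torch.nn as nn

class EnergyNet(nn.Module):
    """Maps an image to a scalar energy; no extra decoder."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.SiLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.SiLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        # Forward pass: energy estimation (one scalar per sample).
        return self.features(x).squeeze(-1)

def restore(net, x_corrupt, steps=4, step_size=1.0):
    """Restoration as the backward pass: gradient descent on the
    energy w.r.t. the input (step count / size are assumptions)."""
    x = x_corrupt.detach().clone().requires_grad_(True)
    for _ in range(steps):
        energy = net(x).sum()
        (grad,) = torch.autograd.grad(energy, x, create_graph=True)
        x = x - step_size * grad  # move inputs toward lower energy
    return x

# One illustrative training step: corrupt clean images, restore them
# via the input gradient, and penalize the reconstruction error.
net = EnergyNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-4)
x_clean = torch.rand(8, 3, 32, 32)
x_corrupt = x_clean + 0.3 * torch.randn_like(x_clean)
loss = ((restore(net, x_corrupt) - x_clean) ** 2).mean()
opt.zero_grad()
loss.backward()
opt.step()
```

Note that `create_graph=True` keeps the restoration steps differentiable, so the reconstruction loss can be backpropagated through the input-gradient updates into the energy network's parameters.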

Cited by 0 publications
References 29 publications (57 reference statements)