2023
DOI: 10.1109/access.2022.3233104
Self-Supervised Feature Enhancement: Applying Internal Pretext Task to Supervised Learning

Abstract: Traditional self-supervised learning requires convolutional neural networks (CNNs) to encode high-level semantic visual representations using external pretext tasks (i.e., image- or video-based tasks). In this paper, we show that feature transformations within CNNs can also be regarded as supervisory signals for constructing a self-supervised task, which we call an internal pretext task. Such a task can then be applied to enhance supervised learning. Specifically, we first transform the internal feature maps by …
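The abstract is cut off, but it outlines the core mechanism: treat a transformation of the network's own internal feature maps as a free supervisory signal, and train an auxiliary head to recognize that transformation alongside the usual supervised objective. Below is a minimal PyTorch sketch of that idea, not the paper's actual method: the choice of rotation as the feature-map transformation, the auxiliary head design, and the loss weight lambda_aux are illustrative assumptions made here because the truncated abstract does not specify them.

import torch
import torch.nn as nn
import torch.nn.functional as F

class InternalPretextNet(nn.Module):
    # Hypothetical wrapper: a CNN backbone plus a supervised classification
    # head and an auxiliary head that predicts which transformation was
    # applied to the internal feature maps (the "internal pretext task").
    def __init__(self, backbone, feat_channels, num_classes):
        super().__init__()
        self.backbone = backbone  # any CNN producing (B, C, H, W) feature maps
        self.cls_head = nn.Linear(feat_channels, num_classes)
        self.aux_head = nn.Linear(feat_channels, 4)  # 4 rotation classes (assumed)

    def forward(self, x):
        feats = self.backbone(x)  # internal feature maps, shape (B, C, H, W)
        # Rotation by a random multiple of 90 degrees is used here as an
        # assumed example of a feature-map transformation; the rotation
        # index k itself serves as the self-supervised label.
        k = torch.randint(0, 4, (1,)).item()
        rotated = torch.rot90(feats, k, dims=(2, 3))
        pooled_cls = F.adaptive_avg_pool2d(feats, 1).flatten(1)
        pooled_aux = F.adaptive_avg_pool2d(rotated, 1).flatten(1)
        return self.cls_head(pooled_cls), self.aux_head(pooled_aux), k

def loss_fn(cls_logits, aux_logits, k, targets, lambda_aux=0.5):
    # Joint objective: supervised cross-entropy plus the pretext loss,
    # weighted by an assumed hyperparameter lambda_aux.
    aux_target = torch.full((aux_logits.size(0),), k, dtype=torch.long,
                            device=aux_logits.device)
    return (F.cross_entropy(cls_logits, targets)
            + lambda_aux * F.cross_entropy(aux_logits, aux_target))

In this sketch the pretext label comes for free from the transformation index, so the auxiliary loss adds no annotation cost; lambda_aux controls the trade-off between the supervised and self-supervised objectives.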

Cited by 1 publication
References 30 publications