2023
DOI: 10.1587/transinf.2023edp7008

On Gradient Descent Training Under Data Augmentation with On-Line Noisy Copies

Katsuyuki HAGIWARA

Abstract: In machine learning, data augmentation (DA) is a technique for improving the generalization performance of models. In this paper, we mainly consider gradient descent of linear regression under DA using noisy copies of datasets, in which noise is injected into inputs. We analyze the situation where noisy copies are newly generated and injected into inputs at each epoch, i.e., the case of using on-line noisy copies. Therefore, this article can also be viewed as an analysis of a method using noise injection into …
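The abstract describes gradient descent on linear regression in which a fresh noisy copy of the inputs is generated at every epoch. Below is a minimal sketch of that kind of training loop, assuming full-batch gradient descent on squared error, Gaussian input noise, and illustrative toy data and hyperparameters (learning rate, noise level, epoch count); these specifics are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

# Sketch: linear regression trained by gradient descent with "on-line" noisy
# copies, i.e., fresh input noise is drawn at every epoch before computing
# the gradient. All data and hyperparameters here are illustrative.

rng = np.random.default_rng(0)

# Toy data: n samples, d input features (assumed, not from the paper).
n, d = 100, 5
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = X @ true_w + 0.1 * rng.normal(size=n)

w = np.zeros(d)      # weight vector to be learned
lr = 0.01            # learning rate (assumed value)
noise_std = 0.3      # std. dev. of the injected input noise (assumed value)
n_epochs = 500

for epoch in range(n_epochs):
    # Generate a new noisy copy of the inputs at this epoch (on-line copy).
    X_noisy = X + noise_std * rng.normal(size=X.shape)

    # Full-batch gradient of the mean squared error on the noisy inputs.
    residual = X_noisy @ w - y
    grad = X_noisy.T @ residual / n

    w -= lr * grad

print("learned weights:", w)
```

As background, injecting Gaussian noise into the inputs of a linear model is classically known to act like a ridge-type (L2) regularizer in expectation, which is the kind of effect an analysis of this training scheme would be expected to examine.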

Cited by 0 publications.
References 17 publications (25 reference statements).