2018 IEEE Symposium Series on Computational Intelligence (SSCI)
DOI: 10.1109/ssci.2018.8628742

Improving Deep Learning with Generic Data Augmentation

Abstract: Deep artificial neural networks require a large corpus of training data in order to effectively learn, where collection of such training data is often expensive and laborious. Data augmentation overcomes this issue by artificially inflating the training set with label preserving transformations. Recently there has been extensive use of generic data augmentation to improve Convolutional Neural Network (CNN) task performance. This study benchmarks various popular data augmentation schemes to allow researchers to…
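The "label preserving transformations" the abstract refers to are typically simple geometric operations applied to each training image. A minimal sketch in NumPy is below; the specific flip/rotate/crop choices and the `augment` helper are illustrative assumptions, not the paper's exact benchmark set:

```python
import numpy as np

def augment(image, rng):
    """Apply one randomly chosen label-preserving transformation.

    `image` is an H x W (or H x W x C) array; the class label is unchanged
    because flips, rotations, and small crops preserve object identity.
    """
    choice = rng.integers(3)
    if choice == 0:                      # horizontal flip
        return image[:, ::-1].copy()
    if choice == 1:                      # 90-degree rotation of the spatial axes
        return np.rot90(image)
    # random crop back to the original size after zero padding
    pad = 2
    padded = np.pad(image, [(pad, pad), (pad, pad)] + [(0, 0)] * (image.ndim - 2))
    y = rng.integers(2 * pad + 1)
    x = rng.integers(2 * pad + 1)
    return padded[y:y + image.shape[0], x:x + image.shape[1]]

rng = np.random.default_rng(0)
img = np.arange(16.0).reshape(4, 4)
aug = augment(img, rng)
assert aug.shape == img.shape
```

In practice such transformations are applied on the fly during training, so the model sees a different variant of each image every epoch without the dataset ever being stored in inflated form.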

Cited by 523 publications (342 citation statements)
References 6 publications
“…We note that there are countless sketching protocols to represent molecules, and hence, future benchmarking studies will be needed to thoroughly examine their predictive signal. Similarly, elucidating the most convenient strategies to perform data augmentation is an area of intense research [99][100][101], also for chemical structure-activity modelling [82,102]. In the case of ConvNets, multiple representations of the same molecules generated using diverse sketching schemes (e.g., using diverse SMILES encoding rules [102]) might be implemented to perform data augmentation.…”
Section: Results (mentioning; confidence: 99%)
“…Unfortunately, generating several realizations of the geological model with standard geostatistical algorithms may be very challenging. One possible solution we intend to investigate in the future is to use data augmentation (Yaeger et al, 1997;Taylor and Nitschke, 2017) and transfer learning techniques (Hoo-Chang et al, 2018;Cheng and Malhi, 2017). Data augmentation consist of a series of affine transformations applied to the input data to increase the training set.…”
Section: Comments (mentioning; confidence: 99%)
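The citation above describes data augmentation as a series of affine transformations applied to the input. A minimal sketch of such a transformation, using inverse mapping with nearest-neighbour sampling in NumPy (the `affine_augment` helper and the shear example are assumptions for illustration, not code from either cited work):

```python
import numpy as np

def affine_augment(image, A, t):
    """Warp a 2-D array with the affine map p -> A @ p + t.

    Each output pixel is traced back through the inverse map and sampled
    with nearest-neighbour lookup; pixels mapping outside the source become 0.
    """
    H, W = image.shape
    Ainv = np.linalg.inv(A)
    out = np.zeros_like(image)
    ys, xs = np.mgrid[0:H, 0:W]
    coords = np.stack([ys.ravel(), xs.ravel()], axis=0).astype(float)
    src = Ainv @ (coords - np.asarray(t, float).reshape(2, 1))
    sy = np.rint(src[0]).astype(int)
    sx = np.rint(src[1]).astype(int)
    ok = (sy >= 0) & (sy < H) & (sx >= 0) & (sx < W)
    out_flat = out.ravel()
    out_flat[ok] = image[sy[ok], sx[ok]]
    return out_flat.reshape(H, W)

# a small shear keeps the geological structure recognizable (label-preserving)
# while changing the pixel statistics seen by the network
A = np.array([[1.0, 0.15], [0.0, 1.0]])
img = np.eye(8)
sheared = affine_augment(img, A, t=[0.0, 0.0])
```

Generating many such warped realizations from one geological model is far cheaper than rerunning a geostatistical simulator, which is the appeal the quoted comment points to.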
“…1(c). Although both rotation and flip augmentation methods achieve similar accuracy improvements for image classification [24], [25], it is an open question about which one is preferred for radio modulation classification. After the Gaussian noise augmentation, the image is full of 'snow' and the received radio symbols are deviated as shown in Fig.…”
Section: Introduction (mentioning; confidence: 99%)
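The citation above contrasts rotation, flip, and Gaussian-noise augmentation for radio modulation classification. On complex baseband I/Q samples these operations have particularly compact forms; the sketch below is an illustrative assumption (the helpers `rotate_iq`, `flip_iq`, and `noise_iq` are not from the cited paper):

```python
import numpy as np

def rotate_iq(iq, k):
    """Rotate complex I/Q samples by k * 90 degrees in the constellation plane."""
    return iq * (1j ** k)

def flip_iq(iq):
    """Flip across the I axis: conjugation negates the Q component."""
    return np.conj(iq)

def noise_iq(iq, sigma, rng):
    """Add complex Gaussian noise; a large sigma buries the symbols in 'snow'."""
    n = rng.normal(0.0, sigma, iq.shape) + 1j * rng.normal(0.0, sigma, iq.shape)
    return iq + n

rng = np.random.default_rng(0)
qpsk = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]) / np.sqrt(2)
rotated = rotate_iq(qpsk, 1)        # constellation rotated, label preserved
noisy = noise_iq(qpsk, 0.1, rng)    # symbols deviate from the ideal points
```

The open question raised in the quote is visible here: rotation and flip permute or mirror an intact constellation, while Gaussian noise perturbs every symbol, so the two families stress a radio classifier in quite different ways.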