2021
DOI: 10.3390/sym13081497

StyleGANs and Transfer Learning for Generating Synthetic Images in Industrial Applications

Abstract: Deep learning applications in computer vision require large volumes of representative data to reach state-of-the-art results, owing to the massive number of parameters to optimise in deep models. However, in industrial applications data are limited and asymmetrically distributed because of rare cases, legal restrictions, and high image-acquisition costs. Data augmentation based on deep generative adversarial networks, such as StyleGAN, has arisen as a way to create training data with symmetric dis…
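
As a rough illustration of the augmentation step the abstract describes, the sketch below samples synthetic images from an already trained StyleGAN-style generator in PyTorch. The generator call signature, latent size, and the load_pretrained_generator helper are assumptions for illustration, not the authors' actual pipeline.

```python
# Sketch: sampling synthetic images from a pretrained StyleGAN-style generator
# for data augmentation. The generator interface (G(z) -> NCHW images in [-1, 1])
# and the checkpoint loader are assumptions, not the paper's actual code.
import os
import torch
from torchvision.utils import save_image

def generate_synthetic_images(generator: torch.nn.Module,
                              n_images: int = 64,
                              z_dim: int = 512,
                              out_dir: str = "synthetic") -> None:
    """Sample latent codes z ~ N(0, I) and decode them into training images."""
    os.makedirs(out_dir, exist_ok=True)
    generator.eval()
    with torch.no_grad():
        for i in range(n_images):
            z = torch.randn(1, z_dim)                 # latent code
            img = generator(z)                        # assumed signature: G(z)
            img = (img.clamp(-1, 1) + 1) / 2          # rescale to [0, 1] for saving
            save_image(img, os.path.join(out_dir, f"fake_{i:04d}.png"))

# Usage (hypothetical checkpoint loader):
# G = load_pretrained_generator("stylegan2_industrial.pt")
# generate_synthetic_images(G, n_images=128)
```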

Cited by 12 publications (2 citation statements)
References: 52 publications

“…1473 acneic face images) as input data. Transfer learning is widely used in deep learning because it allows training models from small datasets, decreasing the training time and the processing power [ 25 ].…”
Section: Methods (mentioning)
confidence: 99%
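
The recipe quoted above, fine-tuning from a pretrained model so that a small dataset suffices, can be sketched as follows. This assumes PyTorch with torchvision >= 0.13, an ImageNet-pretrained ResNet-18, and a placeholder two-class target task rather than anything specific to the cited papers.

```python
# Sketch of small-dataset transfer learning: freeze an ImageNet-pretrained
# backbone and train only a new classification head, which cuts training
# time and compute. Class count and learning rate are placeholders.
import torch
import torch.nn as nn
from torchvision import models

def build_transfer_model(num_classes: int = 2) -> nn.Module:
    model = models.resnet18(weights="IMAGENET1K_V1")   # torchvision >= 0.13 weights API
    for param in model.parameters():                   # freeze the pretrained backbone
        param.requires_grad = False
    model.fc = nn.Linear(model.fc.in_features, num_classes)  # new, trainable head
    return model

model = build_transfer_model(num_classes=2)
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)  # only the head is optimised
criterion = nn.CrossEntropyLoss()
```
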
“…The reliance on a large number of target domain data for constructing target learners can thus be reduced. It has emerged as a popular and promising area of machine learning because of its wide range of applications, especially for solving real-world problems [ 78 , 79 , 80 , 81 ] in a cheaper and more reliable way. Transfer learning has a broad sphere of use, coupled with a record of strong results [ 82 , 83 ].…”
Section: Methods (mentioning)
confidence: 99%