2019
DOI: 10.48550/arxiv.1911.13270
Preprint

Transflow Learning: Repurposing Flow Models Without Retraining

Abstract: It is well known that deep generative models have a rich latent space, and that it is possible to smoothly manipulate their outputs by traversing this latent space. Recently, architectures have emerged that allow for more complex manipulations, such as making an image look as though it were from a different class, or painted in a certain style. These methods typically require large amounts of training in order to learn a single class of manipulations. We present Transflow Learning, a method for transforming a …
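The "traversal" the abstract refers to is typically a linear walk between two latent codes. A minimal sketch of the idea, assuming a hypothetical trained decoder `decode(z)` that is not part of the paper's code:

```python
import numpy as np

# Latent-space traversal sketch; `decode` is a hypothetical trained
# generator mapping a latent vector to an image.
rng = np.random.default_rng(0)
z_a = rng.standard_normal(512)  # latent code of a source sample
z_b = rng.standard_normal(512)  # latent code of a target sample

# Interpolating between the two codes yields a smooth sequence of
# outputs: the latent-space manipulation described in the abstract.
for t in np.linspace(0.0, 1.0, num=8):
    z_t = (1.0 - t) * z_a + t * z_b
    # image = decode(z_t)  # hypothetical decoder call
```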

Cited by 4 publications (5 citation statements)
References 10 publications (15 reference statements)
“…There are many established transfer learning techniques in prediction tasks, such as classification and recommendation. Recently, a few studies have demonstrated transfer learning methods on generative tasks for metasurface inverse designs. Here, we conducted transfer learning on normalizing flow models by fine-tuning model parameters trained on the source task. Specifically, we trained the model with the data set generated from a new emitter made from tungsten.…”
Section: Results
confidence: 99%
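The fine-tuning this statement describes amounts to continuing maximum-likelihood training from the source-task weights. A minimal sketch, assuming a hypothetical pretrained flow object exposing a `log_prob` method; the names `flow`, `loader`, and `fine_tune` are illustrative, not from the cited work:

```python
import torch

def fine_tune(flow, loader, epochs=10, lr=1e-4):
    """Continue maximum-likelihood training of a pretrained flow on a
    new data set (here: samples from the tungsten emitter)."""
    opt = torch.optim.Adam(flow.parameters(), lr=lr)
    for _ in range(epochs):
        for x in loader:
            loss = -flow.log_prob(x).mean()  # negative log-likelihood
            opt.zero_grad()
            loss.backward()
            opt.step()
    return flow
```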
“…The implementation of transflow-learning, which was suggested in [50], would allow the use of a trained normalizing flow on different but similar problems without retraining the network. Such problems arise in HEP when new-physics effects from high energy scales slightly modify scattering properties at low energies and are described in an effective field theory framework.…”
Section: Discussion
confidence: 99%
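Unlike the fine-tuning approach above, Transflow Learning leaves the weights untouched: a handful of examples from the new task are mapped into the latent space, and the flow's standard-normal prior is replaced by a Gaussian posterior conditioned on those latents. A minimal sketch of that idea, assuming a hypothetical invertible `flow` with `inverse` (data to latent) and `forward` (latent to data); the observation variance `sigma2` is an illustrative hyperparameter, not a value from the paper:

```python
import torch

def transflow_sample(flow, evidence, n_samples=16, sigma2=0.1):
    """Sample from a trained flow repurposed for a new task, without
    updating any weights."""
    with torch.no_grad():
        # Map the evidence examples to latent codes.
        z = torch.stack([flow.inverse(x) for x in evidence])
        n = z.shape[0]
        # Conjugate Gaussian update: prior N(0, I) over the latent mean,
        # evidence latents treated as N(mu, sigma2) observations.
        post_mean = z.sum(dim=0) / (n + sigma2)
        post_var = sigma2 / (n + sigma2)
        # Sample latents from the posterior and decode them.
        eps = torch.randn(n_samples, *post_mean.shape)
        z_new = post_mean + post_var ** 0.5 * eps
        return torch.stack([flow.forward(zi) for zi in z_new])
```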
“…Moreover, the multi-scale structure appears intrinsically in the distribution of natural images. Apart from generating whole images, the multi-scale representations can be used to perform other tasks, such as style transfer [47,48], content mixing [49][50][51], and texture synthesis [52][53][54][55][56].…”
Section: Related Work
confidence: 99%