2019 IEEE/CVF International Conference on Computer Vision (ICCV)
DOI: 10.1109/iccv.2019.00913

Photorealistic Style Transfer via Wavelet Transforms

Figure 1: Photorealistic stylization results. Given (a) an input pair (top: content, bottom: style), the results of (b) WCT [20], (c) PhotoWCT [21], and (d) our model (WCT²) are shown. Every result is produced without any post-processing. While WCT and PhotoWCT suffer from spatial distortions, our model successfully transfers the style and preserves the fine details.

Abstract: Recent style transfer models have provided promising artistic results. However, given a photogr…

Cited by 332 publications (259 citation statements) | References 33 publications
“…To overcome this hurdle, Li et al. [79] used wavelet transformation as well as multilevel stylization. Following this research, Yoo et al. [80] devised the wavelet pooling layer to enable photorealistic style transfer.…”
Section: Image to Image Translation Without Using Generative Adversarial Networks
confidence: 99%
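The wavelet pooling layer cited above replaces max-pooling with a Haar wavelet decomposition, so the detail that ordinary pooling throws away can be restored exactly on the decoder side. Below is a minimal NumPy sketch of a single-level Haar analysis/synthesis pair; the function names and the assumption of an even-sized, single-channel input are illustrative, not the authors' implementation.

```python
import numpy as np

def haar_decompose(x):
    """Single-level 2D Haar transform of an array with even height and width."""
    a = x[0::2, 0::2]  # top-left of each 2x2 block
    b = x[0::2, 1::2]  # top-right
    c = x[1::2, 0::2]  # bottom-left
    d = x[1::2, 1::2]  # bottom-right
    ll = (a + b + c + d) / 2.0   # low-frequency approximation (the "pooled" map)
    lh = (-a - b + c + d) / 2.0  # vertical detail
    hl = (-a + b - c + d) / 2.0  # horizontal detail
    hh = (a - b - c + d) / 2.0   # diagonal detail
    return ll, lh, hl, hh

def haar_reconstruct(ll, lh, hl, hh):
    """Exact inverse of haar_decompose (the kernels form an orthonormal basis)."""
    h, w = ll.shape
    x = np.zeros((2 * h, 2 * w), dtype=ll.dtype)
    x[0::2, 0::2] = (ll - lh - hl + hh) / 2.0
    x[0::2, 1::2] = (ll - lh + hl - hh) / 2.0
    x[1::2, 0::2] = (ll + lh - hl - hh) / 2.0
    x[1::2, 1::2] = (ll + lh + hl + hh) / 2.0
    return x

# Round-tripping recovers the input to floating-point precision.
img = np.random.rand(8, 8)
assert np.allclose(img, haar_reconstruct(*haar_decompose(img)))
```

Because the transform is invertible, the high-frequency components can be carried around the stylization step and reinjected in the decoder, which is what lets such a model preserve fine photographic detail.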
“…Therefore, most evaluations of neural style transfer models are qualitative [48, 49]. The most common method is to compare style transfer methods qualitatively by putting stylized images side by side [11, 12, 13, 14, 21, 24, 29]. Besides showing stylized images, user studies are also used for evaluation [9, 14, 29].…”
Section: Results
confidence: 99%
“…For example, Gatys et al. [8] use a convolutional neural network (CNN) to reconstruct the content and style, and optimize the stylized image iteratively based on a loss function. Since then, CNN-based neural style transfer has become a hot topic [9, 10, 11, 13, 14].…”
Section: Introduction
confidence: 99%
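For readers unfamiliar with the iterative formulation this excerpt refers to, the sketch below outlines a Gatys-style optimization loop in PyTorch: a content loss on one VGG-19 layer, a Gram-matrix style loss on several others, and gradient descent directly on the output image. The layer choice, weights, optimizer, and helper name are assumptions for illustration (and a recent torchvision is assumed), not the exact recipe of [8].

```python
import torch
import torch.nn.functional as F
from torchvision import models

def gatys_style_transfer(content, style, steps=300, style_weight=1e6):
    """content, style: preprocessed (1, 3, H, W) tensors, ImageNet-normalized."""
    vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.eval()
    for p in vgg.parameters():
        p.requires_grad_(False)

    # Indices into vgg19.features for the usual style/content layers.
    layers = {1: 'relu1_1', 6: 'relu2_1', 11: 'relu3_1', 20: 'relu4_1', 22: 'relu4_2'}

    def extract(x):
        feats = {}
        for i, layer in enumerate(vgg):
            x = layer(x)
            if i in layers:
                feats[layers[i]] = x
        return feats

    def gram(f):
        _, c, h, w = f.shape          # batch size of 1 assumed
        f = f.view(c, h * w)
        return f @ f.t() / (c * h * w)

    content_feats = extract(content)
    style_grams = {k: gram(v) for k, v in extract(style).items()}

    # Optimize the pixels of the output image, initialized from the content.
    img = content.clone().requires_grad_(True)
    opt = torch.optim.Adam([img], lr=0.02)
    for _ in range(steps):
        opt.zero_grad()
        feats = extract(img)
        c_loss = F.mse_loss(feats['relu4_2'], content_feats['relu4_2'])
        s_loss = sum(F.mse_loss(gram(feats[k]), style_grams[k])
                     for k in style_grams if k != 'relu4_2')
        (c_loss + style_weight * s_loss).backward()
        opt.step()
    return img.detach()
```

The per-image optimization is what makes this family of methods slow; the feed-forward WCT-style models discussed elsewhere on this page trade that loop for a single encoder-transform-decoder pass.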
“…In this section, we focus on wavelet applications in image synthesis, which is one of the most attractive issues in the community. Yoo et al. (2019) propose a wavelet-corrected transfer based on whitening and coloring transforms (WCT²), which encourages the transfer network to generate visually realistic images. Provenzi et al. (2014) design a wavelet-based, perceptually inspired color-correction variational model to deal with both color enhancement and color cast removal.…”
Section: Wavelet Domain Image Synthesis
confidence: 99%
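The whitening and coloring transform that WCT² builds on can be summarized in a few lines: whiten the content features so their channel covariance becomes the identity, then re-color them with the style covariance and add back the style mean. The NumPy sketch below is a minimal illustration under assumed shapes and an assumed eps regularizer; WCT² additionally applies this transform inside its wavelet-based encoder/decoder rather than on raw VGG features alone.

```python
import numpy as np

def whiten_color_transform(content_feat, style_feat, eps=1e-5):
    """content_feat, style_feat: (C, H*W) arrays of flattened encoder activations."""
    # Whitening: remove the content covariance structure.
    c_mean = content_feat.mean(axis=1, keepdims=True)
    fc = content_feat - c_mean
    cov_c = fc @ fc.T / (fc.shape[1] - 1) + eps * np.eye(fc.shape[0])
    wc, vc = np.linalg.eigh(cov_c)
    whitened = vc @ np.diag(wc ** -0.5) @ vc.T @ fc

    # Coloring: impose the style covariance and mean on the whitened features.
    s_mean = style_feat.mean(axis=1, keepdims=True)
    fs = style_feat - s_mean
    cov_s = fs @ fs.T / (fs.shape[1] - 1) + eps * np.eye(fs.shape[0])
    ws, vs = np.linalg.eigh(cov_s)
    colored = vs @ np.diag(ws ** 0.5) @ vs.T @ whitened
    return colored + s_mean
```

The result is a set of features whose second-order statistics match the style image; decoding them back to pixel space produces the stylized output.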