2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2017.740

Deep Photo Style Transfer

Abstract: Figure 1: Given a reference style image (a) and an input image (b), we seek to create an output image of the same scene as the input, but with the style of the reference image. The Neural Style algorithm [5] (c) successfully transfers colors, but also introduces distortions that make the output look like a painting, which is undesirable in the context of photo style transfer. In comparison, our result (d) transfers the color of the reference style image equally well while preserving the photorealism of the out…

Cited by 615 publications (636 citation statements)
References 20 publications (63 reference statements)

“…Despite being a research topic for a long time [RHW88], the interest in neural network algorithms is a relatively new phenomenon, triggered by seminal works such as ImageNet [KSH12]. In computer graphics, such approaches have led to impressive results, e.g., for synthesizing novel viewpoints of natural scenes [FNPS16], to generate photorealistic face textures [SWH*16], and to robustly transfer image styles between photographs [LPSB17], to name just a few examples. The underlying optimization approximates an unknown function f*(x) = y, by minimizing an associated loss function L such that f(x, θ) ≈ y.…”
Section: Related Work and Background (mentioning)
confidence: 99%
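
The quoted passage describes the generic learning setup behind these methods: a parametric function f(x, θ) is fit to an unknown f*(x) = y by minimizing an associated loss L. Below is a minimal sketch of that template on a toy regression problem; it is an illustration only, not code from [LPSB17] or any of the cited works, and names such as `model` and `loss_fn` are placeholders.

```python
# Minimal sketch of the generic setup described in the quote: fit a
# parametric f(x, theta) to an unknown f*(x) = y by minimizing a loss L.
import torch

# Synthetic data standing in for samples of the unknown function f*(x) = y.
x = torch.linspace(-1.0, 1.0, 256).unsqueeze(1)
y = torch.sin(3.0 * x)  # we only observe (x, y) pairs, not f* itself

# f(x, theta): a small multilayer perceptron whose weights play the role of theta.
model = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)
loss_fn = torch.nn.MSELoss()                                # the associated loss L
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)   # L(f(x, theta), y)
    loss.backward()               # gradients with respect to theta
    optimizer.step()              # update theta so that f(x, theta) ≈ y
```

In neural style transfer specifically, the same template applies, except that the optimization is typically over the output image's pixels rather than network weights, with L assembled from content and style terms.
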
“…Gatys et al [GEB*17] add the possibility for users to guide the transfer with annotations. In the context of photographic transfer, Luan et al [LPSB17] limit mismatches using scene analysis. Li and Wand [LW16a] use nearest‐neighbor correspondences between neural responses to make the transfer content‐aware.…”
Section: Introduction (mentioning)
confidence: 99%
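
Li and Wand's content-aware transfer [LW16a], mentioned in the quote above, rests on nearest-neighbor correspondences between neural responses: each patch of the content feature map is matched to its most similar patch in the style feature map. The sketch below shows one common way to implement that matching step via normalized cross-correlation in PyTorch; it assumes feature maps already extracted from some CNN layer and uses a hypothetical helper name, so it is a rough illustration rather than the authors' implementation.

```python
# Rough sketch of nearest-neighbor matching between neural responses
# (patch matching on CNN feature maps), not the [LW16a] reference code.
import torch
import torch.nn.functional as F

def nearest_neighbor_patches(content_feat, style_feat, patch_size=3):
    """Return, per content location, the index of the most similar style patch."""
    # content_feat, style_feat: (1, C, H, W) feature maps from one CNN layer.
    c = style_feat.shape[1]
    # Extract all style patches: (1, C*k*k, N) unfolded columns -> (N, C, k, k) filters.
    style_patches = F.unfold(style_feat, kernel_size=patch_size)
    n = style_patches.shape[-1]
    filters = style_patches.transpose(1, 2).reshape(n, c, patch_size, patch_size)
    # L2-normalize each patch so correlation scores behave like cosine similarity.
    norms = filters.flatten(1).norm(dim=1).clamp_min(1e-8)
    filters = filters / norms.view(-1, 1, 1, 1)
    # Correlate every content location against every style patch in a single conv.
    scores = F.conv2d(content_feat, filters, padding=patch_size // 2)  # (1, N, H, W)
    return scores.argmax(dim=1)  # (1, H, W): nearest style-patch index per location
```

In a full pipeline, the matched style patches would then drive a reconstruction or MRF-style loss on the output image, which is what makes the transfer content-aware.
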
“…In contrast, our result has lighting similar to that of the target image, and people and objects in shadow still have their colors. We show more comparison results with Shih et al [16] and deep photo style transfer [6] in Fig. S4 and Fig.…”
Section: Comparisons (mentioning)
confidence: 99%
“…Li et al [5] recolor images using geodesic distance based on harmonization. More recently, Luan et al [6] propose a deep learning approach for photographic style transfer. These methods produce visually pleasing recolored images but cannot change local lighting.…”
Section: Color Transfer and Correction (mentioning)
confidence: 99%