Proceedings of the 30th International Conference on Computer Graphics and Machine Vision (GraphiCon 2020). Part 2 2020
DOI: 10.51130/graphicon-2020-2-3-2
Depth-Aware Arbitrary Style Transfer Using Instance Normalization

Abstract: Style transfer is the process of rendering one image, providing the content, in the style of another image, representing the style. Recent studies by Liu et al. (2017) show that the traditional style transfer methods of Gatys et al. (2016) and Johnson et al. (2016) fail to reproduce the depth of the content image, which is critical for human perception. They suggest preserving the depth map by adding a regularizer to the optimized loss function, forcing preservation of the depth map. However, these traditional methods …
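The instance-normalization operation named in the title is the core of AdaIN-style arbitrary style transfer (Huang & Belongie, 2017): content features are normalized per channel and then rescaled with the style features' channel statistics. A minimal NumPy sketch of that operation, not the paper's implementation (function and parameter names are illustrative):

```python
import numpy as np

def adain(content_feat, style_feat, eps=1e-5):
    """Adaptive Instance Normalization: normalize the content features
    per channel, then re-scale them with the style features' channel-wise
    mean and standard deviation. Features are arrays of shape (C, H, W)."""
    c_mean = content_feat.mean(axis=(1, 2), keepdims=True)
    c_std = content_feat.std(axis=(1, 2), keepdims=True)
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True)
    # Whiten the content statistics, then impose the style statistics.
    normalized = (content_feat - c_mean) / (c_std + eps)
    return normalized * s_std + s_mean
```

In a full pipeline this operation is applied to encoder (e.g. VGG) feature maps, and a decoder maps the re-statisticized features back to image space.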

Cited by 2 publications (2 citation statements)
References: 11 publications
“…The paper [3] provides an overview of various methods for neural style transfer in computer graphics and computer vision. The review covers different approaches, including those based on convolutional neural networks (CNNs), generative adversarial networks (GANs), and patch-based methods.…”
Section: B. Depth-Aware Neural Style Transfer Using Instance Normalization
confidence: 99%
“…The depth map of the content image cannot be replicated. [22] proposed an extension to the AdaIN method that preserves the depth map by applying variable stylization strength. The comparison is shown in Fig.…”
Section: F. Depth-Aware Style Transfer
confidence: 99%
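The "variable stylization strength" mentioned in the excerpt above can be sketched as a depth-dependent blend between stylized and content features. This is a hypothetical illustration, not the scheme from [22]: the exact weighting function is not given here, and the depth convention (values in [0, 1], larger meaning farther) is an assumption.

```python
import numpy as np

def depth_weighted_stylization(content_feat, stylized_feat, depth_map, strength=1.0):
    """Blend stylized features with content features using a per-pixel
    weight derived from the depth map, so that stylization strength
    varies spatially instead of being uniform.

    content_feat, stylized_feat: arrays of shape (C, H, W)
    depth_map: array of shape (H, W) with values in [0, 1]
    """
    # Broadcast the (H, W) depth weights over the channel axis.
    alpha = np.clip(strength * depth_map, 0.0, 1.0)[None, :, :]
    return alpha * stylized_feat + (1.0 - alpha) * content_feat
```

With this convention, regions at depth 0 keep the content features unchanged, while regions at depth 1 receive full stylization; intermediate depths are interpolated linearly.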