2019
DOI: 10.1111/cgf.13621

StyleBlit: Fast Example‐Based Stylization with Local Guidance

Abstract: We present StyleBlit—an efficient example‐based style transfer algorithm that can deliver high‐quality stylized renderings in real‐time on a single‐core CPU. Our technique is especially suitable for style transfer applications that use local guidance: descriptive guiding channels containing large spatial variations. Local guidance encourages transfer of content from the source exemplar to the target image in a semantically meaningful way. Typical local guidance includes, e.g., normal values, texture coordinat…
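The abstract's notion of local guidance can be illustrated with a small sketch: given a stylized exemplar and per-pixel normals for both the exemplar and the target, copy to each target pixel the style colour of the exemplar pixel whose normal matches best. This is only a brute-force, per-pixel approximation of the idea (StyleBlit itself blits coherent patches to preserve the exemplar's texture); all names below are illustrative.

```python
import numpy as np

def guided_transfer(source_normals, source_style, target_normals):
    """Per-pixel local guidance by surface normals.

    source_normals : (Hs, Ws, 3) unit normals of the stylized exemplar
    source_style   : (Hs, Ws, 3) colours of the stylized exemplar
    target_normals : (Ht, Wt, 3) unit normals of the target render
    Returns an (Ht, Wt, 3) stylized image.

    Note: brute-force nearest-normal lookup for illustration only; the
    paper's method transfers coherent chunks rather than single pixels,
    which this sketch does not attempt.
    """
    src_n = source_normals.reshape(-1, 3)   # (Ns, 3) exemplar normals
    src_c = source_style.reshape(-1, 3)     # style colour per exemplar pixel
    ht, wt, _ = target_normals.shape
    tgt_n = target_normals.reshape(-1, 3)   # (Nt, 3) target normals

    # Cosine similarity between each target normal and every exemplar
    # normal, then pick the best-matching exemplar pixel per target pixel.
    sim = tgt_n @ src_n.T                   # (Nt, Ns)
    best = sim.argmax(axis=1)
    return src_c[best].reshape(ht, wt, 3)
```

In practice the exemplar is typically a stylized lit sphere, so its normals are known analytically and align naturally with the target render's normal buffer.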


Cited by 24 publications (10 citation statements)
References 52 publications
“…However, they require the content and style image to be semantically similar. Stylize-by-example approaches also adopt a patch-based approach using high-quality examples as guidance [Fišer et al. 2016; Sýkora et al. 2019b]. Despite impressive results, these methods require dedicated guiding examples provided by professional artists.…”
Section: Related Work 2.1 Image and Video Style Transfer
Citation type: mentioning, confidence: 99%
“…We propose to combine NeRF and image-based neural style transfer to perform 3D scene stylization. While NeRF provides a strong inductive bias to maintain multi-view consistency, neural style transfer enables a flexible stylization approach that does not require dedicated example inputs from professional artists [Fišer et al. 2016; Sýkora et al. 2019a]. Additionally, we address the memory limitations of NeRF by splitting the 3D scene style transfer process into two steps that run alternatingly.…”
Section: Introduction
Citation type: mentioning, confidence: 99%
“…In recent years, neural style transfer [GEB16] and neural texture synthesis [GEB15] have been used in a variety of contexts (e.g. sketching [TTK∗21, SJT∗19], video [JvST∗19, TFK∗20], painting style [TFF∗20]). These methods are based on the matching of the statistics extracted by a pre‐trained neural network between output and target images.…”
Section: Related Work
Citation type: mentioning, confidence: 99%
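The statistics matched in [GEB15, GEB16] are, in the original formulation, channel-wise Gram matrices of activations from a pre-trained network. A minimal NumPy sketch of that matching criterion for a single layer (feature extraction itself omitted, names illustrative):

```python
import numpy as np

def gram_matrix(features):
    """Channel-wise Gram matrix of a (C, H, W) feature map: correlations
    between feature channels, which summarise texture statistics while
    discarding spatial layout."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return (f @ f.T) / (h * w)

def style_loss(output_features, target_features):
    """Squared Frobenius distance between the Gram matrices of the output
    and style-target features for one layer; Gatys-style transfer sums
    this term over several layers of a pre-trained network."""
    diff = gram_matrix(output_features) - gram_matrix(target_features)
    return float(np.sum(diff ** 2))
```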
“…Some have explored generating new texture from a smaller example texture through classical texture synthesis methods [3,8,40] and neural approaches [12,13,31,33,49]. Textures have also been manipulated through lighting-based style transfer [11,35] rather than through a purely image-based approach like the one we present in Sec. 6.4.…”
Section: Spherical Images, Superpixels, and Texture Maps
Citation type: mentioning, confidence: 99%