Figure 1: The input image (a) is a non-stationary texture. Our multi-scale analysis provides a hierarchy of labeled clusters: one cluster at a coarse scale (b) includes sub-clusters at finer scales (c). This hierarchy has several applications in texture synthesis, such as automatic pattern palettes for interactive texture editing (d) and content selection for creating new non-stationary textures (e).

Abstract: Texture synthesis is a well-established area with many important applications in computer graphics and vision. However, despite their success, synthesis techniques are not widely used in practice because the creation of good exemplars remains challenging and extremely tedious. In this paper, we introduce an unsupervised method for analyzing texture content across multiple scales that automatically extracts good exemplars from natural images. Unlike existing methods, which require extensive manual tuning, our
Figure 1: Our reconstruction framework, illustrated on the TRIPLE HECATE model. Starting from a dense input point set, we reconstruct a simplified mesh (center). Benefiting from the connectivity of this initial reconstruction, we can make it evolve dynamically so as to refine the approximation locally. This refinement can be achieved either automatically, for example to improve the quality of the mesh elements, or interactively, to add or remove sample points. Here, the draped dress has been locally enhanced (right).

Abstract: In this paper, we introduce a flexible framework for reconstructing a surface from an unorganized point set, extending the geometric convection approach introduced by Chaine [9]. Given a dense input point cloud, we first extract a triangulated surface that interpolates a subset of the initial data. We compute this surface in an output-sensitive manner by decimating the input point set on the fly during the reconstruction process. Our simplification procedure relies on a simple criterion that locally detects and reduces oversampling. If needed, we can then operate in a dynamic fashion to locally refine or further simplify the reconstructed surface: our method allows the reconstructed surface to be updated locally by inserting or removing sample points without restarting the convection process from scratch. This iterative correction process can be controlled interactively by the user or automated under specific local sampling constraints.
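The abstract's on-the-fly decimation can be illustrated with a minimal sketch. This is not the paper's actual criterion (which operates during geometric convection); it is a generic, hypothetical density-based decimation that keeps one representative point per occupied grid cell, which conveys the idea of locally detecting and reducing oversampling:

```python
import math

def grid_decimate(points, cell):
    """Illustrative decimation sketch (not the paper's exact criterion):
    bucket 3D points into a uniform grid of the given cell size and keep
    one representative per occupied cell, reducing local oversampling."""
    buckets = {}
    for p in points:
        # Integer grid coordinates of the cell containing p.
        key = tuple(int(math.floor(c / cell)) for c in p)
        # Keep the first point seen in each cell.
        buckets.setdefault(key, p)
    return list(buckets.values())
```

For example, two points 0.01 apart fall into the same 0.5-sized cell and are merged into one representative, while a distant point survives unchanged. The real method would instead evaluate such a criterion locally while the convection surface evolves.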
We introduce a novel semi-procedural approach that avoids the drawbacks of procedural textures while leveraging the advantages of data-driven texture synthesis. We split synthesis into two parts: 1) structure synthesis, based on a procedural parametric model, and 2) color-detail synthesis, which is data-driven. The procedural model consists of a generic Point Process Texture Basis Function (PPTBF), which extends sparse convolution noises by defining rich convolution kernels. These kernels consist of a window function multiplied by a correlated statistical mixture of Gabor functions, both designed to encapsulate a large span of common spatial stochastic structures, including cells, cracks, grains, scratches, spots, stains, and waves. Parameters can be prescribed automatically by supplying binary structure exemplars. As with noise-based Gaussian textures, the PPTBF is used as a stand-alone function, avoiding the classification tasks that arise when handling multiple procedural assets. Because the PPTBF is based on a single set of parameters, it allows for continuous transitions between different visual structures and easy control over its visual characteristics. Color is synthesized consistently from the exemplar using a multiscale parallel texture synthesis by numbers, constrained by the PPTBF. The generated textures are parametric, infinite, and free of repetition. The data-driven part is automatic and guarantees strong visual resemblance to the input.
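The core construction — a sparse convolution noise whose kernel is a window function multiplied by a Gabor-like wave — can be sketched in a few lines. This is a simplified stand-in for the PPTBF, not the authors' implementation: the point process, window, and kernel parameters here (`n_points`, `freq`, the Gaussian window width) are illustrative assumptions, and the correlated statistical mixture of Gabor functions is reduced to a single randomly oriented cosine per feature point:

```python
import numpy as np

def pptbf_like(width, height, n_points=64, freq=8.0, seed=0):
    """Minimal sparse-convolution sketch in the spirit of a PPTBF:
    sum, over random feature points, of a window function multiplied
    by a Gabor-like cosine kernel. All parameters are illustrative."""
    rng = np.random.default_rng(seed)
    pts = rng.random((n_points, 2))             # uniform point process in [0,1]^2
    phases = rng.random(n_points) * 2 * np.pi   # random phase per kernel
    dirs = rng.random(n_points) * 2 * np.pi     # random orientation per kernel

    ys, xs = np.mgrid[0:height, 0:width]
    x = xs / width
    y = ys / height
    out = np.zeros((height, width))
    sigma = 1.0 / np.sqrt(n_points)             # window radius ~ mean point spacing
    for (px, py), ph, th in zip(pts, phases, dirs):
        dx, dy = x - px, y - py
        # Gaussian window function centered on the feature point.
        window = np.exp(-(dx**2 + dy**2) / (2 * sigma**2))
        # Oriented cosine wave (the "Gabor" part of the kernel).
        wave = np.cos(2 * np.pi * freq * (dx * np.cos(th) + dy * np.sin(th)) + ph)
        out += window * wave                    # windowed Gabor contribution
    return out
```

In the actual PPTBF, the window and the Gabor mixture are far richer (yielding cells, cracks, grains, and so on), and their parameters can be fitted to a binary structure exemplar; the sketch only shows how a single parameter set drives the whole field, which is what enables continuous transitions between structures.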