One of the key ingredients of any physically based rendering system is a detailed specification characterizing the interaction of light with every material present in a scene, typically via the Bidirectional Reflectance Distribution Function (BRDF). Despite their utility, access to real-world BRDF datasets remains limited: measurements require scanning a four-dimensional domain at sufficient resolution, a tedious and often infeasibly time-consuming process. We propose a new parameterization that automatically adapts to the behavior of a material, warping the underlying 4D domain so that most of the volume maps to regions where the BRDF takes on non-negligible values, while irrelevant regions are strongly compressed. This adaptation requires only a brief 1D or 2D measurement of the material's retro-reflective properties. Our parameterization is unified in the sense that it combines several steps that previously required intermediate data conversions: the same mapping can simultaneously be used for BRDF acquisition, storage, and efficient Monte Carlo sample generation. We observe that these desiderata are satisfied by a core operation present in modern rendering systems, which maps uniform variates to direction samples distributed proportionally to an analytic BRDF. Based on this insight, we define our adaptive parameterization as an invertible, retro-reflectively driven mapping between the parametric and directional domains. We are able to create noise-free renderings of existing BRDF datasets after conversion into our representation, with the added benefit that the warped data is significantly more compact, requiring 16 KiB and 544 KiB per spectral channel for isotropic and anisotropic specimens, respectively. Finally, we show how to modify an existing gonio-photometer to provide the needed retro-reflection measurements. Acquisition then proceeds within a 4D space that is warped by our parameterization.
We demonstrate the efficacy of this scheme by acquiring the first set of spectral BRDFs of surfaces exhibiting arbitrary roughness, including anisotropy.
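The "core operation" this abstract builds on, an invertible map between uniform variates and directions with density proportional to an analytic function, can be illustrated with the simplest such warp, cosine-weighted hemisphere sampling. This is a minimal sketch, not the paper's actual BRDF-adapted warp; function names are ours.

```python
import math

def square_to_cosine_hemisphere(u1, u2):
    """Map uniform variates in [0,1)^2 to a unit direction whose
    density is proportional to cos(theta): the simplest instance of
    the sample-generation maps the parameterization builds on."""
    r = math.sqrt(u1)
    phi = 2.0 * math.pi * u2
    x = r * math.cos(phi)
    y = r * math.sin(phi)
    z = math.sqrt(max(0.0, 1.0 - u1))
    return (x, y, z)

def cosine_hemisphere_to_square(d):
    """Inverse mapping: recover the uniform variates from a direction.
    Invertibility is what lets one warp serve acquisition, storage,
    and Monte Carlo sample generation at once."""
    x, y, z = d
    u1 = x * x + y * y
    u2 = (math.atan2(y, x) % (2.0 * math.pi)) / (2.0 * math.pi)
    return (u1, u2)
```

Replacing the cosine by an analytic BRDF approximation yields a warp that concentrates the parametric domain exactly where the material is non-negligible, which is the idea the paper generalizes to 4D.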
Figure 1: Top-left: rendering a voxelized forest at decreasing levels of detail (left to right). Bottom-right: visualization of the voxel structure at the matching resolutions. We use the SGGX microflake distribution to represent volumetric anisotropic materials. Our representation supports downscaling and interpolation, resulting in smooth and antialiased renderings at multiple scales.

Abstract: We introduce the Symmetric GGX (SGGX) distribution to represent spatially varying properties of anisotropic microflake participating media. Our key theoretical insight is to represent a microflake distribution by the projected area of the microflakes. We use the projected area to parameterize the shape of an ellipsoid, from which we recover a distribution of normals. The representation based on the projected area allows for robust linear interpolation and prefiltering, and thanks to its geometric interpretation, we derive closed-form expressions for all operations used in the microflake framework. We also incorporate microflakes with diffuse reflectance in our theoretical framework. This allows us to model the appearance of rough diffuse materials in addition to rough specular materials. Finally, we use the idea of sampling the distribution of visible normals to design a perfect importance sampling technique for our SGGX microflake phase functions. It is analytic, deterministic, simple to implement, and one order of magnitude faster than previous work.
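The projected-area representation described above stores a 3×3 symmetric matrix S per voxel; the projected area along a direction w is then sigma(w) = sqrt(wᵀ S w), and prefiltering reduces to averaging S. A minimal sketch (pure-Python, nested lists; helper names are ours):

```python
import math

def sggx_projected_area(S, w):
    """Projected area sigma(w) = sqrt(w^T S w) of an SGGX microflake
    distribution, with S a 3x3 symmetric matrix (rows as sequences)."""
    Sw = [sum(S[i][j] * w[j] for j in range(3)) for i in range(3)]
    return math.sqrt(sum(w[i] * Sw[i] for i in range(3)))

def lerp_sggx(S_a, S_b, t):
    """Linearly interpolate two SGGX matrices. This is the key
    practical property: downscaling and interpolation of the
    representation are just (weighted) averages of S."""
    return [[(1.0 - t) * S_a[i][j] + t * S_b[i][j] for j in range(3)]
            for i in range(3)]
```

An isotropic medium with S = a·I has constant projected area sqrt(a) in every direction; anisotropy shows up as direction-dependent sigma(w).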
Figure 1: A high-quality animated production model (Ptex T-rex model © Walt Disney Animation Studios) rendered in real time under directional and environment lighting using LEADR mapping on an NVIDIA GTX 480 GPU. The surface appearance is preserved at all scales, using a single shading sample per pixel. Combined with adaptive GPU tessellation, our method provides the fastest, seamless, and antialiased progressive representation for displaced surfaces.

Abstract: We present Linear Efficient Antialiased Displacement and Reflectance (LEADR) mapping, a reflectance filtering technique for displacement-mapped surfaces. Similarly to LEAN mapping, it employs two mipmapped texture maps, which store the first two moments of the displacement gradients. During rendering, the projection of this data over a pixel is used to compute a noncentered anisotropic Beckmann distribution using only simple, linear filtering operations. The distribution is then injected in a new, physically based, rough surface microfacet BRDF model that includes masking and shadowing effects for both diffuse and specular reflection under directional, point, and environment lighting. Furthermore, our method is compatible with animation and deformation, making it extremely general and flexible. Combined with an adaptive meshing scheme, LEADR mapping provides the very first seamless and hardware-accelerated multi-resolution representation for surfaces. In order to demonstrate its effectiveness, we render highly detailed production models in real time on a commodity GPU, with quality matching supersampled ground-truth images.
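The two mipmapped moment maps mentioned above are what make the scheme linear: averaging the first moment (mean gradient) and second moment (raw second moments) over a pixel footprint, then taking covariance = E[g gᵀ] − E[g]E[g]ᵀ, yields the noncentered Beckmann parameters. A minimal CPU-side sketch of that bookkeeping (function names are ours):

```python
def lean_moments(gradients):
    """First two moments of displacement-gradient samples (dx, dy),
    as stored in the two texture maps; because moments are averages,
    mipmapping/filtering them is a plain linear operation."""
    n = len(gradients)
    m1x = sum(g[0] for g in gradients) / n
    m1y = sum(g[1] for g in gradients) / n
    m2xx = sum(g[0] * g[0] for g in gradients) / n
    m2yy = sum(g[1] * g[1] for g in gradients) / n
    m2xy = sum(g[0] * g[1] for g in gradients) / n
    return (m1x, m1y), (m2xx, m2yy, m2xy)

def beckmann_covariance(m1, m2):
    """Noncentered anisotropic Beckmann parameters over the footprint:
    mean slope = m1, slope covariance = m2 - m1 m1^T."""
    cxx = m2[0] - m1[0] * m1[0]
    cyy = m2[1] - m1[1] * m1[1]
    cxy = m2[2] - m1[0] * m1[1]
    return cxx, cyy, cxy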
In this paper, we show that applying a linear transformation, represented by a 3×3 matrix, to the direction vectors of a spherical distribution yields another spherical distribution, for which we derive a closed-form expression. With this idea, we can use any spherical distribution as a base shape to create a new family of spherical distributions with parametric roughness, elliptic anisotropy, and skewness. If the original distribution has an analytic expression, normalization, integration over spherical polygons, and importance sampling, then these properties are inherited by the linearly transformed distributions. By choosing a clamped cosine for the original distribution we obtain a family of distributions, which we call Linearly Transformed Cosines (LTCs), that provide a good approximation to physically based BRDFs and that can be analytically integrated over arbitrary spherical polygons. We show how to use these properties in a real-time polygonal-light shading application. Our technique is robust, fast, accurate, and simple to implement.
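The closed-form expression for a linearly transformed distribution follows from the change of variables on the sphere: with base distribution Do and transform M, D(w) = Do(M⁻¹w / ‖M⁻¹w‖) · |det M⁻¹| / ‖M⁻¹w‖³. A minimal evaluation sketch, assuming the caller supplies the precomputed inverse matrix and its determinant (names are ours):

```python
import math

def clamped_cosine(w):
    """Base distribution Do: clamped cosine on the upper hemisphere,
    normalized so it integrates to 1 over the sphere."""
    return max(w[2], 0.0) / math.pi

def ltc_eval(Minv, det_Minv, w):
    """Density of the linearly transformed cosine at unit direction w:
    D(w) = Do(Minv w / |Minv w|) * |det Minv| / |Minv w|^3."""
    wo = [sum(Minv[i][j] * w[j] for j in range(3)) for i in range(3)]
    l = math.sqrt(sum(c * c for c in wo))
    wo = [c / l for c in wo]
    return clamped_cosine(wo) * abs(det_Minv) / (l ** 3)
```

With M = I the density reduces to the clamped cosine itself; roughness, anisotropy, and skewness come from scaling and shearing the columns of M.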
Fig. 1 (panel titles: real-time analytic spherical-cap integration; joint MIS (ours), 64 spp; MIS (previous), 64 spp; offline joint BRDF/spherical-cap sampling). Real-time application (left): we approximate BRDFs with our distributions and shade with sphere lights in real time using their analytic spherical-cap integral. The scene is rendered at 1080p and runs at 60 fps on an NVIDIA 980 GTX. Offline application (right): we compute the reference image with the exact BRDFs using importance sampling techniques. We use our distribution as a proxy for the BRDF and generate better samples that are distributed jointly inside the lights and close to the BRDFs. This joint sampling scheme is unbiased and has lower variance than multiple importance sampling with separate BRDF and light sampling.

Abstract: We introduce a novel parameterization for spherical distributions that is based on a point located inside the sphere, which we call a pivot. The pivot serves as the center of a straight-line projection that maps solid angles onto the opposite side of the sphere. By transforming spherical distributions in this way, we derive novel parametric spherical distributions that can be evaluated and importance-sampled from the original distributions using simple, closed-form expressions. Moreover, we prove that if the original distribution can be sampled and/or integrated over a spherical cap, then so can the transformed distribution. We exploit the properties of our parameterization to derive efficient spherical lighting techniques for both real-time and offline rendering. Our techniques are robust, fast, easy to implement, and achieve quality that is superior to previous work.
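The straight-line projection through a pivot can be written in closed form: a unit direction w maps to the second intersection of the line through w and the pivot p (with ‖p‖ < 1) with the unit sphere, and the map is its own inverse. A minimal geometric sketch (the function name is ours):

```python
def pivot_project(w, p):
    """Straight-line projection through a pivot p strictly inside the
    unit sphere: map unit direction w to the second intersection of
    the line through w and p with the sphere. Solving |w + t(p-w)| = 1
    gives t = -2 w.(p-w) / |p-w|^2 for the non-trivial root; the map
    is an involution (applying it twice returns w)."""
    d = [p[i] - w[i] for i in range(3)]
    dd = sum(c * c for c in d)
    t = -2.0 * sum(w[i] * d[i] for i in range(3)) / dd
    return [w[i] + t * d[i] for i in range(3)]
```

Because the projection is closed-form and invertible, evaluating or sampling the transformed distribution reduces to evaluating or sampling the original one at the projected direction, which is what the abstract's "simple, closed-form expressions" refer to.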
Physically based shading is transforming the way we approach production rendering, and simplifying the lives of artists in the process. By adhering to physically based, energy-conserving models, one can easily create realistic materials that maintain their properties under a variety of lighting conditions. In contrast, traditional ad hoc models have required extensive tweaking to achieve the same result. Building upon previous incarnations of the course, we present further research and practical advice on the subject, from film and game production.
We introduce a novel fitting procedure that takes as input an arbitrary material, possibly anisotropic, and automatically converts it to a microfacet BRDF. Our algorithm is based on the property that the distribution of microfacets may be retrieved by solving an eigenvector problem that is built solely from backscattering samples. We show that the eigenvector associated with the largest eigenvalue is always the only solution to this problem, and compute it using the power iteration method. This approach is straightforward to implement, much faster to compute, and considerably more robust than solutions based on nonlinear optimization. In addition, we provide simple procedures for converting our fits into both Beckmann and GGX roughness parameters, and discuss the advantages of microfacet slope space for making our fits editable. We apply our method to measured materials from two large databases that include anisotropic materials, and demonstrate the benefits of spatially varying roughness on texture-mapped geometric models.
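The power iteration method the abstract relies on is a few lines: repeatedly multiply a vector by the matrix and renormalize, and it converges to the eigenvector of the largest eigenvalue. A generic pure-Python sketch (the matrix here stands in for the one built from backscattering samples; names are ours):

```python
import math

def power_iteration(A, iters=100):
    """Dominant eigenvector of a square matrix A (nested lists) via
    power iteration: v <- A v / |A v|. Converges to the eigenvector
    of the largest-magnitude eigenvalue when it is unique."""
    n = len(A)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    return v
```

Each iteration costs one matrix-vector product, which is why this approach is so much cheaper than a nonlinear optimization over BRDF parameters.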
We derive a halfedge refinement rule for Catmull-Clark subdivision. The rule is illustrated on the left: Catmull-Clark subdivision splits each halfedge into exactly 4 new ones, independently of the face within which the subdivision operates (see highlighted halfedges). We leverage this rule in a novel GPU implementation that runs at state-of-the-art performance. For instance, the control mesh of the illustrated T-Rex production model consists of ∼11.5k faces and vertices. We compute its subdivision down to level 4, which produces ∼2.9M faces and vertices, in less than three milliseconds on an NVIDIA RTX 2080 GPU.
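The 4-to-1 halfedge split makes the growth of the refined mesh a simple counting exercise: halfedges multiply by 4 per level, and since every face is a quad after the first step, faces = halfedges / 4 from level 1 onward. A sketch of that arithmetic (the starting halfedge count below is an assumed round figure for a mostly-quad mesh of ∼11.5k faces, not a number from the paper):

```python
def refined_counts(halfedge_count, depth):
    """Mesh-element counts after `depth` Catmull-Clark steps:
    each halfedge splits into exactly 4 children, so the halfedge
    count is multiplied by 4 per level; after the first step every
    face is a quad, hence faces = halfedges / 4 for depth >= 1."""
    H = halfedge_count * 4 ** depth
    F = H // 4 if depth >= 1 else None
    return H, F

# ~11.5k quads => ~46k halfedges; level 4 yields ~2.9M faces,
# consistent with the figure the abstract quotes.
H4, F4 = refined_counts(46_000, 4)
```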