Appearance reproduction is an important aspect of 3D printing. Current color reproduction systems use halftoning methods that create colors through a spatial combination of different inks at the object's surface. This introduces a variety of artifacts to the object, especially when viewed from a closer distance. In this work, we propose an alternative color reproduction method for 3D printing. Inspired by the inherent ability of 3D printers to layer different materials on top of each other, 3D color contoning creates colors by combining inks with various thicknesses inside the object's volume. Since inks are inside the volume, our technique results in a uniform color surface with virtually invisible spatial patterns on the surface. For color prediction, we introduce a simple and highly accurate spectral model that relies on a weighted regression of spectral absorptions. We fully characterize the proposed framework by addressing a number of problems, such as material arrangement, calculation of ink concentration, and 3D dot gain. We use a custom 3D printer to fabricate and validate our results.
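The core idea of color contoning, creating color by stacking ink layers of varying thickness rather than by surface halftoning, can be sketched with a Beer-Lambert-style prediction: total spectral absorbance is the thickness-weighted sum of each ink's absorption, and light traverses the stack twice over a white substrate. This is only an illustrative simplification; the abstract's actual model is a weighted regression of spectral absorptions, and the ink spectra below are hypothetical Gaussians, not measured data.

```python
import numpy as np

wavelengths = np.linspace(400, 700, 31)  # nm, visible range sampled in 31 bands

# Hypothetical per-unit-thickness absorption spectra for two inks.
k_cyan = 2.0 * np.exp(-((wavelengths - 620) / 60.0) ** 2)
k_magenta = 2.0 * np.exp(-((wavelengths - 540) / 50.0) ** 2)

def predict_reflectance(thicknesses, absorptions, r_base=0.9):
    """Predict the spectral reflectance of a contone ink stack over a
    white substrate: absorbance adds linearly with layer thickness, and
    light passes through the stack twice (in and back out)."""
    total_absorbance = sum(t * k for t, k in zip(thicknesses, absorptions))
    return r_base * np.exp(-2.0 * total_absorbance)

# Reflectance of a stack with 0.3 units of cyan and 0.5 units of magenta.
r = predict_reflectance([0.3, 0.5], [k_cyan, k_magenta])
```

Because the inks sit inside the volume rather than in a surface pattern, varying the thicknesses continuously sweeps out colors without any spatial halftone structure.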
We extend the state of the art in scattering compensation, which is critically limited to color reproduction on planar surfaces, to arbitrary 3D shapes. Our method enables high-fidelity color texture reproduction on 3D prints by effectively compensating for internal light scattering within arbitrarily shaped objects. In addition, we propose a content-aware gamut mapping that significantly improves color reproduction for the pathological case of thin geometric features. Using a wide range of sample objects with complex textures and geometries, we demonstrate color reproduction whose fidelity is superior to that of state-of-the-art drivers for color 3D printers.
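The essence of scattering compensation can be illustrated with a fixed-point iteration: given a forward model that predicts how requested colors blur due to subsurface scattering, adjust the requested values until the prediction matches the target. The toy forward model below is a 1D box blur standing in for lateral light bleed; the abstract's actual model and solver for arbitrary 3D shapes are far more sophisticated, so treat this only as a sketch of the compensation principle.

```python
import numpy as np

def forward_scatter(x):
    """Toy scattering model: each printed value bleeds into its neighbors."""
    kernel = np.array([0.2, 0.6, 0.2])
    return np.convolve(x, kernel, mode="same")

def compensate(target, steps=200, lr=0.5):
    """Iteratively adjust the requested values x so that the predicted
    print forward_scatter(x) approaches the target texture."""
    x = target.copy()
    for _ in range(steps):
        x += lr * (target - forward_scatter(x))
        np.clip(x, 0.0, 1.0, out=x)  # stay inside the printable range
    return x

target = np.array([0.1, 0.9, 0.1, 0.9, 0.1])
x = compensate(target)
```

The clipping step hints at why gamut mapping matters: when the exact compensation would require values outside the printable range (as near thin features), the residual error cannot be driven to zero and must be handled by mapping the target into the achievable gamut.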
We propose a workflow for spectral reproduction of paintings, which captures a painting's spectral color, invariant to illumination, and reproduces it using multi-material 3D printing. We take advantage of current 3D printers' ability to combine highly concentrated inks across a large number of layers to expand the spectral gamut of a set of inks. We use a data-driven method both to predict the spectrum of a printed ink stack and to optimize the stack layout that best matches a target spectrum. This bidirectional mapping is modeled with a pair of neural networks, optimized through a problem-specific multi-objective loss function. Our loss function finds the ink layout that best balances spectral reproduction against colorimetric accuracy under a multitude of illuminants. In addition, we introduce a novel spectral vector error diffusion algorithm that combines color contoning and halftoning, solving the layout discretization and color quantization problems simultaneously, accurately, and efficiently. Our workflow outperforms state-of-the-art models for spectral prediction and layout optimization. We demonstrate reproduction of a number of real paintings and historically important pigments using our prototype implementation, which uses 10 custom inks with varying spectra and a resin-based 3D printer.
We introduce a novel ink selection method for spectral printing. The ink selection algorithm takes a spectral image and a set of inks as input, and selects the subset of those inks that yields optimal spectral reproduction. We formulate the search over this huge combinatorial space as a mixed-integer program. We show that solving this optimization in the conventional reflectance space is intractable. The main insight of this work is to solve the problem in the spectral absorbance space, where the formulation becomes linear. The proposed ink selection scales to large problems that are out of reach for previous methods. We demonstrate the effectiveness of our method in a concrete setting: lifelike reproduction of handmade paintings. For a successful spectral reproduction of high-resolution paintings, we explore their spectral absorbance estimation, efficient coreset representation, and accurate data-driven reproduction.
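The key point, that the problem linearizes in absorbance space, can be shown on a toy instance: reflectance mixes non-linearly, but absorbance (roughly, the negative log of reflectance) is to first order linear in ink concentrations, so evaluating a candidate ink subset reduces to a linear least-squares fit. The abstract's method solves large instances with mixed-integer programming; the brute-force enumeration and random absorbance spectra below are purely illustrative stand-ins.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
bands = 8
inks = rng.uniform(0.1, 1.0, size=(6, bands))  # 6 hypothetical ink absorbance spectra
target = 0.4 * inks[1] + 0.3 * inks[4]         # target absorbance spectrum

def best_subset(inks, target, k):
    """Pick the k-ink subset whose linear combination (in absorbance
    space) best fits the target, by exhaustive enumeration."""
    best, best_err = None, np.inf
    for subset in itertools.combinations(range(len(inks)), k):
        a = inks[list(subset)].T                       # (bands, k) design matrix
        c = np.linalg.lstsq(a, target, rcond=None)[0]  # per-ink concentrations
        err = np.linalg.norm(a @ c - target)
        if err < best_err:
            best, best_err = subset, err
    return best, best_err

subset, err = best_subset(inks, target, k=2)
```

Brute force already costs C(n, k) least-squares solves, which explodes for realistic ink libraries; this is the combinatorial blow-up that motivates the mixed-integer formulation.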