This paper presents models for graphite pencil, drawing paper, blenders, and kneaded eraser that produce realistic-looking pencil marks, textures, and tones. Our models are based on observations of how lead pencils interact with drawing paper, and on the absorptive and dispersive properties of blenders and erasers interacting with lead material deposited on drawing paper. The models consider parameters such as the particle composition of the lead, the texture of the paper, the position and shape of the pencil materials, and the pressure applied to them. We demonstrate the capabilities of our approach with a variety of images and compare them to digitized pencil drawings. We also present image-based rendering results implementing traditional graphite pencil tone-rendering methods.
Researchers in non-photorealistic rendering have investigated the display of three-dimensional worlds using various display models. In particular, recent work has focused on the modeling of traditional artistic media and styles such as pen-and-ink illustration and watercolor painting. By providing 3D rendering systems that use these alternative display models, users can generate traditional illustration renderings of their three-dimensional worlds. In this paper we present our graphite pencil 3D renderer. We have broken the problem of simulating pencil drawing into four fundamental parts: (1) simulating the drawing materials (graphite pencil, drawing paper, blenders, and kneaded eraser), (2) modeling the drawing primitives (individual pencil strokes and mark-making to create tones and textures), (3) simulating the basic rendering techniques used by artists and illustrators familiar with pencil rendering, and (4) modeling the control of the drawing composition. Each part builds upon the others and is essential to developing the framework for higher-level rendering methods and tools. In this paper we present parts 2, 3, and 4 of our research. We present non-photorealistic graphite pencil rendering methods for outlining and shading. We also present the control of drawing steps from preparatory sketches to finished rendering results. We demonstrate the capabilities of our approach with a variety of images generated from 3D models.
We present an approach that uses evolutionary learning of behavior to improve the testing of commercial computer games. After identifying unwanted results or behavior of the game, we propose to develop measures of how near a sequence of game states comes to the unwanted behavior, and to use these measures within the fitness function of a genetic algorithm (GA) working on action sequences. This makes it possible to find action sequences that produce the unwanted behavior, if any exist. Our experimental evaluation of the method with the FIFA-99 game, using the scoring of a goal as the unwanted behavior, shows that the method is able to find such action sequences, allowing for easy reproduction of critical situations and improvements to the tested game.
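As a rough illustration, the search described above can be sketched as a simple GA over fixed-length action sequences. The toy "game" below (a ball moved along a line, with reaching a goal position as the unwanted behavior), the action set, and all function names are illustrative assumptions standing in for the real game and its state measures, not the paper's FIFA-99 setup.

```python
import random

# Toy stand-in for a game: an action sequence moves a "ball" along a line;
# the unwanted behavior is the ball reaching position >= GOAL.
GOAL = 10
ACTIONS = [-1, 0, 1, 2]   # possible per-step moves (assumed, for illustration)
SEQ_LEN = 12

def play(seq):
    """Return the trajectory of game states produced by an action sequence."""
    pos, states = 0, []
    for a in seq:
        pos = max(0, pos + a)
        states.append(pos)
    return states

def fitness(seq):
    """Measure how near the state trajectory comes to the unwanted behavior:
    the smaller the remaining distance to GOAL, the fitter the sequence."""
    return -min(GOAL - s for s in play(seq))   # >= 0 once GOAL is reached

def evolve(pop_size=40, generations=60, mut_rate=0.2, seed=1):
    """Simple GA: truncation selection, one-point crossover, per-gene mutation."""
    rng = random.Random(seed)
    pop = [[rng.choice(ACTIONS) for _ in range(SEQ_LEN)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) >= 0:               # unwanted behavior reproduced
            return pop[0]
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, SEQ_LEN)    # one-point crossover
            child = a[:cut] + b[cut:]
            child = [rng.choice(ACTIONS) if rng.random() < mut_rate else g
                     for g in child]           # per-gene mutation
            children.append(child)
        pop = parents + children
    return pop[0]
```

The returned sequence can then be replayed directly in the game, which is what makes the critical situation easy to reproduce for the testers.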
Half-toning is the process by which gray-scale images are approximated with sets of black and white pixels. The process works because our eyes perceive a local average; half-toning therefore seeks to reproduce that local average, ideally without introducing 'undesirable' artifacts. In many situations, however, a stylized display of images is desired, and this stylization is often accomplished by the addition of semi-structured artifacts. In current practice the designer processes the image using tools provided by an image processing package; the resulting image is then half-toned and printed. Half-toning these processed images can reduce the visual impact of the special effects that have been introduced into the image. In this paper we show that the processes of controlled artifact introduction and half-toning can successfully be combined. By combining these two processes we ensure that the printed image is what the designer intended. We present a brief overview of current error-diffusion half-toning techniques. We then propose several ways in which artifacts can be introduced into the image; in particular, we discuss the introduction of false edges and the alteration of the scan pattern, and illustrate these techniques with a variety of images. We conclude the paper with a discussion of these new half-toning methods for the generation of binary gray-scale textures. In addition to showing how to generate these binary gray-scale textures, we also show how they can be used to half-tone images.
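To make the error-diffusion baseline concrete, the sketch below implements standard Floyd–Steinberg error diffusion, with an optional serpentine scan as one simple example of altering the scan pattern. This is a generic textbook sketch, not the paper's specific artifact-introduction method, and the function name and image representation are assumptions.

```python
def halftone(img, serpentine=False):
    """Binarize a grayscale image (rows of ints in 0..255) to 0/255 values
    using Floyd-Steinberg error diffusion."""
    h, w = len(img), len(img[0])
    err = [[float(v) for v in row] for row in img]   # working copy with errors
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        # Serpentine scan reverses direction on odd rows (an altered scan
        # pattern); the diffusion weights are mirrored accordingly.
        rev = serpentine and y % 2 == 1
        xs = range(w - 1, -1, -1) if rev else range(w)
        d = -1 if rev else 1
        for x in xs:
            old = err[y][x]
            new = 255 if old >= 128 else 0
            out[y][x] = new
            e = old - new
            # Floyd-Steinberg weights: 7/16 ahead, 3/16, 5/16, 1/16 below.
            if 0 <= x + d < w:
                err[y][x + d] += e * 7 / 16
            if y + 1 < h:
                if 0 <= x - d < w:
                    err[y + 1][x - d] += e * 3 / 16
                err[y + 1][x] += e * 5 / 16
                if 0 <= x + d < w:
                    err[y + 1][x + d] += e * 1 / 16
    return out
```

Because the quantization error is carried forward, a flat mid-gray input comes out roughly half white and half black, preserving the local average the eye perceives.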
The display of images on binary output hardware requires a halftoning step. Conventional halftoning algorithms approximate image values independently of the image content and often introduce artificial texture that obscures fine details. The objective of this research is to adapt a halftoning technique to 3D scene information and thus to enhance the display of computer-generated 3D scenes. Our approach is based on controlling the halftoning texture through a combination of ordered dithering and error diffusion techniques. We extend our previous work and enable a user to specify the shape, scale, direction, and contrast of the halftoning texture using an external buffer. We control texture shape by constructing a dither matrix from an arbitrary image or a procedural texture. Texture direction and scale are adapted to the external information by a mapping function. Texture contrast and the accuracy of tone reproduction are varied across the image by the error diffusion process. We halftone images of 3D scenes by using the geometry, position, and illumination information to control the halftoning texture. Thus, the texture provides visual cues and can be used to enhance the viewer's comprehension of the display.
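The step of constructing a dither matrix from an arbitrary image can be sketched as follows: rank the texture's pixels by brightness and map the ranks to evenly spaced thresholds, so that thresholding reproduces every gray level while the on/off pattern follows the texture's shape. This is one standard way to build such a matrix, shown under assumed function names and without the error-diffusion contrast control described in the abstract.

```python
def dither_matrix_from_image(tex):
    """Build a threshold (dither) matrix from a grayscale texture image.

    Pixels are ranked by value; ranks are mapped to evenly spaced
    thresholds in 0..255, so dark texture pixels turn white first."""
    h, w = len(tex), len(tex[0])
    n = h * w
    order = sorted(range(n), key=lambda i: tex[i // w][i % w])
    m = [[0] * w for _ in range(h)]
    for rank, i in enumerate(order):
        m[i // w][i % w] = int((rank + 0.5) * 256 / n)
    return m

def ordered_dither(img, m):
    """Halftone img by tiling the dither matrix m as per-pixel thresholds."""
    mh, mw = len(m), len(m[0])
    return [[255 if img[y][x] >= m[y % mh][x % mw] else 0
             for x in range(len(img[0]))]
            for y in range(len(img))]
```

A procedural texture can be plugged in the same way by sampling it into a small image first; rotating or scaling the tiling coordinates before the `y % mh, x % mw` lookup is one plausible way to adapt texture direction and scale to external information.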