Surface roughness is one of the most important qualities in haptic perception. Roughness is a major identifier for judgments of material composition, comfort, and friction, and it is tied closely to manual dexterity. Past studies of roughness perception have typically focused on noncontrollable natural materials or on a narrow range of artificial materials. The advent of high-resolution three-dimensional (3D) printing technology makes it possible to fabricate arbitrary 3D textures with precise surface geometry for use in tactile studies. We used parametric modeling and 3D printing to manufacture a set of textured plates with defined element spacing, shape, and arrangement. Using active touch and two-alternative forced-choice protocols, we investigated the contributions of these surface parameters to roughness perception in human subjects. Results indicate that larger spatial periods produce higher estimations of roughness (with a Weber fraction of 0.19); that small texture elements are perceived as rougher than large texture elements of the same wavelength; that perceptual differences exist between textures with the same spacing but different arrangements; and that roughness equivalencies exist between textures differing along different parameters. We posit that papillary ridges serve as tactile processing units and that neural ensembles encode the spatial profiles of the texture contact area to produce roughness estimates. The stimuli and the manufacturing process may be used in further studies of tactile roughness perception and in related neurophysiological applications.

NEW & NOTEWORTHY Surface roughness is an integral quality of texture perception. We manufactured textures using high-resolution 3D printing, which allows precise specification of the surface spatial topography. In human psychophysical experiments we investigated the contributions of specific surface parameters to roughness perception. We found that textures with large spatial periods, small texture elements, and irregular, isotropic arrangements elicit the highest estimations of roughness. We propose that roughness correlates inversely with the total contacted surface area.
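The reported Weber fraction of 0.19 for spatial period can be read through Weber's law: the just-noticeable difference scales with the reference magnitude. The sketch below illustrates this relationship; the helper names and the use of the smaller period as the reference are our assumptions, not part of the study's analysis.

```python
# Hedged sketch: relating the reported Weber fraction (0.19) to the
# just-noticeable difference (JND) in spatial period for a 2AFC task.
# Function names and the reference convention are illustrative.

WEBER_FRACTION = 0.19  # value reported in the abstract for spatial period

def jnd(reference_period_mm: float, k: float = WEBER_FRACTION) -> float:
    """Smallest change in spatial period expected to be discriminable."""
    return k * reference_period_mm

def discriminable(period_a_mm: float, period_b_mm: float,
                  k: float = WEBER_FRACTION) -> bool:
    """Weber's law: two periods are discriminable when their difference
    exceeds k times the smaller (reference) period."""
    ref = min(period_a_mm, period_b_mm)
    return abs(period_a_mm - period_b_mm) > k * ref
```

For example, with a 1.0 mm reference spacing, period differences under roughly 0.19 mm would be expected to sit near chance in a two-alternative forced-choice trial.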
Objects containing mechanical joints are among the most commonly built. Joints implement a vocabulary of simple constrained motions (kinematic pairs) that can be combined into more complex behaviors. Defining physically correct joint geometry is crucial both for the realistic appearance of models during motion, as joints are typically the only parts of the geometry that stay in contact, and for fabrication. Direct design of joint geometry often requires more effort than the design of the rest of the object, as it involves components that must stay in precise contact, align with other parts, and allow the desired range of motion. We present an interactive system for creating physically realizable joints with user-controlled appearance. Our system minimizes or, in most cases, completely eliminates the need for the user to manipulate low-level joint geometry. This is achieved by automatically inferring a small number of plausible combinations of joint dimensions, placement, and orientation from the part geometry, with the user making the final high-level selection based on object semantics. Through user studies, we demonstrate that functional results with satisfying appearance can be obtained quickly by users with minimal modeling experience, significantly reducing the time required for joint construction compared with standard modeling approaches.
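The candidate-then-select workflow described above can be sketched in miniature: the system proposes a small discrete set of joint parameterizations and the user picks one. The `JointCandidate` fields, the discrete dimension choices, and the enumeration itself are illustrative assumptions, not the paper's inference algorithm, which derives candidates from part geometry.

```python
from dataclasses import dataclass
from itertools import product

# Hedged sketch of candidate enumeration for a hinge joint. All fields
# and value ranges here are hypothetical stand-ins; a real system would
# infer them from the part geometry rather than enumerate fixed choices.

@dataclass
class JointCandidate:
    axis: tuple           # hinge axis direction
    radius_mm: float      # pin radius
    clearance_mm: float   # tolerance gap for printability

def enumerate_candidates(radii=(2.0, 3.0, 4.0), clearances=(0.2, 0.4)):
    """Cartesian product of a few discrete dimension choices, yielding a
    small candidate set for the user's final high-level selection."""
    axes = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
    return [JointCandidate(a, r, c)
            for a, r, c in product(axes, radii, clearances)]

candidates = enumerate_candidates()
# The user then selects among these based on object semantics.
```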
Everyone uses the sense of touch to explore the world, and roughness is one of the most important qualities in tactile perception. Roughness is a major identifier for judgments of material composition, comfort, and friction, and it is tied closely to manual dexterity. The advent of high-resolution 3D printing technology provides the ability to fabricate arbitrary 3D textures with surface geometry that confers haptic properties. In this work, we address the problem of mapping object geometry to tactile roughness. We fabricate a set of carefully designed stimuli and use them in experiments with human subjects to build a perceptual space for roughness. We then match this space to a quantitative model obtained from strain fields derived from elasticity simulations of the human skin contacting the texture geometry, drawing from past research in neuroscience and psychophysics. We demonstrate how this model can be applied to predict and alter surface roughness, and we show several applications in the context of fabrication.
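A recurring idea in these abstracts is that perceived roughness correlates inversely with the total contacted surface area. The sketch below illustrates that relationship with a deliberately crude proxy: a rigid-contact height threshold in place of the paper's elasticity simulations, and an inverse-linear mapping that is our simplifying assumption.

```python
import numpy as np

# Hedged sketch: roughness falls as the contacted fraction of the surface
# rises. The height-threshold contact criterion and the 1-minus-fraction
# mapping are simplifications, not the paper's strain-field model.

def contact_fraction(heightmap: np.ndarray, depth: float) -> float:
    """Fraction of surface samples assumed to touch the skin: everything
    within `depth` of the highest point (rigid-skin approximation)."""
    top = heightmap.max()
    return float(np.mean(heightmap >= top - depth))

def roughness_proxy(heightmap: np.ndarray, depth: float = 0.1) -> float:
    """Crude roughness estimate: 1 minus contact fraction, so sparse
    contact (tall, widely spaced elements) reads as rougher."""
    return 1.0 - contact_fraction(heightmap, depth)
```

A flat plate contacts everywhere and scores 0; a surface touched only at a few tall bumps scores near 1, matching the qualitative trend that widely spaced elements feel rougher.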
Textures appear on many common objects and surfaces. Many textures combine visual and tactile aspects, each serving important purposes: a texture alters an object's appearance and tactile feel, aids visual or tactile identification, and improves usability. The tactile feel and visual appearance of objects are often linked, but they may interact in unpredictable ways. Advances in high-resolution 3D printing enable highly flexible control of geometry, permitting manipulation of both visual appearance and tactile properties. In this paper, we propose an optimization method to independently control the tactile properties and visual appearance of a texture. Our optimization is enabled by neural network-based models and allows the creation of textures with a desired tactile feeling while preserving a desired visual appearance, at relatively low computational cost, for use in a variety of applications.
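The independent control described above amounts to a two-term objective: drive a tactile property toward a target while holding a visual property near its desired value. The sketch below shows that structure with scalar stand-in property functions and a finite-difference gradient; the quadratic losses, the weighting, and the optimizer are our assumptions, not the paper's neural-network-based formulation.

```python
import numpy as np

# Hedged sketch of a two-term texture objective: match a tactile target
# while preserving a visual target. The property functions passed in are
# illustrative stand-ins for learned predictors.

def optimize(theta, tactile_fn, visual_fn, tactile_target, visual_target,
             w_visual=1.0, lr=0.05, steps=500, eps=1e-4):
    theta = np.asarray(theta, dtype=float).copy()

    def loss(t):
        return ((tactile_fn(t) - tactile_target) ** 2
                + w_visual * (visual_fn(t) - visual_target) ** 2)

    for _ in range(steps):
        # Finite-difference gradient; a real system would backpropagate
        # through the learned tactile and visual predictors.
        grad = np.zeros_like(theta)
        for i in range(theta.size):
            d = np.zeros_like(theta)
            d[i] = eps
            grad[i] = (loss(theta + d) - loss(theta - d)) / (2 * eps)
        theta -= lr * grad
    return theta
```

With toy predictors that simply read out one parameter each, the optimizer drives the parameters to the two targets; the `w_visual` weight sets how strictly appearance is preserved when the two goals conflict.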