A novel temperature sensor consisting of a single layer of metal (Ni, Pd, W, or Pt) is constructed. Its configuration challenges a long-established concept and may lead to the development of a new category of devices. Reliable two-dimensional mapping of local temperatures is demonstrated using an array of these sensors. These single-metal thermocouples (SMTCs) can be readily applied on flexible substrates or at high temperatures.
Figure 1: Our robot-based, Poisson-guided autoscanner can progressively, adaptively, and fully automatically generate complete, high-quality, and high-fidelity scan models.
We present a quality-driven, Poisson-guided autonomous scanning method. Unlike previous scan-planning techniques, we do not aim to minimize the number of scans needed to cover the object's surface, but rather to ensure high-quality scanning of the model. This goal is achieved by placing the scanner at strategically selected Next-Best-Views (NBVs) to progressively capture the geometric details of the object, until both completeness and high fidelity are reached. The technique is based on the analysis of a Poisson field and its geometric relation with an input scan. We generate a confidence map that reflects the quality/fidelity of the estimated Poisson iso-surface. The confidence map guides the generation of a viewing vector field, which is then used for computing a set of NBVs. We applied the algorithm on two different robotic platforms, a PR2 mobile robot and a one-arm industrial robot. We demonstrate the advantages of our method through a number of autonomous high-quality scans of complex physical objects, as well as performance comparisons against state-of-the-art methods.
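The confidence-map-to-NBV step described above can be illustrated with a toy sketch. This is not the paper's implementation; it only shows the general idea under simplifying assumptions: given per-point confidence values on the current iso-surface, candidate views are placed along the outward normals of the lowest-confidence regions, looking back at them. The function name, the fixed standoff distance, and the normal-aligned placement are all hypothetical choices for illustration.

```python
import numpy as np

def next_best_views(points, normals, confidence, n_views=3, standoff=0.5):
    """Return candidate scanner positions and view directions.

    points     : (N, 3) samples of the current reconstructed iso-surface
    normals    : (N, 3) outward unit normals at those samples
    confidence : (N,)   estimated reconstruction quality in [0, 1]
    """
    # Target the least-confident surface regions first.
    order = np.argsort(confidence)[:n_views]
    # Place each candidate view at a fixed standoff along the outward
    # normal, looking back toward its low-confidence surface point.
    positions = points[order] + standoff * normals[order]
    directions = -normals[order]
    return positions, directions
```

A real system would additionally check reachability and visibility for each candidate before sending the robot, which this sketch omits.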
In this paper a new technique is introduced for automatically building recognisable moving 3D models of individual people. A set of multi-view colour images of a person is captured from the front, side, and back using one or more cameras. Model-based reconstruction of shape from silhouettes is used to transform a standard generic 3D humanoid model to approximate the person's shape and anatomical structure. Realistic appearance is achieved by colour texture mapping from the multi-view images. Results demonstrate the reconstruction of a realistic 3D facsimile of the person suitable for animation in a virtual world. The system is low-cost and is reliable across large variations in shape, size, and clothing. This is the first approach to achieve realistic model capture for clothed people and automatic reconstruction of animated models. A commercial system based on this approach has recently been used to capture thousands of models of the general public.