We present a digital signal processing technique that reduces the speckle content in reconstructed digital holograms. The method is based on sequential sampling of the discrete Fourier transform of the reconstructed image field. Speckle reduction is achieved at the expense of a reduced intensity and resolution, but this tradeoff is shown to be greatly superior to that imposed by the traditional mean and median filtering techniques. In particular, we show that the speckle can be reduced by half with no loss of resolution (according to standard definitions of both metrics).
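The abstract's core idea, trading spatial-frequency content for intensity averaging, can be sketched as follows. This is a minimal illustration, assuming the spectrum is partitioned into disjoint vertical bands whose reconstructions are intensity-averaged; the windowing layout, band count, and function name are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def speckle_reduce(field, n_windows=4):
    """Hedged sketch: average the intensities reconstructed from
    disjoint bands of the complex field's Fourier spectrum.
    Each band yields an independent speckle pattern; averaging
    their intensities lowers speckle contrast."""
    F = np.fft.fftshift(np.fft.fft2(field))
    h, w = F.shape
    band = w // n_windows
    intensity = np.zeros((h, w))
    for k in range(n_windows):
        # Keep only one spectral band, zero out the rest
        masked = np.zeros_like(F)
        masked[:, k * band:(k + 1) * band] = F[:, k * band:(k + 1) * band]
        sub = np.fft.ifft2(np.fft.ifftshift(masked))
        intensity += np.abs(sub) ** 2
    return intensity / n_windows
```

Each band spans only part of the spectrum, which is where the intensity/resolution tradeoff mentioned in the abstract comes from.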
We present a method for real-time bare hand tracking that utilizes natural hand synergies to reduce the complexity and improve the plausibility of the hand posture estimation. The hand pose and posture are estimated by fitting a virtual hand model to the 3D point cloud obtained from a Kinect camera using an inverse kinematics approach. We use real human hand movements captured with a Vicon motion tracking system as the ground truth for deriving natural hand synergies based on principal component analysis. These synergies are integrated in the tracking scheme by optimizing the posture in a reduced parameter space. Tracking in this reduced space, combined with joint limit avoidance, constrains the posture estimation to natural hand articulations. The information loss associated with dimension reduction can be dealt with by employing a hierarchical optimization scheme. We show that our synergistic hand tracking approach improves runtime performance and increases the quality of the posture estimation.

I. INTRODUCTION

Tracking the complete articulation of a freely moving hand is an ongoing research topic with numerous applications in robotics. Many existing hand tracking solutions either require the user to wear cumbersome equipment, are expensive, or are inadequate for real-time tracking. Some methods that use consumer-level depth sensors can detect the positions of individual fingers and provide a means for rough gesture interaction, but do not accurately reconstruct the user's hand posture with full degrees of freedom (DoFs). We have built a hand tracking system that uses a Kinect camera to estimate the full articulation of a user's bare hand in real time. Our method is a generative approach based on fitting a virtual hand model to the 3D point cloud obtained from the Kinect sensor's depth camera.
We estimate the hand articulation by finding the pose and posture parameters that minimize the error between the observed point cloud and the model surface using inverse kinematics. In doing so, we find the deformation of the 3D hand model that best approximates the observed state of the user's hand.

A prevalent issue in tracking a highly articulated object like a hand is the number of DoFs that must be optimized. The analysis of hand synergies aims to identify high-level relationships in hand articulation in order to sensibly reduce the dimensionality of hand posture representations. We obtain such hand synergies through the principal component analysis of motion capture data and use them directly in the tracking process to reduce the parameter space and to naturally constrain the hand posture estimation. This paper extends our recent workshop paper [1] in several aspects: First, we speed up the inverse kinematics
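The synergy-extraction step described above can be sketched as a plain PCA on recorded joint-angle vectors, followed by a linear map between the reduced weight space and the full posture space. This is a minimal sketch under stated assumptions: joint angles are stacked as rows of a matrix, and the function names and synergy count are illustrative, not taken from the paper.

```python
import numpy as np

def fit_synergies(postures, n_synergies=5):
    """PCA on joint-angle data (rows: captured frames, cols: DoFs).
    Returns the mean posture and the leading principal components
    (the 'synergies') as rows of a matrix."""
    mean = postures.mean(axis=0)
    centered = postures - mean
    # Rows of vt are the principal directions, ordered by variance
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_synergies]

def to_full_posture(weights, mean, synergies):
    """Map a low-dimensional synergy weight vector back to a
    full joint-angle vector."""
    return mean + weights @ synergies
```

During tracking, the optimizer would then search over the low-dimensional `weights` instead of all joint angles, which is what keeps the estimated postures close to natural hand articulations.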
Grasping and manual interaction for robots has so far largely been approached with an emphasis on physics and control aspects. Given the richness of human manual interaction, we argue for the consideration of the wider field of "manual intelligence" as a perspective for manual action research that brings the cognitive nature of human manual skills to the foreground. We briefly sketch part of a research agenda along these lines, argue for the creation of a manual interaction database as an important cornerstone of such an agenda, and describe the manual interaction lab recently set up at CITEC to realize this goal and to connect the efforts of robotics and cognitive science researchers towards a more integrated understanding of manual intelligence.

From Robots to Manual Intelligence

Progress in mechatronics, sensing and control has made sophisticated robot hands possible whose potential for dexterous operation is at least beginning to approach the superb performance of human hands [1][2][3]. The increasing availability of these hands, together with sophisticated, physics-based simulation software, has spurred a revival of the field of anthropomorphic hand control in robotics, whose ultimate goal is to replicate the abilities of human hands to handle everyday objects in flexible ways and in unprepared environments. The authors are cooperating within the Bielefeld Excellence Cluster Cognitive Interaction Technology (CITEC) and the Bielefeld Institute for Cognition and Robotics (CoR-Lab).
How much information with regard to identity and further individual participant characteristics is revealed by relatively short spatio-temporal motion trajectories of a person? We study this question by selecting a set of individual participant characteristics and analysing motion-captured trajectories of an exemplary class of familiar movements, namely handover of an object to another person. The experiment is performed with different participants under different, predefined conditions. A selection of participant characteristics, such as the Big Five personality traits, gender, weight, or sportiness, is assessed, and we analyse the impact of the three factor groups “participant identity”, “participant characteristics”, and “experimental conditions” on the observed hand trajectories. The participants’ movements are recorded via optical marker-based hand motion capture. One participant, the giver, hands over an object to the receiver. The resulting time courses of three-dimensional positions of markers are analysed. Multidimensional scaling is used to project trajectories to points in a dimension-reduced feature space. Supervised learning is also applied. We find that “participant identity” seems to have the highest correlation with the trajectories, with factor group “experimental conditions” ranking second. On the other hand, it is not possible to find a correlation between the “participant characteristics” and the hand trajectory features.
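The multidimensional scaling step can be illustrated with the classical (Torgerson) variant, which embeds items from a pairwise distance matrix. This is a generic sketch: the abstract does not specify which MDS variant or trajectory distance the authors used, so the distance matrix below stands in for whatever trajectory dissimilarity they computed.

```python
import numpy as np

def classical_mds(D, n_dims=2):
    """Classical MDS: embed n items in n_dims dimensions from an
    n x n matrix D of pairwise distances."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)           # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_dims]    # keep the largest ones
    # Coordinates: eigenvectors scaled by sqrt of (non-negative) eigenvalues
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))
```

Each trajectory then becomes a point in the low-dimensional space, where clustering by giver identity or experimental condition can be inspected directly.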