In this work, we present an active tactile perception approach for contour following based on a probabilistic framework. Tactile data were collected using a biomimetic fingertip sensor. We propose a control architecture that implements a perception-action cycle for the exploratory procedure, which allows the fingertip to react to tactile contact whilst regulating the applied contact force. In addition, the fingertip is actively repositioned to an optimal position to ensure accurate perception. The method is trained off-line and then tested on-line by contour following around several different test shapes. We then implement object recognition based on the extracted shapes. Our active approach is compared with a passive approach, demonstrating that active perception is necessary for successful contour following and hence for shape recognition.
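The force-regulation part of the perception-action cycle described above can be sketched as a simple proportional controller on indentation depth. This is an illustrative sketch only: the gain, target force, and linear contact model below are assumptions, not the controller used in the work.

```python
def regulate_contact(depth, measure_force, f_target=1.0, k_p=0.3, steps=20):
    """Hold a target contact force by adjusting indentation depth.

    A proportional-control sketch of 'regulating the applied contact
    force': press deeper when the measured force is too light, retract
    when it is too hard. `measure_force` maps depth to sensed force.
    """
    for _ in range(steps):
        f = measure_force(depth)
        depth += k_p * (f_target - f)  # P-control on the force error
    return depth
```

For example, with a linear contact model `f = 2 * depth`, the loop settles at the depth where the sensed force equals `f_target`.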
Motivated by the impact of superresolution methods for imaging, we undertake a detailed and systematic analysis of localization acuity for a biomimetic fingertip and a flat region of tactile skin. We identify three key factors underlying superresolution that enable the perceptual acuity to surpass the sensor resolution: 1) the sensor is constructed with multiple overlapping, broad but sensitive receptive fields; 2) the tactile perception method interpolates between receptors (taxels) to attain subtaxel acuity; and 3) active perception ensures robustness to unknown initial contact location. All factors follow from active Bayesian perception applied to biomimetic tactile sensors with an elastomeric covering that spreads the contact over multiple taxels. In consequence, we attain extreme superresolution with a 35-fold improvement of localization acuity (0.12 mm) over sensor resolution (4 mm). We envisage that these principles will enable cheap high-acuity tactile sensors that are highly customizable to suit their robotic use. Practical applications encompass any scenario where an end-effector must be placed accurately via the sense of touch.
Index Terms: Biomimetics, force and tactile sensing.
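The interpolation principle behind superresolution can be sketched as follows: taxels on a coarse grid have broad, overlapping receptive fields, and candidate contact locations are classed on a much finer grid, so the maximum a posteriori estimate can fall between taxels. The receptive-field width, noise level, and grid spacings below are illustrative assumptions, not the sensor's measured parameters.

```python
import numpy as np

taxel_pos = np.arange(0.0, 20.0, 4.0)   # taxel centres on a 4 mm grid
loc_grid = np.arange(0.0, 16.01, 0.1)   # candidate contact locations, 0.1 mm grid
sigma_rf = 3.0                          # receptive-field width, mm (assumed)

def expected_response(contact_mm):
    """Broad, overlapping Gaussian receptive fields spread the contact
    over several taxels -- the key factor enabling superresolution."""
    return np.exp(-0.5 * ((taxel_pos - contact_mm) / sigma_rf) ** 2)

def localize(readings, noise=0.05):
    """MAP contact location under i.i.d. Gaussian measurement noise.
    The estimate lives on the fine grid, giving subtaxel acuity."""
    log_lik = np.array([
        -0.5 * np.sum((readings - expected_response(x)) ** 2) / noise**2
        for x in loc_grid
    ])
    return loc_grid[np.argmax(log_lik)]
```

A contact at, say, 6.3 mm lies between the taxels at 4 mm and 8 mm, yet is recovered to within the fine-grid spacing because several taxels respond to it.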
Please cite this article as: U. Martinez-Hernandez, et al., Active sensorimotor control for tactile exploration, Robotics and Autonomous Systems (2016), http://dx.doi.org/10.1016/j.robot.2016.09.014
Active sensorimotor control for tactile exploration
Abstract: In this paper, we present a novel and robust Bayesian approach for autonomous active exploration of unknown objects using tactile perception and sensorimotor control. Despite recent advances in tactile sensing, robust active exploration remains a challenging problem, which is a major hurdle to the practical deployment of tactile sensors in robots. Our proposed approach is based on a Bayesian perception method that actively controls the sensor with small local repositioning movements to reduce perception uncertainty, followed by explorative movements based on the outcome of each perceptual decision-making step. Two sensorimotor control strategies are proposed for improving the accuracy and speed of the active exploration, which weight the evidence from previous exploratory steps through either a weighted prior or a weighted posterior. The methods are validated both off-line and in real-time on a contour following exploratory procedure. Results clearly demonstrate improvements in both accuracy and exploration time when using the proposed active methods compared to passive perception. Our work demonstrates that active perception has the potential to enable robots to perform robust autonomous tactile exploration in natural environments.
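One way to sketch the weighted-prior strategy is to mix the previous exploratory step's posterior with a uniform distribution before each Bayesian update, so that evidence carries over between steps but stale beliefs decay. The mixing weight and the toy numbers in the usage example are illustrative assumptions, not the tuned values from the paper.

```python
import numpy as np

def weighted_prior_update(posterior_prev, likelihood_col, w=0.5):
    """One perceptual step of a 'weighted prior' strategy (a sketch).

    posterior_prev: belief over classes from the previous exploratory step
    likelihood_col: P(z | c) for the observation z made at this step
    w in [0, 1]:    how much past evidence carries over (w=0 discards it)
    """
    n = posterior_prev.size
    prior = w * posterior_prev + (1.0 - w) * np.full(n, 1.0 / n)
    posterior = prior * likelihood_col   # Bayes: P(c | z) proportional to P(z | c) P(c)
    return posterior / posterior.sum()
```

With `w` near 1 the sensor trusts its history and converges faster along a smooth contour; with `w` near 0 each step starts afresh, as in passive perception.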
Abstract-In this paper, we propose that active Bayesian perception has a general role for Simultaneous Object Localization and IDentification (SOLID), or deciding where and what. We test this claim using a biomimetic fingertip to perceive object identity via surface shape at uncertain contact locations. Our method for active Bayesian perception combines decision making by threshold crossing of the posterior belief with a sensorimotor loop that actively controls sensor location based on those beliefs. Our findings include: (i) active perception with a fixation control strategy gives an order-of-magnitude improvement in acuity over passive perception without sensorimotor feedback; (ii) perceptual acuity improves as the active control requires less belief to make a relocation decision; and (iii) relocation noise further improves acuity. The best method has aspects that resemble animal perception, supporting wide applicability of these findings.
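The decision-making core of this method, threshold crossing of the posterior belief, can be sketched as a recursive Bayesian update that accumulates tap observations until the maximum posterior exceeds a confidence threshold. The likelihood values and threshold in the usage example are illustrative assumptions; the sensorimotor relocation loop is indicated only by a comment.

```python
import numpy as np

def threshold_decision(likelihoods, observe, threshold=0.99, max_taps=50):
    """Accumulate evidence over taps until the belief crosses a threshold.

    likelihoods: (n_classes, n_outcomes) array of P(z | c)
    observe:     callable returning an outcome index for each tap
    Returns the decided class and the number of taps taken.
    """
    n_classes = likelihoods.shape[0]
    posterior = np.full(n_classes, 1.0 / n_classes)  # uniform prior
    for tap in range(1, max_taps + 1):
        z = observe()
        posterior *= likelihoods[:, z]   # Bayes update (unnormalised)
        posterior /= posterior.sum()
        # (the full active method would also relocate the sensor here,
        #  e.g. via a fixation strategy driven by the location belief)
        if posterior.max() > threshold:
            return int(posterior.argmax()), tap
    return int(posterior.argmax()), max_taps
```

Lowering the relocation threshold in the full method corresponds to acting on less belief, which the paper finds improves perceptual acuity.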
In this paper, we propose that active perception will help attain autonomous robotics in unstructured environments by giving robust perception. We test this claim with a biomimetic fingertip that senses surface texture under a range of contact depths. We compare the performance of passive Bayesian perception with a novel approach for active perception that includes a sensorimotor loop for controlling sensor position. Passive perception at a single depth gave poor results, with just 0.2 mm uncertainty impairing performance. Extending passive perception over a range of depths gave non-robust performance. Only active perception could give robust, accurate performance, with the sensorimotor feedback compensating the position uncertainty. We expect that these results will extend to other stimuli, so that active perception will offer a general approach to robust perception in unstructured environments.
One of the defining characteristics of human cognition is our outstanding capacity to cooperate. A central requirement for cooperation is the ability to establish a "shared plan", which defines the interlaced actions of the two cooperating agents, in real time, and even to negotiate this shared plan during its execution. In the current research we identify the requirements for cooperation, extending our earlier work in this area. These requirements include the ability to negotiate a shared plan using spoken language, to learn new component actions within that plan based on visual observation and kinesthetic demonstration, and finally to coordinate all of these functions in real time. We present a cognitive system that implements these requirements, and demonstrate the system's ability to allow a Nao humanoid robot to learn a non-trivial cooperative task in real time. We further provide a concrete demonstration of how the real-time learning capability can be easily deployed on a different platform, in this case the iCub humanoid. The results are considered in the context of how the development of language in the human infant provides a powerful lever in the development of cooperative plans from lower-level sensorimotor capabilities.
Index Terms: cooperation, humanoid robot, spoken language interaction, shared plans, situated and social learning.
In this paper, a novel approach for the recognition of walking activities and gait events with wearable sensors is presented. This approach, called the adaptive Bayesian inference system (BasIS), uses a probabilistic formulation with a sequential analysis method for the recognition of walking activities performed by participants. Recognition of gait events, needed to identify the state of the human body during the walking activity, is also provided by the proposed method. In addition, the BasIS system includes an adaptive action-perception method for the prediction of gait events. The adaptive approach uses the knowledge gained from decisions made over time by the inference system. The action-perception method allows the BasIS system to autonomously adapt its performance, based on the evaluation of its own predictions and decisions made over time. The proposed approach is implemented in a layered architecture and validated with the recognition of three walking activities: level-ground, ramp ascent and ramp descent. The validation process employs real data from three inertial measurement units attached to the thigh, shank and foot of participants while performing walking activities. The experiments show that mean decision times of 240 ms and 40 ms are needed to achieve mean accuracies of 99.87% and 99.82% for the recognition of walking activities and gait events, respectively. The validation experiments also show that performance, in accuracy and speed, is not significantly affected when noise is added to sensor measurements. These results show that the proposed adaptive recognition system is accurate, fast and robust to sensor noise, and also capable of adapting its own performance over time. Overall, the adaptive BasIS system proves to be a robust and suitable computational approach for the intelligent recognition of activities of daily living using wearable sensors.
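The sequential-analysis idea behind this kind of recognizer can be sketched as a Bayesian belief over activity classes that is updated sample by sample until it crosses a confidence threshold, at which point the activity and the decision time are reported. The likelihood table, threshold, and 10 ms sample period below are illustrative assumptions, not the BasIS system's actual model.

```python
import numpy as np

def recognize_activity(likelihoods, samples, threshold=0.999, dt_ms=10):
    """Sequential Bayesian recognition of a walking activity (a sketch).

    likelihoods: (n_activities, n_feature_bins) array of P(z | activity)
    samples:     sequence of discretised sensor observations
    Returns the decided activity and the decision time in milliseconds.
    """
    n = likelihoods.shape[0]
    belief = np.full(n, 1.0 / n)            # uniform prior over activities
    for t, z in enumerate(samples, start=1):
        belief *= likelihoods[:, z]         # accumulate evidence
        belief /= belief.sum()
        if belief.max() > threshold:
            return int(belief.argmax()), t * dt_ms
    return int(belief.argmax()), len(samples) * dt_ms
```

Raising the threshold trades decision time for accuracy, which is the trade-off the reported mean decision times of 240 ms and 40 ms reflect.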
Embodied hyperacuity from Bayesian perception: Shape and position discrimination with an iCub fingertip sensor
Nathan F. Lepora, Uriel Martinez-Hernandez, Hector Barron-Gonzalez, Mat Evans, Giorgio Metta, Tony J. Prescott
Abstract: Recent advances in modeling animal perception have motivated an approach of Bayesian perception applied to biomimetic robots. This study presents an initial application of Bayesian perception to an iCub fingertip sensor mounted on a dedicated positioning robot. We systematically probed the test system with five cylindrical stimuli offset by a range of positions relative to the fingertip. Testing the real-time speed and accuracy of shape and position discrimination, we achieved sub-millimeter accuracy with just a few taps. This result is apparently the first explicit demonstration of perceptual hyperacuity in robot touch, in that object positions are perceived more accurately than the taxel spacing. We also found substantial performance gains when the fingertip can reposition itself to avoid poor perceptual locations, which indicates that improved robot perception could mimic active perception in animals.