Abstract: We describe the software components of a robotics system designed to autonomously grasp objects and perform dexterous manipulation tasks with only high-level supervision. The system is centered on the tight integration of several core functionalities, including perception, planning, and control, with the logical structuring of tasks driven by a Behavior Tree architecture. This implementation reduces execution time while integrating advanced algorithms for autonomous manipulation. We describe our approach to 3-D perception, real-time planning, force-compliant motions, and audio processing. We present performance results for object grasping and complex manipulation tasks from both in-house tests and an independent evaluation team.
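As an aside on the Behavior Tree architecture mentioned above, the following is a minimal sketch in Python assuming simple Sequence and Fallback composites; the leaf names (LocalizeObject, PlanGrasp, ExecuteGrasp, RecoverAndRetry) are hypothetical and illustrate the general control-flow pattern, not this system's actual task tree.

# Minimal behavior-tree sketch: Sequence runs children in order until one
# fails; Fallback tries children in order until one succeeds. Node names
# are hypothetical examples, not the paper's task structure.
from enum import Enum

class Status(Enum):
    SUCCESS = 0
    FAILURE = 1
    RUNNING = 2

class Sequence:
    """Ticks children in order; stops at the first non-SUCCESS child."""
    def __init__(self, children):
        self.children = children

    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.SUCCESS:
                return status
        return Status.SUCCESS

class Fallback:
    """Ticks children in order until one succeeds or is still running."""
    def __init__(self, children):
        self.children = children

    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.FAILURE:
                return status
        return Status.FAILURE

class Action:
    """Leaf node wrapping a callable that returns a Status."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def tick(self):
        return self.fn()

# Hypothetical grasping task: perceive, plan, then execute with a retry.
grasp_task = Sequence([
    Action("LocalizeObject", lambda: Status.SUCCESS),
    Action("PlanGrasp", lambda: Status.SUCCESS),
    Fallback([
        Action("ExecuteGrasp", lambda: Status.SUCCESS),
        Action("RecoverAndRetry", lambda: Status.RUNNING),
    ]),
])

print(grasp_task.tick())  # Status.SUCCESS once all stages succeed

A tree like this re-ticks at a fixed rate, so perception, planning, and control components stay decoupled while the tree enforces the task logic.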
We have developed the CHIMP (CMU Highly Intelligent Mobile Platform) robot as a platform for executing complex tasks in dangerous, degraded, human-engineered environments. CHIMP has a near-human form factor, work envelope, strength, and dexterity to work effectively in these environments. It avoids the need for complex control by maintaining static rather than dynamic stability. Using various sensors embedded in its head, CHIMP generates full three-dimensional representations of its environment and transmits these models to a human operator to achieve latency-free situational awareness. This awareness is used to visualize the robot within its environment and to preview candidate free-space motions. Operators using CHIMP can select between task, workspace, and joint-space control modes to trade between speed and generality, and are thus able to perform remote tasks quickly, confidently, and reliably, owing to the overall design of the robot and software. CHIMP's hardware was designed, built, and tested over the 15 months leading up to the DARPA Robotics Challenge (DRC). The software was developed in parallel using surrogate hardware and simulation tools. Over a six-week span prior to the DRC Trials, the software was ported to the robot, the system was debugged, and the tasks were practiced continuously. Given the aggressive schedule leading to the DRC Trials, development of CHIMP focused primarily on manipulation tasks. Nonetheless, our team finished 3rd out of 16. With an upcoming year to develop new software for CHIMP, we look forward to improving the robot's capability and increasing its speed to compete in the DRC Finals.
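To illustrate the speed-versus-generality trade-off among the three control modes, here is a minimal dispatch sketch; the Robot stub and its method names are hypothetical stand-ins, not CHIMP's actual interface.

# Hypothetical sketch of routing an operator command by control mode.
# TASK is fastest but least general; JOINT is slowest but most general.
from enum import Enum, auto

class ControlMode(Enum):
    TASK = auto()       # robot executes a parameterized task autonomously
    WORKSPACE = auto()  # Cartesian end-effector goals, IK handled onboard
    JOINT = auto()      # direct joint commands: maximum generality

class Robot:
    """Stub standing in for the real robot interface."""
    def execute_task(self, spec):
        print(f"autonomously executing task: {spec}")

    def move_end_effector(self, pose):
        print(f"moving end effector to Cartesian pose: {pose}")

    def set_joint_positions(self, q):
        print(f"commanding joint positions: {q}")

def dispatch(robot, mode, command):
    """Route one operator command according to the selected control mode."""
    if mode is ControlMode.TASK:
        robot.execute_task(command)
    elif mode is ControlMode.WORKSPACE:
        robot.move_end_effector(command)
    else:
        robot.set_joint_positions(command)

# Operators pick the most autonomous mode that fits the current subtask.
dispatch(Robot(), ControlMode.WORKSPACE, (0.4, 0.0, 1.1, 0.0, 0.0, 0.0))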
Abstract: We address the problem of grasping everyday objects that are small relative to an anthropomorphic hand, such as pens, screwdrivers, cellphones, and hammers, from their natural poses on a support surface, e.g., a table top. In such conditions, state-of-the-art grasp generation techniques fail to provide robust, achievable solutions because they either ignore or try to avoid contact with the support surface. In contrast, we show that contact with the support surface is critical for grasping small objects, which also conforms with our anecdotal observations of human grasping behavior. We develop a simple closed-loop hybrid position-force controller that mimics this interactive, contact-rich strategy through a pre-grasp and landing strategy for finger placement. The approach uses compliant control of the hand during the grasp and release of objects in order to preserve safety. We conducted extensive grasping experiments on a variety of small objects of similar shape and size. The results demonstrate that our approach is robust to localization uncertainty and applies to many everyday objects.
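To make the hybrid position-force idea concrete, here is a minimal sketch of one fingertip servoed in a 2-D tangential/normal frame relative to the support surface; the gains, desired force, and function names are assumptions for illustration, not the paper's controller.

# Minimal hybrid position-force sketch: servo position along the surface
# (tangential axis) while servoing contact force into the surface
# (normal axis, pointing up). All gains are illustrative.
import numpy as np

KP_POS = 50.0   # position gain along the tangential axis [1/s]
KP_F = 0.02     # force-to-displacement gain along the normal [m/(N*s)]
F_DES = 2.0     # desired contact force pressing into the surface [N]

def hybrid_step(x, x_des, f_meas, dt=0.01):
    """One control step for a fingertip at x = (tangential, normal).

    x_des: desired tangential position; f_meas: measured normal force.
    Returns the updated fingertip position command.
    """
    # Position control along the surface (tangential component).
    dx_tan = KP_POS * (x_des - x[0]) * dt
    # Force control into the surface: descend until the desired force is felt.
    dx_norm = KP_F * (F_DES - f_meas) * dt
    return np.array([x[0] + dx_tan, x[1] - dx_norm])

# Example: finger 5 cm above the surface, no contact force yet, so the
# controller moves it toward x_des while pressing downward.
print(hybrid_step(np.array([0.00, 0.05]), x_des=0.03, f_meas=0.0))

Because the normal axis tracks a force rather than a position, the fingertip lands on the surface compliantly even under localization error, which is the property the landing strategy relies on.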
Background: Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control where the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object.
Methods: Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to provide a balance between BMI-derived intention and computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for these objects. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Using shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps.
Results: Both subjects were more successful on object transfer tasks when using shared control compared to BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92% of trials, demonstrating the potential for users to accurately execute their intention while using shared control.
Conclusions: Integration of BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users.
Trial registration: NCT01364480 and NCT01894802.
Electronic supplementary material: The online version of this article (doi:10.1186/s12984-016-0134-9) contains supplementary material, which is available to authorized users.
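Shared-control arbitration of this kind is often implemented as a confidence-weighted blend of user and autonomous velocity commands. The sketch below shows one common form under assumed names (blend_commands, intent_weight) and distance thresholds; it is not the paper's exact blending law or intent detector.

# Illustrative linear blending of BMI-derived and autonomous velocity
# commands; alpha ramps up as the hand nears a tracked object, standing
# in for the paper's intent-detection logic.
import numpy as np

def blend_commands(v_bmi, v_auto, alpha):
    """Blend user (BMI) and autonomous commands.

    alpha in [0, 1]: 0 = pure BMI control, 1 = fully autonomous.
    """
    return (1.0 - alpha) * np.asarray(v_bmi) + alpha * np.asarray(v_auto)

def intent_weight(hand_pos, obj_pos, r_near=0.15, r_far=0.40):
    """Assistance ramps from 0 to 1 as the hand approaches an object [m]."""
    d = np.linalg.norm(np.asarray(hand_pos) - np.asarray(obj_pos))
    return float(np.clip((r_far - d) / (r_far - r_near), 0.0, 1.0))

# Example: hand 20 cm from the object -> partial assistance (alpha = 0.8).
alpha = intent_weight([0.0, 0.0, 0.2], [0.0, 0.0, 0.0])
v_cmd = blend_commands([0.1, 0.0, -0.05], [0.0, 0.0, -0.1], alpha)
print(alpha, v_cmd)

Weighting by proximity lets the user retain authority in free space while the system takes over fine positioning near the grasp, which matches the balance between BMI-derived intention and computer assistance described above.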