Abstract-We present an automated assembly system that directs the actions of a team of heterogeneous robots in the completion of an assembly task. From an initial user-supplied geometric specification, the system applies reasoning about the geometry of individual parts in order to deduce how they fit together. The task is then automatically transformed to a symbolic description of the assembly-a sort of blueprint. A symbolic planner generates an assembly sequence that can be executed by a team of collaborating…
“…Our assembly system consists of a team of KUKA youBots, which collaborate to assemble IKEA furniture, originally described in Knepper et al. [11]. The robots receive assembly instructions encoded in a STRIPS-style planning language.…”
Section: Assembling Furniture
“…Robotic capabilities such as robust manipulation, accurate perception, and fast planning algorithms have led to recent successes such as robots that can fold laundry [15], cook dinner [1], and assemble furniture [11]. However, when robots execute these tasks autonomously, failures often occur, for example failing to pick up an object due to perceptual ambiguity or an inaccurate grasp.…”
Abstract-Robots inevitably fail, often without the ability to recover autonomously. We demonstrate an approach for enabling a robot to recover from failures by communicating its need for specific help to a human partner using natural language. Our approach automatically detects failures, then generates targeted spoken-language requests for help such as "Please give me the white table leg that is on the black table." Once the human partner has repaired the failure condition, the system resumes full autonomy. We present a novel inverse semantics algorithm for generating effective help requests. In contrast to forward semantic models that interpret natural language in terms of robot actions and perception, our inverse semantics algorithm generates requests by emulating the human's ability to interpret a request using the Generalized Grounding Graph (G³) framework. To assess the effectiveness of our approach, we present a corpus-based online evaluation, as well as an end-to-end user study, demonstrating that our approach increases the effectiveness of human interventions compared to static requests for help.
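The inverse-semantics idea in the abstract above (choosing a request by simulating how a listener would interpret it with a forward understanding model) can be illustrated with a minimal sketch. All names and probabilities below are hypothetical stand-ins, not the G³ implementation.

```python
def best_request(candidates, understanding_model, desired):
    """Inverse-semantics-style selection: score each candidate request with a
    *forward* language understanding model, and keep the request the simulated
    listener is most likely to ground to the desired object.

    understanding_model(request) -> {object: probability}.
    """
    return max(candidates, key=lambda r: understanding_model(r).get(desired, 0.0))

# Toy stand-in for a grounding model; the probabilities are illustrative only.
def toy_model(request):
    if "white" in request and "black table" in request:
        return {"white leg on black table": 0.85, "other white leg": 0.15}
    if "white" in request:
        return {"white leg on black table": 0.5, "other white leg": 0.5}
    return {"white leg on black table": 0.34, "other white leg": 0.33, "black leg": 0.33}

requests = [
    "Give me the leg.",
    "Give me the white leg.",
    "Give me the white table leg that is on the black table.",
]
choice = best_request(requests, toy_model, "white leg on black table")
```

Under this toy model, the most specific request wins because the simulated listener assigns it the highest probability of handing over the intended part.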
“…A procedure derived from a symbolic planner guides the automated furniture assembly, as in the work of Knepper et al. [2]. Each step of the plan comprises an action (pick up the table leg), a set of preconditions (hand is empty, robot is near table leg), and a set of postconditions (table leg is in hand).…”
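The plan-step structure described in the quote above can be sketched as a minimal STRIPS-style step: an action applies only when its preconditions hold, and its postconditions are split into facts added and facts deleted. The symbol names are illustrative, not taken from the original system.

```python
def apply_step(state, action):
    """Apply one STRIPS-style plan step if its preconditions hold.

    state: set of true symbolic facts; action: dict with "preconditions",
    "add" (facts made true), and "delete" (facts made false).
    """
    if not action["preconditions"] <= state:
        raise ValueError(f"cannot execute {action['name']}: preconditions unmet")
    return (state - action["delete"]) | action["add"]

# Hypothetical encoding of the "pick up the table leg" step from the text.
pick_up_leg = {
    "name": "pick-up-table-leg",
    "preconditions": {"hand-empty", "near-table-leg"},
    "add": {"holding-table-leg"},
    "delete": {"hand-empty"},
}

state = {"hand-empty", "near-table-leg"}
state = apply_step(state, pick_up_leg)
# state now contains "holding-table-leg" and no longer "hand-empty"
```

Representing postconditions as add/delete sets is the standard STRIPS convention; a planner chains such steps by matching one step's add effects against the next step's preconditions.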
Abstract-We describe an approach for enabling robots to recover from failures by asking for help from a human partner. For example, if a robot fails to grasp a needed part during a furniture assembly task, it might ask a human partner to "Please hand me the white table leg near you." After receiving the part from the human, the robot can recover from its grasp failure and continue the task autonomously. This paper describes an approach for enabling a robot to automatically generate a targeted natural language request for help from a human partner. The robot generates a natural language description of its need by minimizing the entropy of the command with respect to its model of language understanding for the human partner, a novel approach to grounded language generation. Our long-term goal is to compare targeted requests for help to more open-ended requests where the robot simply asks "Help me," demonstrating that targeted requests are more easily understood by human partners.
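The entropy-minimization criterion mentioned in this abstract can be sketched as follows: for each candidate request, score the listener model's distribution over groundings by its Shannon entropy and pick the least ambiguous request. The candidate requests and distributions are hypothetical examples, not the paper's model.

```python
import math

def entropy(dist):
    """Shannon entropy (bits) of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def least_ambiguous(candidates):
    """Pick the request whose grounding distribution has minimum entropy.

    candidates maps each candidate request to the listener model's
    distribution over objects the listener would hand over.
    """
    return min(candidates, key=lambda cmd: entropy(candidates[cmd]))

# Toy listener model: the specific request concentrates probability on the
# intended part, so its grounding distribution has lower entropy.
candidates = {
    "Hand me the leg.": {"white leg": 0.5, "black leg": 0.5},
    "Hand me the white table leg near you.": {"white leg": 0.9, "black leg": 0.1},
}
best = least_ambiguous(candidates)
```

Here the ambiguous request has entropy 1.0 bit, while the specific one has about 0.47 bits, so the specific request is selected.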
“…Our assembly system, IkeaBot, comprises a team of KUKA youBots that collaborate to assemble IKEA furniture, originally described in Knepper et al. (2013). The robots receive assembly instructions encoded in a STRIPS-style planning language called ABPL.…”
Section: Assembling Furniture
“…Robotic capabilities such as robust manipulation, accurate perception, and fast planning algorithms have led to recent successes such as robots that can fold laundry (Maitin-Shepard et al. 2010), cook dinner (Bollini et al. 2012), and assemble furniture (Knepper et al. 2013). However, when robots execute these tasks autonomously, failures often occur, for example failing to pick up an object due to perceptual ambiguity or an inaccurate grasp.…”
Ross A. Knepper and Stefanie Tellex have contributed equally to this paper. This is one of several papers published in Autonomous Robots comprising the "Special Issue on Robotics Science and Systems".