2015
DOI: 10.1016/j.robot.2015.03.002
A unified multimodal control framework for human–robot interaction

Abstract: In human–robot interaction, the robot controller must reactively adapt to sudden changes in the environment (due to unpredictable human behaviour). This often requires operating in different modes and managing sudden signal changes from heterogeneous sensor data. In this paper, we present a multimodal sensor-based controller, enabling a robot to adapt to changes in the sensor signals (here, changes in the human collaborator's behaviour). Our controller is based on a unified task formalism, and in contrast with cla…

Cited by 47 publications (23 citation statements)
References 40 publications
“…Automatic manipulation of soft materials is a fundamental, yet open, research problem in robotics, despite its numerous applications (e.g., in the food industry). For this, we will profit from our previous works on dual-arm control [2] and on merging vision and force for manipulation [14].…”
Section: Discussion
confidence: 99%
“…One application of sensor integration is the screwing task proposed by Shauri et al. [221], where the trajectory of the robot arm is controlled from the vision-system measurements and the robot-hand configuration is adjusted from the pressure/force data. Vision/force integration is also explored in the context of collaborative screw fastening [40], where data from a Kinect, a black/white camera and a force sensor, deployed to track the human hand, the screw and the contact force, respectively, are used alternately for robot control. De Gea Fernández et al. [62] extended sensor-data integration from an IMU, an RGB-D (red, green, blue, depth) camera and a laser scanner to robot whole-body control.…”
Section: Human–Robot Collaborative Assembly
confidence: 99%
“…for A_t = (∂f/∂s_t)(∂g/∂x_t) ∈ ℝ^(m×n) as the Jacobian matrix of the system (also known as the interaction matrix in the sensor-servoing literature [17]), whose elements depend on the instantaneous configuration x_t. The sensorimotor control problem consists in coordinating the motor actions with the feedback signals such that a desired sensory behaviour is achieved.…”
Section: Sensorimotor Control Problem
confidence: 99%
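The excerpt above describes the standard sensor-servoing setup: an interaction matrix A_t maps velocity commands to sensor-signal rates, and the controller drives the sensory error to zero. As a minimal, hypothetical sketch (not the paper's actual controller), the classical resolved-rate servoing law using a Moore–Penrose pseudo-inverse looks like this; the names `servo_step`, `s_star`, and `gain` are illustrative assumptions:

```python
import numpy as np

def servo_step(A, s, s_star, gain=1.0):
    """One resolved-rate sensor-servoing step.

    A      : (m, n) interaction/Jacobian matrix, so that s_dot = A @ x_dot.
    s      : current sensor-signal vector (m,).
    s_star : desired sensor-signal vector (m,).

    Returns the velocity command x_dot chosen so the feedback error
    e = s - s_star decays exponentially (s_dot ~= -gain * e).
    """
    e = s - s_star
    # pinv handles the general m != n case (redundant or deficient A)
    return -gain * np.linalg.pinv(A) @ e
```

With A taken as the identity, a unit error along the first sensor axis yields a corrective velocity of the same magnitude in the opposite direction; in a multimodal setting, switching sensing modes amounts to swapping which A and s feed this step.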