2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
DOI: 10.1109/iros45743.2020.9340706
From Human to Robot Everyday Activity

Cited by 10 publications (4 citation statements) | References 37 publications
“…This study investigates EEG responses to distractions and hesitations in human-robot interaction. It integrates previous findings from the MAS Lab with the work of CSL and colleagues within the DFG CRC 1320 Everyday Activity Science and Engineering (EASE), where we provide unique and critical contextual background for robots based on the recording, processing, modeling, and interpretation of human activities, perceptions, and feedback [26]. For this purpose, biosignals resulting from the activity of the brain, the muscles, and the eyes, which are correlated to motion, communication, and other mental tasks, are recorded and interpreted to provide insight into diverse aspects of human behavior that enable them to masterfully perform everyday activities with little effort or attention [24].…”
Section: State of the Art
confidence: 88%
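The excerpt above describes recording and interpreting brain, muscle, and eye biosignals. As a purely hypothetical illustration of one such interpretation step, the sketch below computes EEG band-power features with SciPy; the signal, sampling rate, and band limits are assumptions for illustration and do not reflect the processing pipeline of the cited work.

```python
# Illustrative sketch only: one common way to turn a raw EEG trace into an
# interpretable feature is band power (e.g., alpha, 8-13 Hz), computed from a
# Welch power spectral density estimate. Signal, rate, and bands are made up.
import numpy as np
from scipy.signal import welch

fs = 250.0                              # hypothetical EEG sampling rate (Hz)
eeg = np.random.randn(int(10 * fs))     # 10 s stand-in for one EEG channel

def band_power(signal, fs, low, high):
    """Integrate the Welch PSD between `low` and `high` Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= low) & (freqs <= high)
    return np.trapz(psd[mask], freqs[mask])

alpha = band_power(eeg, fs, 8.0, 13.0)   # alpha band power
beta = band_power(eeg, fs, 13.0, 30.0)   # beta band power
print(alpha, beta)
```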
“…The following study would not have been conducted without LabLinking, since it builds on the complementary expertise and equipment of two laboratories: the Medical Assistance Systems Group (MAS) at Bielefeld University with its rich expertise in social robotics based on robots such as Pepper, Nao, or Flobi [19][20][21], and the Cognitive Systems Lab (CSL) at University of Bremen with vast experience in biosignal-adaptive cognitive systems [22] based on multimodal biosignal acquisition [23] and processing using machine learning methods [24], including the recording and interpretation of spoken communication [25] and high-density EEG in the context of intelligent robots and systems [26].…”
Section: Introduction
confidence: 99%
“…However, annotating naturalistic expressions and their contexts can be more complicated and labor-intensive than laboratory datasets. Advanced multimodal annotation tools [88] may help provide multimodal annotation to evaluate facial expressions together with other nonverbal modalities and rich contextual information to provide accurate portrayals of the interaction between facial expressions and contexts. In the following section, we discuss the novel applications of MLLMs that could circumvent the need for extensively annotated datasets, fostering further advancement in naturalistic affective research.…”
Section: Analyzing Naturalistic Facial Expressions With Deep Learning
confidence: 99%
“…In particular, we describe the benefits of the simultaneous recordings of biosignal data corresponding to everyday activities. These recordings leverage several modalities captured by various sensors, ranging from wearable low-cost acceleration sensors to high-dimensional costly functional brain imaging devices [7].…”
Section: LabLinking Ecosystems
confidence: 99%
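As a concrete illustration of what combining such heterogeneous sensor streams involves, the following minimal Python sketch aligns a low-rate acceleration stream with a high-rate EEG stream on a shared timeline. It is a hypothetical example assuming generic NumPy arrays and made-up sampling rates; it is not code from the cited work.

```python
# Illustrative sketch: align two biosignal streams recorded at different
# sampling rates (e.g., wrist acceleration at 50 Hz and EEG at 500 Hz)
# onto a shared clock, so multimodal segments can be analyzed jointly.
# All names, rates, and signals here are hypothetical.
import numpy as np

def align_streams(t_ref, t_other, x_other):
    """Resample one stream's samples (x_other at times t_other)
    onto the reference timestamps t_ref via linear interpolation."""
    return np.interp(t_ref, t_other, x_other)

# Hypothetical recordings: 10 s of data on each device's own clock.
t_eeg = np.arange(0.0, 10.0, 1.0 / 500.0)   # 500 Hz EEG timeline
eeg = np.random.randn(t_eeg.size)           # one EEG channel (stand-in)
t_acc = np.arange(0.0, 10.0, 1.0 / 50.0)    # 50 Hz accelerometer timeline
acc = np.random.randn(t_acc.size)           # one acceleration axis (stand-in)

# Bring the accelerometer samples onto the EEG timeline.
acc_on_eeg_clock = align_streams(t_eeg, t_acc, acc)
print(acc_on_eeg_clock.shape)  # -> (5000,)
```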