2015
DOI: 10.5772/61819

Vision-based Recognition of Activities by a Humanoid Robot

Abstract: We present an autonomous assistive robotic system for human activity recognition from video sequences. Due to the large variability inherent to video capture from a non-fixed robot (as opposed to a fixed camera), as well as the robot's limited computing resources, implementation has been guided by robustness to this variability and by memory and computing-speed efficiency. To accommodate motion-speed variability across users, we encode motion using dense interest point trajectories. Our recognition model harnes…
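The abstract's mention of dense interest point trajectories is the one concrete technical detail visible here. Purely as a hedged illustration (the paper's own extraction pipeline is not shown on this page), the sketch below tracks a dense grid of points with OpenCV's pyramidal Lucas-Kanade optical flow and turns each trajectory into a displacement-based descriptor; the video filename, grid step, and descriptor choice are all assumptions, not the authors' settings.

```python
import cv2
import numpy as np

# Hypothetical input clip; on the robot the frames would come from its camera stream.
cap = cv2.VideoCapture("activity_clip.avi")
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Seed trajectories on a dense grid rather than only at sparse corners.
step = 10
h, w = prev_gray.shape
points = np.array([[x, y] for y in range(0, h, step) for x in range(0, w, step)],
                  dtype=np.float32).reshape(-1, 1, 2)
trajectories = [[p.ravel().copy()] for p in points]

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Track every grid point into the next frame with pyramidal Lucas-Kanade flow.
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    for traj, p, st in zip(trajectories, next_pts, status):
        if st[0]:
            traj.append(p.ravel().copy())
    points, prev_gray = next_pts, gray

# Frame-to-frame displacements along each trajectory give a motion descriptor that
# depends on the shape of the movement rather than on when in the clip it starts.
descriptors = [np.diff(np.asarray(t), axis=0).ravel()
               for t in trajectories if len(t) > 1]
```

Normalising each displacement sequence by its total length is one common way to make such descriptors less sensitive to how fast different users perform the same motion.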

Cited by 7 publications (12 citation statements)
References 22 publications

“…Other potential applications include systems for cognitive personal assistance, such as user support in complex environments and basic support for elderly or disabled people, making domestic tasks and activities of daily living that would otherwise be increasingly challenging easier for them. 3,34 As the approach presented in this article works remotely, meaning that there is no need to go to where the robot is to start the classification, we can envisage even more potential applications that reduce the need for the user to move, as most robots will be remotely controlled using a computer or another device. 34 …”
Section: Results (mentioning)
confidence: 99%
“…Human-robot interaction is becoming more common; therefore, robot perception of the world has evolved to span many modalities, such as vision, speech, and touch, among others. One example of artificial vision perception is presented by El-Yacoubi et al.,3 an autonomous assistive robotic system for human activity recognition from video sequences that integrates machine vision algorithms into an NAO robot, allowing it to recognize activities of daily living (sitting down, falling down, opening a door, applauding) performed by a person in a smart home.…”
Section: Related Work (mentioning)
confidence: 99%
“…'Cooking', 'Sleeping' and 'Falling' are possible activities. The robot is able to recognize what the user is doing from vision [13,14]. However, these image-based techniques can be somewhat inaccurate and require a lot of training.…”
Section: B. Cognition (mentioning)
confidence: 99%
“…But, as expressed in Section III, it encounters difficulties. We enhance the vision process presented in [13,14]. In brief, it works in four steps that rely on several supervised learning algorithms:…”
Section: Activity Recognition (mentioning)
confidence: 99%
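The last quoted statement refers to a four-step, supervised vision process but the steps themselves are not listed here. Purely as a hedged illustration of how trajectory descriptors are commonly classified (not the pipeline of [13,14] or of the citing paper), the sketch below quantises per-clip descriptors into a bag-of-features histogram with k-means and trains an SVM over the activity labels mentioned in the quotations above; all data, dimensions, and parameters are placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def clip_histogram(clip_descriptors, codebook):
    """Quantise one clip's local descriptors against a learned visual codebook."""
    words = codebook.predict(clip_descriptors)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

# Placeholder training data: one (n_i, d) array of local descriptors per clip.
rng = np.random.default_rng(0)
train_descriptors = [rng.normal(size=(200, 30)) for _ in range(20)]
train_labels = ["sitting_down", "falling_down", "opening_door", "applauding"] * 5

# Learn the codebook on all local descriptors pooled together.
codebook = KMeans(n_clusters=64, n_init=10, random_state=0)
codebook.fit(np.vstack(train_descriptors))

# Represent each clip as a normalised bag-of-features histogram and train an SVM.
X_train = np.array([clip_histogram(d, codebook) for d in train_descriptors])
clf = SVC(kernel="rbf", C=10.0).fit(X_train, train_labels)

# At run time a new clip's descriptors are quantised the same way and classified.
test_clip = rng.normal(size=(180, 30))
print(clf.predict([clip_histogram(test_clip, codebook)]))
```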