2020 IEEE International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra40945.2020.9196856
PARC: A Plan and Activity Recognition Component for Assistive Robots

Cited by 6 publications (4 citation statements)
References 21 publications
“…Regarding the speed, these algorithms have been executed in real time at 20 FPS [ 51 ]. For their implementation, computers with basic [ 51 ] and advanced [ 50 ] hardware were used, as well as the robotic platforms Brian [ 47 ] and Pepper [ 49 ].…”
Section: Discussion and Conclusion (mentioning, confidence: 99%)
“…A double-layer network (CNN and LSTM) was implemented to recognize the following activities: brushing teeth, chopping, drinking water, opening pill container, relaxing on the couch, rinsing the mouth with water, stirring, talking on the couch, talking on the phone, wearing contact lenses, working on the computer, and writing on a whiteboard. Additionally, Massardi et al in [ 50 ] presented the PARC algorithm for an assistance robot to recognize and help people with cognitive or physical disabilities to perform ADLs: (a) an RGB-D camera was implemented to extract the body and objects utilizing the YOLO framework; (b) they calculated whether an object was in the person’s hand to identify the activity; (c) they used a particle filter algorithm to plan the sequence of tasks that the user must perform to complete the activity.…”
Section: Algorithms Used For the Body (mentioning, confidence: 99%)
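The excerpt above describes the PARC pipeline in three steps: (a) detect the body and objects in an RGB-D frame, (b) test whether an object is in the person's hand, and (c) track the likely activity with a particle filter. The following is a minimal Python sketch of that structure under stated assumptions, not the authors' implementation: the hard-coded detections stand in for YOLO/skeleton output, and the distance threshold, activity-to-object associations, and transition model are illustrative placeholders.

```python
"""Minimal sketch of a PARC-style pipeline (assumed structure):
(a) object/body detections from one RGB-D frame, (b) object-in-hand
test, (c) particle filter over candidate activities."""

import random
from math import dist

# --- (a) detections (placeholders for YOLO + skeleton output) -------------
# 3-D centroids in metres; a real system would fill these per frame.
frame = {
    "hand": (0.42, 0.10, 0.55),
    "objects": {"cup": (0.45, 0.12, 0.57), "toothbrush": (1.30, 0.40, 0.90)},
}

# --- (b) object-in-hand test ----------------------------------------------
def objects_in_hand(frame, threshold=0.10):
    """Labels of objects whose centroid lies within `threshold` metres of
    the hand keypoint (a simple proximity heuristic)."""
    hand = frame["hand"]
    return {label for label, pos in frame["objects"].items()
            if dist(hand, pos) < threshold}

# --- (c) particle filter over activities -----------------------------------
# Each activity is modelled as the set of objects it typically involves;
# these associations are illustrative only.
ACTIVITY_OBJECTS = {
    "drinking water": {"cup"},
    "brushing teeth": {"toothbrush"},
    "relaxing on the couch": set(),
}

def step_particle_filter(particles, observed, stay_prob=0.8):
    """One predict/update/resample cycle. `particles` is a list of activity
    labels; `observed` is the set of objects currently in the hand."""
    activities = list(ACTIVITY_OBJECTS)
    # Predict: each particle keeps its activity or jumps to a random one.
    particles = [p if random.random() < stay_prob else random.choice(activities)
                 for p in particles]
    # Update: weight particles by how well they explain the observation.
    weights = [1.0 if observed <= ACTIVITY_OBJECTS[p] else 0.1
               for p in particles]
    # Resample in proportion to the weights.
    return random.choices(particles, weights=weights, k=len(particles))

particles = [random.choice(list(ACTIVITY_OBJECTS)) for _ in range(200)]
observed = objects_in_hand(frame)
for _ in range(10):
    particles = step_particle_filter(particles, observed)
print("objects in hand:", observed)
print("most likely activity:", max(set(particles), key=particles.count))
```

With the placeholder frame the cup is within reach of the hand, so the particle population converges on "drinking water"; swapping in detections where no object is held leaves the filter spread across activities, which is the behaviour the object-in-hand cue is meant to disambiguate.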
“…In [46], by means of RGB-D cameras, an algorithm implemented on a Pepper robot aimed to identify daily activities in a person's life, such as the following: talking on the phone, drinking water, rinsing one's mouth with water, writing on a blackboard, brushing one's teeth, opening a pill container, stirring, relaxing on the couch, working on a computer, and wearing contact lenses. Jean Massardi [47] implemented an algorithm called PARC on a personal assistance robot which aimed to help people with disabilities to complete daily activities successfully.…”
Section: Assistive Robots (mentioning, confidence: 99%)
“…with people [44] Recognize gestures, Assistance, Real Time, RGB; [45] Face detection, Assistance, 10 fps, N/A; [46] Home care, Assistance/Rehabilitation, N/A, RGB-D; [47] Recognition of activities, Assistance/Rehabilitation, Real Time, RGB-D. An analysis between the years 1971 and 2022 is performed in Figure 6 to show the level of interest in autonomous assistive robots in the database provided by Scopus. Since 2003, the level of interest in this field has increased.…”
Section: Refs (mentioning, confidence: 99%)