2019 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)
DOI: 10.1109/percomw.2019.8730886
Human-Robot Interaction with Smart Shopping Trolley Using Sign Language: Data Collection

Cited by 13 publications (3 citation statements). References 18 publications.
“…Or second, if we need to recognize continuous speech (phrases or sentences). Based on this information we have to choose different size of a database, different methods for features extraction and model training (Ryumin et al, 2019).…”
Section: Backgrounds and Related Research (mentioning, confidence: 99%)
“…This database was collected using the developed automatic 3D video stream recording system with the Microsoft Kinect 2.0 rangefinder sensor. The overall architecture of the developed system (MulGesRecDB) is presented in Figure 2 (Ryumin et al, 2019).…”
Section: Dataset (mentioning, confidence: 99%)
“…Computer vision and machine learning are the main means how to process such data. They can extend the devices and applications with intelligent algorithms that understand what users are doing and how they interact with their surroundings [1]. In general, hand pose estimation is being addressed in many fields -robotics [2], medicine, automotive, or sign language processing [3].…”
Section: Introduction (mentioning, confidence: 99%)