2019 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) 2019
DOI: 10.1109/ismar-adjunct.2019.00-30
LE-HGR: A Lightweight and Efficient RGB-Based Online Gesture Recognition Network for Embedded AR Devices

Figure 1: LE-HGR framework. The overall framework of the proposed gesture recognition algorithm contains four sub-modules: a hand candidate detector, a multi-task CNN, hand trace mapping, and a trace sequence network.

Abstract: Online hand gesture recognition (HGR) techniques are essential in augmented reality (AR) applications for enabling natural human-to-computer interaction and communication. In recent years, the consumer market for low-cost AR devices has been rapidly growing, while the technology m…

Cited by 5 publications (3 citation statements). References 37 publications (55 reference statements).
“…Chalasani et al. [7] generated augmented data by using a green screen to capture hand gestures and then overlaying them on different backgrounds. Xie et al. [49] performed simultaneous hand detection and keypoint tracking to support gesture recognition, while Wang et al. [47] examined online gesture recognition for interaction with a vehicle HUD. However, all these prior studies use image data.…”
Section: Online Gesture Recognition
confidence: 99%
“…Specifically, in [4], [11], [15]–[19], the authors use information from depth sensors as input to produce hand gesture predictions. In [4], [8], [20]–[25], skeleton information is used as input. Recently, Leap Motion has also been considered an important sensor for hand gesture recognition [18], [26].…”
Section: Literature Review
confidence: 99%
“…AI plays a crucial role in processing the voluminous data gathered from IoMT devices. Embedded AI algorithms will enable customized AR experiences in our daily lives, ranging from personalized dietary plans to artistic virtual overlays [174]. This convergence will revolutionize the experience of using Smart AR Glasses in the near future.…”
confidence: 99%