2018
DOI: 10.11591/ijece.v8i6.pp4693-4704
Deep Belief Networks for Recognizing Handwriting Captured by Leap Motion Controller

Abstract: The Leap Motion controller is an input device that can track hand and finger positions quickly and precisely. In some gaming environments, a need may arise to capture letters written in the air via Leap Motion, which currently cannot be done directly. In this paper, we propose an approach to capture and recognize which letter has been drawn by the user with Leap Motion. This approach is based on Deep Belief Networks (DBN) with Resilient Backpropagation (Rprop) fine-tuning. To assess the performance of our proposed …
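The abstract names Resilient Backpropagation (Rprop) as the fine-tuning step. As a minimal sketch of the sign-based Rprop- update rule (the function and parameter names below are illustrative, not from the paper; defaults follow the commonly cited η+ = 1.2, η− = 0.5 values):

```python
import numpy as np

def rprop_minimize(grad_fn, w0, n_iter=200,
                   eta_plus=1.2, eta_minus=0.5,
                   step_init=0.1, step_min=1e-6, step_max=50.0):
    """Minimal Rprop- sketch: only the SIGN of each partial derivative
    is used; each weight keeps its own step size, which grows while
    the gradient keeps its sign and shrinks when the sign flips."""
    w = np.asarray(w0, dtype=float).copy()
    step = np.full_like(w, step_init)
    prev_grad = np.zeros_like(w)
    for _ in range(n_iter):
        g = grad_fn(w)
        sign_change = g * prev_grad
        # Same sign: accelerate; sign flip: back off.
        step = np.where(sign_change > 0,
                        np.minimum(step * eta_plus, step_max), step)
        step = np.where(sign_change < 0,
                        np.maximum(step * eta_minus, step_min), step)
        # Rprop- zeroes the stored gradient on a sign flip,
        # which skips the update for that weight this iteration.
        g = np.where(sign_change < 0, 0.0, g)
        w -= np.sign(g) * step
        prev_grad = g
    return w

# Toy usage: minimize f(w) = sum((w - 3)^2), whose gradient is 2*(w - 3).
w_opt = rprop_minimize(lambda w: 2 * (w - 3.0), np.array([10.0, -4.0]))
```

Because updates depend only on gradient signs, Rprop is insensitive to gradient magnitude, which is one reason it is attractive for fine-tuning deep networks where gradients can vary over orders of magnitude.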

Cited by 5 publications (6 citation statements) · References 16 publications
“…These wearable devices were difficult (and sometimes impossible) to handle. The recent Kinect [27,29,41], RealSense [22,23], and Leap Motion [12,21,25,28,42,43] devices are vision-based and very user-friendly and do not require additional devices. The RealSense camera is straightforward to use.…”
Section: Related Work and the Dataset of Air-writing
confidence: 99%
“…We used four air-writing datasets: the RTD [22], RTC [23], smart-band [24], and Abas datasets [25]. The RTD and RTC vision-based datasets were collected using an Intel RealSense sr300 camera.…”
Section: The Datasets
confidence: 99%
“…Most of the research has been done by using the trajectory information directly, i.e., using the temporal information. On the other hand, Setiawan and Pulungan [30] proposed a 2D mapping approach, in which trajectories were collected by the Leap Motion device and converted to a 2D image matrix like the popular MNIST dataset. Nowadays, WiFi- and radar-based technology have become popular.…”
Section: Related Work
confidence: 99%
“…It consists of three parts — two cameras, LEDs, and a microcontroller — in which the cameras capture successive images of the hand, which are then passed to the controller to process the images and extract spatial information about the hand and fingers. Leap Motion has been used in several projects to recognize hand gestures [8][9][10][11][12][13]. Although approaching the dexterity of a human hand to control a robotic arm is difficult, LM was used for this task in [14][15][16], where hand gestures were translated to joint angles to perform a specific task.…”
Section: Introduction
confidence: 99%