2019
DOI: 10.3390/app9030445
Recognition of American Sign Language Gestures in a Virtual Reality Using Leap Motion

Featured Application: We describe a system that uses a Leap Motion device to recognize the gestures performed by users while immersed in Virtual Reality (VR). The developed system can be applied to the development of VR applications that require identification of the user's hand gestures for the control of virtual objects.

Abstract: We perform gesture recognition in a Virtual Reality (VR) environment using data produced by the Leap Motion device. Leap Motion generates a virtual three-dimensional (3D) hand mo…

Cited by 85 publications (42 citation statements)
References 47 publications
“…Sideridis et al (2019) created a gesture recognition system for everyday gestures recorded with inertial measurement units, based on fast nearest-neighbor and support vector machine methods, whereas Yang and Sarkar (2006) chose to use an extension of HMMs. Vaitkevičius et al (2019) also used HMMs for the same purpose, gesture recognition, for the creation of virtual reality installations, as did Williamson and Murray-Smith (2002), who used a combination of HMMs with a dynamic programming recognition algorithm, along with the granular synthesis method, for gesture recognition with audio feedback. In a more industrial context, Yang et al (2007) used gesture spotting with HMMs to achieve efficient human-robot collaboration, where real-time gesture recognition was performed with extended HMM methods such as hierarchical HMMs (Li et al, 2017).…”
Section: Model-based Machine Learning
confidence: 99%
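The HMM-based recognition referenced in this statement follows a standard pattern: train one HMM per gesture class and label a new observation sequence by whichever model assigns it the higher likelihood. A minimal sketch of that idea using the forward algorithm, with hand-coded two-state models whose names, parameters, and symbol alphabet are purely illustrative (a real system would estimate them from recorded Leap Motion sequences):

```python
import numpy as np

def forward_likelihood(pi, A, B, obs):
    """Forward algorithm: P(obs | model) for a discrete-emission HMM.
    pi: (S,) initial state probabilities; A: (S, S) transition matrix;
    B: (S, O) emission probabilities; obs: sequence of symbol indices."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

# Hypothetical two-state models for two gestures over a 2-symbol alphabet.
pi = np.array([0.5, 0.5])
A = np.array([[0.8, 0.2],
              [0.2, 0.8]])
models = {
    "swipe":  np.array([[0.9, 0.1], [0.9, 0.1]]),   # mostly emits symbol 0
    "circle": np.array([[0.1, 0.9], [0.1, 0.9]]),   # mostly emits symbol 1
}

def classify(obs):
    """Pick the gesture whose HMM gives the observed sequence the
    highest likelihood."""
    return max(models, key=lambda g: forward_likelihood(pi, A, models[g], obs))
```

A sequence dominated by symbol 0, such as `[0, 0, 0, 1, 0]`, is classified as "swipe" under these toy parameters.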
“…It is expected that sensors for the acquisition of hand skeletal data will be improved soon. Therefore, work is underway to use them to recognize sign languages: American [15][16][17][18][19][20][21][22][23][24], Arabic [25][26][27][28], Australian [13], Indian [29][30][31], Mexican [32], Pakistani [33], and Polish [34].…”
Section: Related Work
confidence: 99%
“…Twenty-four static gestures of the American Finger Alphabet, shown ten times by twelve people, were recognized in [22]. Features based on the skeletal data returned by the LM controller were used.…”
Section: Related Work
confidence: 99%
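The exact feature set used in [22] is not given in this excerpt, but features derived from skeletal data typically reduce to geometric quantities such as fingertip-to-palm distances, normalized to be invariant to hand size. A hypothetical sketch of such a feature extractor (the joint positions and normalization choice are illustrative, not taken from the cited work):

```python
import numpy as np

def fingertip_features(palm, tips):
    """Distance of each fingertip from the palm centre, scaled by the
    largest distance so the features do not depend on hand size.
    palm: (3,) palm position; tips: (5, 3) fingertip positions."""
    d = np.linalg.norm(tips - palm, axis=1)
    return d / d.max()

# Illustrative skeletal frame: palm at the origin, five fingertips (mm).
palm = np.array([0.0, 0.0, 0.0])
tips = np.array([[50.0, 10.0, 0.0],
                 [80.0, 30.0, 0.0],
                 [90.0, 35.0, 0.0],
                 [85.0, 30.0, 0.0],
                 [70.0, 20.0, 0.0]])
feats = fingertip_features(palm, tips)
```

Each static gesture then maps to a fixed-length feature vector that a classifier can consume directly.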
“…Guillaume Plouffe et al [6] developed a natural gesture user interface that could track and recognize hand gestures based on depth data collected by a Kinect sensor. Aurelijus Vaitkevičius et al [14] presented a system that is capable of learning gestures by using data from the Leap Motion device and the Hidden Markov classification algorithm. Although optical vision-based methods have good recognition performance, they are susceptible to illumination conditions and ambient infrared radiation [15].…”
Section: Introduction
confidence: 99%