2017
DOI: 10.3844/jcssp.2017.337.354
Arabic Static and Dynamic Gestures Recognition Using Leap Motion

Abstract: Across the world, several million people use sign language as their main means of communication, and they face daily obstacles with their families, teachers, neighbours, and employers. According to the most recent statistics of the World Health Organization, 360 million people worldwide (5.3% of the world's population) have disabling hearing loss, around 13 million of them in the Middle East. Hence, the development of automated systems capable of translating sign languages into wor…

Cited by 37 publications (11 citation statements) · References 18 publications
“…It is expected that sensors for the acquisition of hands skeletal data will be improved soon. Therefore, work is underway to use them to recognize sign languages: American [15][16][17][18][19][20][21][22][23][24], Arabic [25][26][27][28], Australian [13], Indian [29][30][31], Mexican [32], Pakistani [33], and Polish [34].…”
Section: Related Work
confidence: 99%
“…In [27], forty-four static gestures (twenty-eight letters, ten numbers, and sixteen words) from the Arabic Sign Language were recognized. Two variants of the feature vector consisting of 85 and 70 scalar values measured by the LM controller were considered.…”
Section: Related Work
confidence: 99%
“…the data [11]. A preprocessing phase is necessary to improve the quality of the images and to make the later feature extraction phase more reliable.…”
Section: Pre-Processing Images
confidence: 99%
“…Fok et al [26] achieved an average recognition rate of 93.14% using data fusion of two Leap Motion sensors and a Hidden Markov Model (HMM) classifier trained on orientation and distance-ratio features (the relative orientations of the distal phalanges to the orientation of the palm; the ratio of the distance between a fingertip and the palm to the sum of the distances between all fingertips and the palm; and the ratio of the distance between fingertips to the total distance among fingertips). Hisham and Hamouda [27] achieved accuracies of 97.4% and 96.4% on Arabic signs using palm and bone feature sets, respectively, with Dynamic Time Warping (DTW) for dynamic gesture recognition. Lu et al [28] used the Hidden Conditional Neural Field (HCNF) classifier to recognize dynamic hand gestures, achieving 89.5% accuracy on two dynamic hand gesture datasets.…”
Section: Introduction
confidence: 99%
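The excerpt above names Dynamic Time Warping (DTW) as the matching technique for dynamic gestures. As a minimal sketch (not the cited authors' implementation), the classic DTW recurrence on two 1-D sequences looks like this; the actual systems compare multi-dimensional Leap Motion feature vectors, which only changes the per-step distance function:

```python
import math

def dtw_distance(seq_a, seq_b):
    """Classic dynamic-time-warping distance between two 1-D sequences.

    cost[i][j] holds the minimal accumulated distance aligning the first
    i elements of seq_a with the first j elements of seq_b.
    """
    n, m = len(seq_a), len(seq_b)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])  # per-step distance
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]
```

Because the alignment may stretch or compress either sequence, DTW tolerates the varying signing speeds that make fixed-length frame comparison unreliable for dynamic gestures.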