2019
DOI: 10.35940/ijeat.f9167.088619

Gesture To Speech Conversion using Flex sensors, MPU6050 and Python

Abstract: Communicating through hand gestures is one of the most common forms of non-verbal and visual communication adopted by the speech-impaired population around the world. The problem at present is that most people cannot comprehend hand gestures or convert them to spoken language quickly enough for the listener to understand. A large fraction of India's population is speech impaired. In addition, communicating through sign language is not an easy task. This problem demands a bet…
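The system the title describes (flex sensors on the fingers plus an MPU6050 IMU, processed in Python) can be sketched as a simple rule-based classifier. The thresholds, gesture names, and sensor layout below are illustrative assumptions, not values from the paper:

```python
# Hedged sketch: classify a hand gesture from five flex-sensor ADC readings
# (one per finger, thumb..pinky) plus pitch from an MPU6050. On real hardware
# the readings would come over I2C/ADC; here they are passed in directly.

FLEX_BENT = 600  # ADC count above which a finger counts as bent (assumed)

def classify_gesture(flex, pitch):
    """Map raw sensor readings to a gesture label.

    flex  -- list of 5 flex-sensor ADC readings (thumb..pinky)
    pitch -- hand pitch in degrees, from the MPU6050 accelerometer
    """
    bent = [v > FLEX_BENT for v in flex]
    if all(bent):
        return "fist"
    if not any(bent):
        # Orientation disambiguates two all-fingers-extended gestures.
        return "open palm" if abs(pitch) < 30 else "raised hand"
    if bent == [True, False, False, True, True]:
        return "peace"
    return "unknown"

print(classify_gesture([700, 710, 705, 690, 720], pitch=5.0))   # fist
print(classify_gesture([100, 120, 110, 105, 115], pitch=2.0))   # open palm
```

The recognised label would then be handed to a text-to-speech engine; the lookup itself is just this threshold comparison, which is why flex-sensor gloves can run on small microcontrollers.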

Cited by 7 publications (1 citation statement); references 12 publications.
“…The work obtained an exceptional accuracy of 93.4% for 75 static gestures. Similarly, Mehra et al. utilised flexible sensors and IMUs for an American Sign Language interpreter device, where the translation output was displayed on a computer monitor [15]. Another prototype of a wearable sign language interpreter was developed by Chong and Kim using only six IMUs mounted on the back of the hand and the fingertips to detect and translate hand motions [16].…”
Section: Introduction
Mentioning confidence: 99%