2022
DOI: 10.3390/app13010453

A Novel Machine Learning Based Two-Way Communication System for Deaf and Mute

Abstract: Deaf and mute people are an integral part of society, and it is particularly important to provide them with a platform that allows them to communicate without the need for any training or learning. They rely on sign language, but effective communication also requires that others understand sign language. Learning sign language is a challenge for people with no impairment. Another challenge is to have a system that supports the hand gestures of different sign languages. In this manuscript, a system i…

Cited by 4 publications (2 citation statements)
References 39 publications
“…This research is the continuation of the previously published work [1]. In this manuscript, the capability of the previous system is significantly enhanced, and new features include the new combined sign language datasets, American Sign Language (ASL), Pakistani Sign Language (PSL) and Spanish Sign Language (SSL).…”
Section: Introduction (mentioning, confidence: 77%)
“…The scope of this work includes ML, CNN, the use of multiple Sign Language datasets, and COTS hardware devices. The system presented in this manuscript is the continuation of previously published work [1]. The system uses three sign language datasets combined into one.…”
Section: Discussion (mentioning, confidence: 99%)