2019 International Conference on Sustainable Technologies for Industry 4.0 (STI)
DOI: 10.1109/sti47673.2019.9067974
A New Benchmark on American Sign Language Recognition using Convolutional Neural Network

Cited by 37 publications (21 citation statements)
References 13 publications
“…We separate the classic methods from the DL methods in the evaluation for simplicity. In our search for datasets, we verified that the datasets created by [5], [11], [12] were not available. Therefore, we compare our results with previous works in different sign languages such as Irish Sign Language [28], JSL [13], and others [4], [8], [29]–[31].…”
Section: Methods
confidence: 98%
“…Both Rahman et al. [5] and Sruthi et al. [11] propose a novel Deep Neural Network (DNN) for SLR. Rahman et al. [5] apply the DNN to four different American Sign Language (ASL) datasets containing 29 classes, and they obtain 100% accuracy.…”
Section: Related Work
confidence: 99%
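The excerpt above describes CNN/DNN classifiers applied to ASL datasets with 29 classes (commonly the 26 letters plus space, delete, and nothing). As a rough illustration of what such a classifier computes, here is a minimal, hypothetical NumPy sketch of a single-layer CNN forward pass ending in 29 class probabilities; the 28×28 input, filter count, and layer sizes are illustrative assumptions, not the architecture used by the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Valid 2-D convolution of a single-channel image x (H, W)
    with a filter bank w of shape (num_filters, kh, kw)."""
    kh, kw = w.shape[1:]
    h = x.shape[0] - kh + 1
    wd = x.shape[1] - kw + 1
    out = np.empty((w.shape[0], h, wd))
    for f in range(w.shape[0]):
        for i in range(h):
            for j in range(wd):
                out[f, i, j] = np.sum(x[i:i + kh, j:j + kw] * w[f])
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling per feature map."""
    f, h, w = x.shape
    x = x[:, :h - h % size, :w - w % size]
    return x.reshape(f, h // size, size, w // size, size).max(axis=(2, 4))

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

# Toy 28x28 grayscale hand image, 8 random 3x3 filters (untrained),
# one dense layer mapping pooled features to 29 class logits.
image = rng.standard_normal((28, 28))
filters = rng.standard_normal((8, 3, 3)) * 0.1
features = max_pool(relu(conv2d(image, filters)))      # shape (8, 13, 13)
flat = features.ravel()
dense_w = rng.standard_normal((29, flat.size)) * 0.01  # 29 output classes
probs = softmax(dense_w @ flat)

print(probs.shape)  # (29,)
```

In a trained model, `filters` and `dense_w` would be learned by backpropagation; this sketch only shows the shape flow from image to a 29-way probability distribution.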
“…Within the framework of this research, the results of such studies are of interest from the point of view of information visualization, particularly the recognition accuracy of static and dynamic gestures, the position of the human hand, the features of a person's lip contour, etc. If we consider the use of gestures as a way of interacting with a computer system, along with algorithms for converting and interpreting gesture information and the corresponding software implementation, then the software tools created are associated with the use of a large number of sensors, joysticks, trackballs, or touch screens, as well as complex control systems (Karpov, 2013; Myasoedova et al., 2020; Rahman et al., 2019; Ryumin et al., 2020). As Karpov notes in his works, "the currently widely used graphical and textual interfaces are focused on experienced users, and the available research practically does not touch upon the issues of human-machine communication for persons with disabilities" (Karpov, 2013).…”
Section: Literature Review
confidence: 99%