2020
DOI: 10.3390/s20143930
Evaluation of Full-Body Gestures Performed by Individuals with Down Syndrome: Proposal for Designing User Interfaces for All Based on Kinect Sensor

Abstract: The ever-growing and widespread use of touch, face, full-body, and 3D mid-air gesture recognition sensors in domestic and industrial settings is serving to highlight whether interactive gestures are sufficiently inclusive, and whether or not they can be executed by all users. The purpose of this study was to analyze full-body gestures from the point of view of user experience using the Microsoft Kinect sensor, to identify which gestures are easy for individuals living with Down syndrome. With this information,…

Cited by 3 publications (4 citation statements)
References 43 publications
“…The experiments were shown to elicit functional performance, social behaviors, and emotional responses. The authors concluded that further empirical research is needed in this area, as the experiment was limited to health care [19], [20].…”
Section: Literature Review
confidence: 99%
“…Gesture recognition is a topic in computer science and language technology [1]. As an extremely efficient non-verbal interaction method, gesture interaction will provide strong technical support for emerging fields such as smart medical devices, assistive devices for the disabled, smart homes, and smart military operations [2, 3]. Most current research on gesture recognition focuses on machine-vision-based methods, which pose many limitations in practical applications.…”
Section: Introduction
confidence: 99%
“…Based on its built-in algorithm, it can automatically identify and track the dynamic skeletal structure of the human body, which can be applied to the hand to study human gestures. Researchers have two methods of gesture recognition using Kinect: (1) recognition based on the dynamic skeleton of the human body [10]; (2) recognition based on spatial depth sensing [5]. In the first approach, Ren et al. [11] obtained skeleton data for 25 joint points of the human body with Kinect, obtained their 3D coordinates in real time, and investigated the importance of each joint in dynamic gesture expression.…”
Section: Introduction
confidence: 99%
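The skeleton-based approach quoted above can be sketched in a few lines. This is a minimal illustration, assuming a Kinect-style frame of 25 joints delivered as a 25×3 array of (x, y, z) coordinates in metres; the joint indices and the `joint_angle` helper are hypothetical conveniences for this sketch, not the official Kinect SDK enumeration or API.

```python
import numpy as np

# Illustrative joint indices; NOT the official Kinect SDK enumeration.
SHOULDER_RIGHT, ELBOW_RIGHT, WRIST_RIGHT = 8, 9, 10

def joint_angle(skeleton: np.ndarray, a: int, b: int, c: int) -> float:
    """Angle at joint b (in degrees) between segments b->a and b->c."""
    u = skeleton[a] - skeleton[b]
    v = skeleton[c] - skeleton[b]
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip to guard against floating-point values just outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# One synthetic frame: 25 joints, all at the origin except a bent right arm.
frame = np.zeros((25, 3))
frame[SHOULDER_RIGHT] = [0.0, 1.4, 0.0]
frame[ELBOW_RIGHT] = [0.3, 1.4, 0.0]
frame[WRIST_RIGHT] = [0.3, 1.7, 0.0]

elbow_deg = joint_angle(frame, SHOULDER_RIGHT, ELBOW_RIGHT, WRIST_RIGHT)
print(round(elbow_deg))  # 90 for this right-angled arm pose
```

A dynamic-gesture recognizer would compute such angles (or joint distances) per frame and feed the resulting time series to a classifier, which is one way the per-joint importance study quoted above can be framed.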
“…Today they find new applications within the Industry 4.0 program. In particular, the Microsoft Kinect family has found applications in many research areas, including human-computer interaction [36][37][38], occupational health [39][40][41], physiotherapy [42][43][44][45], and daily life safety [46]. Furthermore, recently Microsoft released the Azure Kinect DK (also known as k4a) [47].…”
confidence: 99%