Proceedings of the 20th International Conference on Human-Computer Interaction With Mobile Devices and Services Adjunct 2018
DOI: 10.1145/3236112.3236116
Predicting stroke gesture input performance for users with motor impairments

Cited by 6 publications (4 citation statements) | References 29 publications
“…An extensive literature exists on assistive technology for users with motor impairments and a variety of computing devices, from desktop PCs [30,31] to tabletops [61], mobile devices [49,60,63,105], and wearables [55][56][57]. This literature has reported user performance with a wide range of input modalities, from touch input [31,38,61] to gesture [13,85,91], voice [18,40,41], eye gaze [16,46,70,102,103], and brain-computer input [28,62]. To mention a few examples, Smart Touch [61] is an accurate template matching technique designed to improve the performance of users with upper body motor impairments when selecting targets on touchscreens; Programming by Voice [40] is an interface that enables users with motor impairments to operate programming environments by speaking the code instead of using the mouse and keyboard; and EyeWrite [46] is a technique designed for eye-based text entry using letter-like gestures [100].…”
Section: Assistive Input Technology For Users With Motor Impairments
confidence: 99%
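The statement above names Smart Touch as a template-matching technique for touch selection. Smart Touch's actual procedure is not described here; purely as a generic illustration of nearest-neighbour template matching on stroke data, a minimal sketch (all names and template values below are hypothetical):

```python
import math

def path_distance(a, b):
    """Mean point-to-point Euclidean distance between two strokes
    that have been resampled to the same number of points."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def classify(candidate, templates):
    """Nearest-neighbour matching: return the label of the stored
    template whose path distance to the candidate stroke is smallest."""
    return min(templates, key=lambda label: path_distance(candidate, templates[label]))

# Hypothetical templates, pre-resampled to three points each.
templates = {
    "right": [(0, 0), (1, 0), (2, 0)],
    "up": [(0, 0), (0, 1), (0, 2)],
}
label = classify([(0, 0), (1, 0.1), (2, 0)], templates)  # a noisy rightward stroke
```

Real recognizers additionally resample, scale, and rotation-normalize strokes before matching; this sketch omits those steps for brevity.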
“…Based on the observation that people with motor impairments rely on their smartphones to overcome other accessibility challenges in the physical world, the extensive research on making mobile devices more accessible [60,66,85,86,86,91], and research on second-screen television watching [15,21], we believe that designing accessible smartphone apps for interacting with television is a feasible alternative to conventional TV remote controls. Without being constrained by the form factor and button layouts of the TV remote control, more accessible designs can be implemented, such as large-area buttons, fewer buttons, and adaptive layouts to match users' motor abilities following the principles of ability-based design [101]; see the SUPPLE system [33] for a relevant example.…”
Section: Smartphone Input
confidence: 99%
“…In this work, we employ the concepts and principles of the Kinematic Theory [73,74] to devise a theoretical model, from which to estimate various statistics and measures of stroke gesture features. While the Kinematic Theory has been used to synthesize and analyze human handwriting successfully in many applied contexts, practical applications in HCI have been primarily directed at generating gesture training data [49,52,88] and estimating gesture production times [51,53,94]. For example, Leiva et al [51] used the ΣΛ model of the Kinematic Theory to develop KeyTime, a very accurate technique for estimating the production times of unistroke gestures.…”
Section: Related Work
confidence: 99%
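The ΣΛ (Sigma-Lognormal) model referenced above describes the speed profile of a pen or finger stroke as a sum of overlapping lognormal pulses, one per elementary movement. A minimal sketch of that velocity equation, assuming each stroke is parameterized by an amplitude D, onset time t0, and lognormal parameters μ and σ (the numeric values below are illustrative, not taken from KeyTime or any cited paper):

```python
import math

def lognormal_velocity(t, D, t0, mu, sigma):
    """Speed contribution of a single lognormal stroke at time t:
    D / (sigma * sqrt(2*pi) * (t - t0)) * exp(-(ln(t - t0) - mu)^2 / (2*sigma^2)).
    Zero before the stroke's onset time t0."""
    if t <= t0:
        return 0.0
    x = math.log(t - t0) - mu
    return (D / (sigma * math.sqrt(2 * math.pi) * (t - t0))) * math.exp(-x * x / (2 * sigma * sigma))

def sigma_lognormal_speed(t, strokes):
    """Sigma-Lognormal model: gesture speed is the sum of the
    overlapping lognormal strokes (D, t0, mu, sigma)."""
    return sum(lognormal_velocity(t, *p) for p in strokes)

# Hypothetical two-stroke parameterization (illustrative values only).
strokes = [(1.0, 0.0, -1.6, 0.3), (0.8, 0.15, -1.5, 0.35)]
peak = max(sigma_lognormal_speed(i / 100.0, strokes) for i in range(1, 200))
```

Estimating production time from such a model amounts to integrating or bounding this speed profile; the cited works fit the per-stroke parameters to real or synthetic articulation data.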
“…Furthermore, researchers have evaluated the characteristics of stroke gestures synthesized with the ΣΛ model from the perspective of both classification performance [49] and similarity to gesture shapes articulated by real users [50]. Moreover, it has been shown that synthetic stroke gestures are on par with their human counterparts [47] in terms of articulation speed and geometric characteristics [49,64], and that gesture synthesis is successful for various user groups, such as users with low vision [52] or with motor impairments [94,95].…”
Section: Models Of Human Movement Applied To Gesture Research
confidence: 99%