2013
DOI: 10.1145/2514856

Performing Locomotion Tasks in Immersive Computer Games with an Adapted Eye-Tracking Interface

Abstract: Young people with severe physical disabilities may benefit greatly from participating in immersive computer games. In-game tasks can be fun, engaging, educational, and socially interactive. But for those who are unable to use traditional methods of computer input such as a mouse and keyboard, there is a barrier to interaction that they must first overcome. Eye-gaze interaction is one method of input that can potentially achieve the levels of interaction required for these games. How we use eye-gaze or the gaze…

Cited by 26 publications (4 citation statements)

References 30 publications
“…Vickers et al [40] showed the possibility of using eye gestures as game inputs; Christian et al [8] provided novel techniques for users to interact with games by head gesture; Harada et al [18] and Sporka et al [36] both indicated that voice input greatly expanded the scope of games that could be played hands-free; Baba et al [4] presented a game prototype that treated skin contact as controller input; Nacke et al [30] even considered using biofeedback (including EMG, EDA, EKG, RESP, TEMP) as a game input method; Hsu et al [21] compared different game inputs, including head gestures, voice control, handheld controller, joysticks, eye winking, and glass touchpad, for First-Person Shooter (FPS) games on smart glasses.…”
Section: Game Input
confidence: 99%
“…However, there is a great deal of research on gaze gestures demonstrating their potential for various scenarios and tasks. For example, gaze gestures can be used for text entry (Wobbrock et al 2008), computer control (Porta and Turina 2008), gaming (Vickers, Istance, and Hyrskykari 2013), or drawing (Heikkilä 2013). People can also browse objects on a remote smart screen by gazing sideways (Zhang, Bulling, and Gellersen 2013), or select moving objects by matching their gaze movement with the movement of the object (Vidal et al 2013).…”
Section: Previous Work
confidence: 99%
“…This technology holds the potential to unlock people's capacity to participate in leisure and productivity pursuits, play games, listen to music, use social media, and control their environment. For individuals with communication difficulties, it can also offer access to specialist software for augmentative and alternative communication (AAC) [8, 10–17]. Eye-gaze control technology involves an infra-red camera, which works in conjunction with specialised software, to monitor and respond to a user's eye movements [18].…”
Section: Introduction
confidence: 99%