2013
DOI: 10.1007/978-3-642-39330-3_34
Study on Character Input Methods Using Eye-gaze Input Interface

Cited by 8 publications (4 citation statements); references 9 publications.
“…For instance, an evaluation of two approaches—controlling the application using OptiKey to emulate the mouse and customizing interfaces for eye‐movement interaction—was conducted on Twitter (Kumar et al, 2017). Murata et al (2013) further compared four Japanese character input methods via fixation on the input interface in terms of input speed, accuracy, simplicity, and fatigue. Chen and Lin (2018) developed an eye‐movement method via fixation to operate a smart television based on expert evaluations and provided design specifications for eye‐control interfaces.…”
Section: Introduction
confidence: 99%
“…According to the literature on the interface design of eye‐control systems, several researchers have used this technology for secondary development and application and conducted ergonomic evaluation experiments. However, these studies were conducted in specific areas, with strong pertinence but weak universality (Murata et al, 2013; Panwar et al, 2012). Although certain researchers have conducted quantitative research on the size and spacing of controls (Shen et al, 2003) and operating area (Komogortsev et al, 2011), these results are relatively outdated.…”
Section: Introduction
confidence: 99%
“…Eye-gaze-based human-computer interaction techniques enable users to point to targets more quickly than they can with a computer mouse [1][2][3][4][5][6][7][8][9][10][11][12][13]. Previous studies have encompassed a variety of human-computer interaction tasks, such as clicks [11,14], menu selection [15], and character input [16]. Faster target acquisition has been reported for an eye-gaze input system with short dwell times of 150 ms [2,3].…”
Section: Introduction
confidence: 99%
“…Such conditions reduce the user's ability to interact with people and machines, while the effectiveness of traditional assistive technologies (e.g., a mouth stick) diminishes with the severity or stage of the condition. Eye tracking systems have been shown to be effective with patients suffering from these conditions (Murata et al 2012); however, most regulatory and technological efforts largely focus on visually impaired users. Popular front-end frameworks such as Twitter's Bootstrap and Zurb's Foundation adhere to and support the A11Y Project, "a community driven effort to make web accessibility easier" (The Accessibility Project).…”
Section: Introduction
confidence: 99%