Proceedings of the Third International ACM Conference on Assistive Technologies 1998
DOI: 10.1145/274497.274531
Head pointing and speech control as a hands-free interface to desktop computing

Cited by 22 publications (7 citation statements)
References 8 publications
“…A text entry facility (S12), which 'predicts' the words a user is entering by looking up the most relevant key combination in its internal dictionary (Minneman 1986; Garay-Vitoria and Abascal 2005), has also been adopted to speed up input. Multi-modal interfaces (S29), which combine a number of modalities such as head movement and speech for motor-impaired users (Malkewitz 1998) and handwriting and speech for small-device users (Serrano et al. 2006), have also been suggested as possible input solutions and are gaining popularity. Some solutions exist for motor-impaired users but not for small-device users: one-handed keyboards (S6), trackballs (S8), eye-tracking (S10), and switch interfaces (S25), as well as predefined texts or graphical icons (S30).…”
Section: Similar Solutions?
confidence: 99%
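The dictionary-based 'prediction' described in the statement above can be illustrated with a minimal sketch. The word list, frequency counts, and ranking rule here are illustrative assumptions, not the method of Minneman (1986) or Garay-Vitoria and Abascal (2005):

```python
# Minimal sketch of dictionary-based word prediction: given the keys typed
# so far, return the dictionary words sharing that prefix, most frequent first.
# The words and frequency counts below are hypothetical placeholders.
FREQ = {"head": 120, "hands": 95, "handwriting": 40, "speech": 150, "select": 60}

def predict(prefix: str, k: int = 3) -> list[str]:
    """Return up to k candidate completions for the typed prefix."""
    matches = [w for w in FREQ if w.startswith(prefix)]
    return sorted(matches, key=lambda w: -FREQ[w])[:k]

print(predict("han"))  # candidates for the prefix "han", best first
```

A real predictor would update the frequency table from the user's own text, which is what makes such facilities speed up input over time.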
“…It also allows us to use head gestures as an interaction technique: head movements such as nods or shakes can be used to make selections in the audio space. Head pointing is more common for desktop users with physical disabilities [11], but it has many advantages for all users, as head movements are very expressive.…”
Section: Potential Solutions
confidence: 99%
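The nod/shake selection idea mentioned above can be sketched as a threshold test on head-pose samples. The 10-degree threshold and the pitch/yaw naming are assumptions for illustration, not values from the cited work:

```python
# Sketch: classify a short window of head-pose samples (pitch, yaw in degrees)
# as a "nod" (pitch oscillation), a "shake" (yaw oscillation), or neither.
# The threshold of 10 degrees is an illustrative assumption.
def classify_gesture(samples: list[tuple[float, float]], thresh: float = 10.0) -> str:
    pitches = [p for p, _ in samples]
    yaws = [y for _, y in samples]
    pitch_range = max(pitches) - min(pitches)
    yaw_range = max(yaws) - min(yaws)
    if pitch_range > thresh and pitch_range > yaw_range:
        return "nod"    # e.g. confirm the highlighted selection
    if yaw_range > thresh and yaw_range > pitch_range:
        return "shake"  # e.g. cancel or reject
    return "none"

print(classify_gesture([(0, 0), (12, 1), (-8, 0), (11, 2)]))  # prints "nod"
```

Mapping such discrete gestures to confirm/cancel keeps the continuous head-pointing channel free for cursor or audio-space navigation.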
“…A range of devices exists for remote pointing, but our interest is in computer-vision methods that support pointer control without the need for mediating devices. Related work generally assumes use of the hands for pointing (e.g., [61,52,5,34,13]), but work in other areas has shown that humans are equally natural at pointing with other parts of their body (literally, from head [45,38] to toe [58]). We reflect this in an approach that is input-agnostic and supports any body movement being adopted for pointing, in contrast with existing systems that are optimised for specific modalities such as tracking of hand gestures [48], head pose [54], or feet [56].…”
Section: Background and Related Work
confidence: 99%
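The input-agnostic pointing described in this last statement can be sketched as a coordinate mapping that ignores which body part is tracked: any tracked point's displacement from a calibrated origin drives the cursor. The gain and dead-zone values are illustrative assumptions:

```python
# Sketch: map a tracked body point's displacement from a calibration origin
# to a cursor velocity, independent of whether the point is a hand, the head,
# or a toe. Gain and dead-zone values are illustrative assumptions.
def pointer_velocity(pos: tuple[float, float],
                     origin: tuple[float, float],
                     gain: float = 8.0,
                     dead_zone: float = 0.02) -> tuple[float, float]:
    """pos and origin are (x, y) in normalised [0, 1] tracker coordinates."""
    dx, dy = pos[0] - origin[0], pos[1] - origin[1]
    vx = 0.0 if abs(dx) < dead_zone else gain * dx
    vy = 0.0 if abs(dy) < dead_zone else gain * dy
    return vx, vy

# Small displacements inside the dead zone keep the cursor still;
# larger ones move it proportionally, whatever body point produced them.
print(pointer_velocity((0.55, 0.50), (0.50, 0.50)))
```

Because the mapping only sees a 2-D point stream, the same controller works for head pointing as in the 1998 paper above or for any other tracked body movement.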