2012
DOI: 10.1177/0018720812450587
Spearcons (Speech-Based Earcons) Improve Navigation Performance in Advanced Auditory Menus

Abstract: Spearcons have broadened the taxonomy of nonspeech auditory cues. Users can benefit from the application of spearcons in real devices.


Cited by 110 publications (117 citation statements).
References 35 publications (36 reference statements).
“…Additionally, the issue might also lie within the implementation of the swiping action itself, which might not be ideal for speedy performance. An alternative might be to allow for the swiping action to be initiated at any point on the touch screen rather than only at the top of the list - which would be similar to the approach used in VoiceOver where navigation between adjacent on screen objects can be performed by swiping gestures irrespective of where those gestures are performed on the touch screen - and to provide a quick overview of such a list, for example by exploiting the use of spearcons (Walker et al 2006). While this might impact orientation within the menu, it could allow for faster exploration and browsing time.…”
Section: Discussion
confidence: 99%
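As a rough illustration of the interaction described in that excerpt, the sketch below models a list whose selection is advanced by a swipe gesture started anywhere on the screen, with a spearcon (time-compressed spoken label) played for the newly selected item. The `SpearconList` class, the `play_spearcon` hook, and the item names are hypothetical placeholders for this sketch, not an API from the cited paper or from VoiceOver.

```python
# Minimal sketch: swipe-anywhere list navigation with spearcon feedback.
# `play_spearcon` is a hypothetical audio hook (e.g. a pre-rendered,
# time-compressed TTS clip per item); it is not an API from the cited work.

class SpearconList:
    def __init__(self, items, play_spearcon):
        self.items = items
        self.play_spearcon = play_spearcon  # callable(str) -> None
        self.index = 0

    def on_swipe(self, direction):
        """Handle a swipe gesture regardless of where it started on screen.

        direction: +1 for swipe toward the next item, -1 for the previous one.
        """
        self.index = max(0, min(len(self.items) - 1, self.index + direction))
        # Give a quick auditory overview of the newly selected item.
        self.play_spearcon(self.items[self.index])
        return self.items[self.index]


if __name__ == "__main__":
    menu = SpearconList(["Contacts", "Messages", "Settings"],
                        play_spearcon=lambda label: print(f"[spearcon] {label}"))
    menu.on_swipe(+1)   # plays spearcon for "Messages"
    menu.on_swipe(+1)   # plays spearcon for "Settings"
    menu.on_swipe(-1)   # plays spearcon for "Messages"
```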
“…Walker and his colleagues tried to come up with hybrids integrating auditory icons and earcons [67]. Speech, spearcons, and spindex cues have also been used together in a serial manner [68].…”
Section: Hybrid Solutions
confidence: 99%
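To make the "serial" combination mentioned above concrete: a spindex cue (a very brief sound for the item's first letter) can be played first, followed by the spearcon, and finally full text-to-speech if the user dwells on the item. The sketch below only illustrates that ordering under those assumptions; the cue functions and the dwell threshold are hypothetical stand-ins, not code from the cited work [68].

```python
# Illustrative only: the three cue functions below are hypothetical stand-ins
# for a spindex clip, a time-compressed spearcon, and full text-to-speech.

def play_spindex(label):
    print(f"[spindex]  '{label[0]}'")          # brief cue for the first letter

def play_spearcon(label):
    print(f"[spearcon] '{label}' (compressed)")

def play_tts(label):
    print(f"[speech]   '{label}' (full TTS)")

def announce(label, dwell_seconds=0.0):
    """Play cues serially: spindex -> spearcon -> full speech on dwell."""
    play_spindex(label)
    play_spearcon(label)
    if dwell_seconds >= 0.5:                    # assumed dwell threshold
        play_tts(label)

announce("Settings")                      # fast scrolling: spindex + spearcon only
announce("Settings", dwell_seconds=1.0)   # user pauses: full speech as well
```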
“…For example, Clara Rockmore's "aerial positions" for playing the Theremin might be considered gestures using one of the above definitions, but this is more like a gesture system in that it is a set of hand positions used in a formally structured manner. These gesture systems might better describe how gestures are used in computing science, but this definition might take away the natural interaction that gestures originally sought to provide [18]. Some researchers argue that gestures cannot be examined without the linguistic context where they occur [12], but multimodal systems have both incorporated speech [1] and used gestures on their own [9].…”
Section: Understanding Gestures
confidence: 99%
“…We tested the technique using a menu navigation task over a two level hierarchical menu where the root nodes represent common tasks performed on a mobile device. As well as a visual representation, the menu item name was read out as the user moves over an item using high tempo speech, known as spearcons [18], to present the audio. The menus were cyclical such that the currently selected item loops at the bottom and the top of the menus.…”
Section: Testing Body-based Discrete Action Event Control
confidence: 99%
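The cyclical, two-level menu described in that excerpt can be sketched as follows. Only the structural behaviour (a two-level hierarchy whose selection wraps at both ends, with each item read out as the user moves) is taken from the text; the item names and the `speak` callback are hypothetical placeholders standing in for spearcon playback.

```python
# Sketch of a cyclical two-level menu in the spirit of the excerpt above.
# Item names and the `speak` callback are made up for illustration; in the
# cited study the labels were presented as high-tempo speech (spearcons).

MENU = {
    "Messages": ["New message", "Inbox", "Drafts"],
    "Music":    ["Play", "Pause", "Next track"],
    "Settings": ["Volume", "Bluetooth", "Display"],
}

class CyclicMenu:
    def __init__(self, menu, speak=print):
        self.menu = menu
        self.speak = speak          # stand-in for spearcon playback
        self.roots = list(menu)
        self.root_idx = 0
        self.child_idx = None       # None while at the root level

    def move(self, step):
        """Move the selection; wrap around at both ends (cyclical menu)."""
        if self.child_idx is None:
            self.root_idx = (self.root_idx + step) % len(self.roots)
            label = self.roots[self.root_idx]
        else:
            children = self.menu[self.roots[self.root_idx]]
            self.child_idx = (self.child_idx + step) % len(children)
            label = children[self.child_idx]
        self.speak(label)           # read the item out as the user moves
        return label

    def enter(self):
        """Descend from the current root node into its child list."""
        self.child_idx = 0
        label = self.menu[self.roots[self.root_idx]][0]
        self.speak(label)
        return label

m = CyclicMenu(MENU)
m.move(+1)   # "Music"
m.move(+1)   # "Settings"
m.move(+1)   # wraps back to "Messages"
m.enter()    # "New message"
m.move(-1)   # wraps to "Drafts"
```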