2014
DOI: 10.1177/1541931214581249
Performance of a Sonification Task in the Presence of Verbal, Visuospatial, and Auditory Interference Tasks

Abstract: An experiment examined performance with sonifications (a general term for nonspeech auditory displays) as a function of working memory encoding and the demands of three different types of interference tasks. Participants encoded the sonifications as verbal representations, visuospatial images, or auditory images. After encoding, participants engaged in brief verbal, visuospatial, or auditory interference tasks before responding to point estimation queries about the sonifications. Results were expected to show se…

Cited by 1 publication (1 citation statement)
References 21 publications
“…Dubus and Bresin, 2013), first as a tool to assist visually-impaired analysts to explore data (e.g., Zhao et al., 2008), as well as an option when visual attention must be devoted to other tasks (e.g., Nees and Walker, 2014; Compare, 2017). The emergence of data sonification as an analytical approach is particularly well-suited for time-related tasks, such as monitoring, synchronization or understanding motion of multiple objects (Dubus and Bresin, 2013).…”
Section: Introduction
Confidence: 99%