2020
DOI: 10.3758/s13414-020-02061-8

Extending the study of visual attention to a multisensory world (Charles W. Eriksen Special Issue)

Abstract: Charles W. Eriksen (1923–2018), long-time editor of Perception & Psychophysics (1971–1993) – the precursor to the present journal – undoubtedly made a profound contribution to the study of selective attention in the visual modality. Working primarily with neurologically normal adults, his early research provided both theoretical accounts of behavioral phenomena and robust experimental tasks, including the well-known Eriksen flanker task. The latter paradigm has been used and adapted by many researc…

Cited by 9 publications (7 citation statements). References 133 publications.
“…In Jensen et al.'s (2019a) study, this was the case when the target was presented at…¹ [Footnote 1: To the best of our knowledge, no distractor processing study investigating the influence of HOC has used auditory stimuli (for the discussion of one unpublished experiment in this area, see Spence et al., 2017).] In fact, although some distractor processing studies exist which have used auditory stimuli (e.g., Chan et al., 2005; Frings & Spence, 2010; Ulrich, Prislan, & Miller, 2020; for a discussion, see Frings, Schneider, & Moeller, 2014), far more visual and/or tactile distractor processing studies have been published (e.g., Driver & Grossenbacher, 1996; Eriksen & Eriksen, 1984; for reviews, see Merz et al., 2020; Spence, 2020; Wesslein et al., 2014). Subsequently, studies investigating the influence of HOC on distractor processing have used visual and/or tactile experimental set-ups.…”
Section: Introduction
confidence: 99%
“…From a crossmodal/multisensory perspective, the present study stands in a long line of research showing crossmodal influences of a seemingly irrelevant stimulus in one sensory modality on a task-relevant stimulus in another (e.g., the crossmodal congruency task; for reviews, see Spence, 2020; Spence et al., 2008). In the present study, we took a closer look at the way in which this seemingly irrelevant stimulus is processed.…”
Section: Discussion
confidence: 92%
“…They are also consistent with the role of task difficulty (which can also be determined by factors other than working memory load) as an obvious yet neglected performance modulator (Lisi, Bonato, & Zorzi, 2015). Evidence derived from non-visual domains is crucial for testing the extent to which the effects found within the prevalently studied visuospatial attention domain are not modality-specific but rather reflect general characteristics of attentional functioning (Spence, 2020). Importantly, our findings suggest that this paradigm can be used to study the processing of speech stimuli in the presence of background noise, which is crucial for populations with impaired speech recognition in noise.…”
Section: Discussion
confidence: 99%