2018
DOI: 10.1073/pnas.1804643115
Distinct roles of prefrontal and parietal areas in the encoding of attentional priority

Abstract: When searching for an object in a crowded scene, information about the similarity of stimuli to the target object is thought to be encoded in spatial priority maps, which are subsequently used to guide shifts of attention and gaze to likely targets. Two key cortical areas that have been described as holding priority maps are the frontal eye field (FEF) and the lateral intraparietal area (LIP). However, little is known about their distinct contributions in priority encoding. Here, we compared neuronal responses…

Cited by 28 publications (28 citation statements). References 71 publications.
“…The probability of any given stimulus being fixated during search was strongly influenced by its similarity to the target (ANOVA; F(3,20) = 182 and F(3,28) = 1414, P < 10−13 and P < 10−29, for monkeys F and J, respectively), which would otherwise be the same (i.e., 1/20) for all stimuli in the search array. Both a same-color (SMC) and a same-shape (SMS) distractor were significantly more likely to be fixated than a no-share (NS) distractor according to post hoc tests (t-test; SMC vs. NS: t(10) = 13.2 and t(14) = 30.1, P < 10−6 and P < 10−13; SMS vs. NS: t(10) = 21.9 and t(14) = 11.2, P < 10−9 and P < 10−7, for monkeys F and J, respectively), consistent with previous reports of stimulus selection during conjunction searches 15,40–45. Thus, overall, on trials when the two monkeys correctly found the target, their search behavior was remarkably similar in all measures.…”
Section: Results (supporting)
confidence: 90%
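The analysis pattern quoted above, an omnibus ANOVA over fixation probabilities grouped by stimulus type followed by post hoc t-tests comparing each sharing distractor against the no-share distractor, can be sketched as follows. This is a minimal illustration only: the array names, session counts, and probability values are invented placeholders, not the authors' data or code.

```python
# Hypothetical sketch of the analysis pattern described in the quoted passage:
# one-way ANOVA over fixation probabilities grouped by stimulus type,
# then post hoc t-tests comparing distractor categories against no-share.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Per-session probability that a stimulus of each type was fixated
# (placeholder values; one entry per recording session).
p_target = rng.uniform(0.60, 0.80, size=6)   # target
p_smc    = rng.uniform(0.10, 0.20, size=6)   # same-color distractor
p_sms    = rng.uniform(0.10, 0.20, size=6)   # same-shape distractor
p_ns     = rng.uniform(0.02, 0.06, size=6)   # no-share distractor

# Omnibus test: does fixation probability depend on stimulus type?
f_stat, p_anova = stats.f_oneway(p_target, p_smc, p_sms, p_ns)
print(f"ANOVA: F = {f_stat:.1f}, p = {p_anova:.2g}")

# Post hoc comparisons of each sharing distractor against the no-share distractor.
for label, group in [("SMC vs NS", p_smc), ("SMS vs NS", p_sms)]:
    t_stat, p_val = stats.ttest_ind(group, p_ns)
    print(f"{label}: t = {t_stat:.1f}, p = {p_val:.2g}")
```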
“…Correspondingly, fMRI decoding studies have found that directing attention to one feature dimension such as orientation, motion direction or color, or to particular values within one given dimension, improves the read-out of these features from brain activity in early sensory regions (Kamitani and Tong, 2005; Kamitani and Tong, 2006; Serences and Boynton, 2007; Jehee et al., 2011) but in some cases also in higher-level areas (Liu et al., 2011; Ester et al., 2016). According to one influential account, higher-level fronto-parietal areas such as the lateral intraparietal area (LIP) implement spatial ‘priority maps’ in which the level of activity at individual locations depends jointly on the different features of objects at these locations as well as on top-down factors such as their task relevance, associated reward, etc. (Itti and Koch, 2001; Thompson and Bichot, 2005; Gottlieb, 2007; Sapountzis et al., 2018). Independent of spatial priority, LIP neurons have also been found to represent higher-level factors such as learned category membership and other non-spatial information (Freedman and Assad, 2009) and to flexibly switch between encoding of different visual features, such as color or motion, depending on the task (Toth and Assad, 2002; Ibos and Freedman, 2014).…”
Section: Discussion (mentioning)
confidence: 99%
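The priority-map account summarized in this citation statement, in which activity at each location reflects both the feature similarity of the stimulus there and top-down factors such as task relevance or reward, can be illustrated with a short sketch. The grid size, feature maps, and weights below are assumptions chosen for illustration, not the model used in the cited studies.

```python
# Minimal illustrative sketch of a spatial priority map: activity at each
# location combines bottom-up feature similarity to the search target with
# top-down weights (e.g., task relevance of each feature dimension) and a
# location-specific bias (e.g., expected reward). All values are placeholders.
import numpy as np

rng = np.random.default_rng(1)
grid = (8, 8)  # hypothetical array of stimulus locations

# Bottom-up maps: similarity of the stimulus at each location to the target
# along each feature dimension (values in [0, 1], placeholders).
color_similarity = rng.uniform(0.0, 1.0, size=grid)
shape_similarity = rng.uniform(0.0, 1.0, size=grid)

# Top-down weights: assumed task relevance of each feature dimension.
w_color, w_shape = 0.7, 0.3

# Additional top-down bias, e.g., expected reward at one location.
reward_bias = np.zeros(grid)
reward_bias[2, 5] = 0.4  # hypothetical high-reward location

# Priority = weighted sum of feature-similarity maps plus top-down bias.
priority = w_color * color_similarity + w_shape * shape_similarity + reward_bias

# The highest-priority location would be the next candidate for a shift
# of attention or gaze.
next_fixation = np.unravel_index(np.argmax(priority), priority.shape)
print("Highest-priority location:", next_fixation)
```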
“…from auditory and visual areas. However, beyond pure associative function, parietal regions have recently been linked especially with prioritizing the focus of attention and with cognitive control (Bisley and Goldberg, 2010; Sapountzis et al., 2018). As our task conditions required control of attention, it is tempting to think that our results reflect activation of the frontoparietal control network, which has nodes located in the bilateral parietal areas (Cole et al., 2014).…”
Section: Discussion (mentioning)
confidence: 89%