2021
DOI: 10.1073/pnas.2021905118

Neural signatures of attentional engagement during narratives and its consequences for event memory

Abstract: As we comprehend narratives, our attentional engagement fluctuates over time. Despite theoretical conceptions of narrative engagement as emotion-laden attention, little empirical work has characterized the cognitive and neural processes that comprise subjective engagement in naturalistic contexts or its consequences for memory. Here, we relate fluctuations in narrative engagement to patterns of brain coactivation and test whether neural signatures of engagement predict subsequent memory. In behavioral studies,…

Cited by 69 publications (159 citation statements)
References 102 publications

“…Next, to generate the null distribution, the TFs of the two conditions were scrambled 1000 times, and the difference between the two conditions was calculated for each iteration. Finally, the observed difference was compared to the generated null distribution in order to calculate the p-value for each pixel of TF difference [55]. Then, the pixels that had a p-value lower than 0.01 were considered to show a significant difference between the two conditions.…”
Section: Methods (mentioning)
confidence: 99%
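
The excerpt above describes a standard label-shuffling permutation test. Below is a minimal sketch of that procedure, assuming the "TFs" are per-trial time-frequency maps and that "scrambling" means shuffling condition labels across trials; the function and variable names are hypothetical, not taken from the cited paper.

```python
import numpy as np

def tf_permutation_test(tf_a, tf_b, n_perm=1000, alpha=0.01, seed=0):
    """Pixel-wise permutation test between two sets of time-frequency maps.

    tf_a, tf_b: arrays of shape (n_trials, n_freqs, n_times), one per condition.
    Returns (significant_mask, p_values) for the condition difference.
    """
    rng = np.random.default_rng(seed)
    observed = tf_a.mean(axis=0) - tf_b.mean(axis=0)

    pooled = np.concatenate([tf_a, tf_b], axis=0)
    n_a = tf_a.shape[0]

    # Null distribution: shuffle condition labels n_perm times and
    # recompute the mean difference for each shuffled split.
    null = np.empty((n_perm,) + observed.shape)
    for i in range(n_perm):
        idx = rng.permutation(pooled.shape[0])
        null[i] = pooled[idx[:n_a]].mean(axis=0) - pooled[idx[n_a:]].mean(axis=0)

    # Per-pixel p-value: fraction of null differences at least as extreme
    # as the observed one (two-tailed); threshold at p < alpha (here 0.01).
    p = (np.abs(null) >= np.abs(observed)).mean(axis=0)
    return p < alpha, p
```

Note that thresholding each pixel at p < 0.01 without a multiple-comparisons correction follows the excerpt as quoted; cluster-based corrections are a common extension.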
“…We correlated 1) the mean between-movie boundary pattern during recall and 2) the mean within-movie event boundary pattern during encoding, in PMC (Figure 4, 'Event offset' condition). Surprisingly, the two were negatively correlated (t(14) = 5.10, p < .001, Cohen's d_z = 1.32), suggesting that the between-movie boundary pattern may reflect a cognitive state qualitatively different from the state elicited by event boundaries during movie watching (e.g., attentional engagement; Song et al., 2021).…”
Section: Figure 2. Consistent Activation Patterns Associated With Between-Movie Boundaries (A) Schematic (mentioning)
confidence: 97%
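
For readers reconstructing the statistics in this excerpt, one plausible computation is: for each of the 15 subjects, spatially correlate the two mean PMC patterns, then run a one-sample t-test on the Fisher-z correlations across subjects and report Cohen's d_z. This is a sketch under those assumptions, not the cited authors' actual pipeline; all names are hypothetical.

```python
import numpy as np
from scipy import stats

def boundary_pattern_correlation(recall_patterns, encoding_patterns):
    """Group test on the spatial correlation between two activation patterns.

    recall_patterns, encoding_patterns: (n_subjects, n_voxels) arrays, e.g.,
    each subject's mean between-movie boundary pattern (recall) and mean
    within-movie event-boundary pattern (encoding) in PMC.
    """
    # Per-subject spatial (Pearson) correlation between the two patterns.
    r = np.array([stats.pearsonr(a, b)[0]
                  for a, b in zip(recall_patterns, encoding_patterns)])
    z = np.arctanh(r)                 # Fisher z-transform before the t-test
    t, p = stats.ttest_1samp(z, 0.0)  # is the mean correlation nonzero?
    d_z = z.mean() / z.std(ddof=1)    # Cohen's d_z (one-sample effect size)
    return t, p, d_z
```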
“…Future studies may consider combining our paradigm with a continuous measure of engagement (cf. Song et al., 2021).…”
Section: Relating Inter-subject Correlation With Behavioral Measures Of Engagement (mentioning)
confidence: 99%
“…Naturalistic materials such as movies or spoken stories lead to synchronized patterns of neural activity across individuals that scale with the degree of engagement (Hasson et al., 2010; Nastase et al., 2019; Nguyen et al., 2019; Yeshurun et al., 2017). The strength of synchrony across individuals, quantified as inter-subject correlation (ISC; Hasson et al., 2004; Dmochowski et al., 2012, 2014), is stronger when stimuli are captivating or exciting (Hasson et al., 2010; Schmälzle et al., 2015), and is predictive of behavioral measures reflecting engagement (Cohen et al., 2017; Dikker et al., 2017; Dmochowski et al., 2014; Poulsen et al., 2017; Song et al., 2021) and recall of the materials (Chan et al., 2019; Cohen et al., 2018; Cohen & Parra, 2016; Davidesco et al., 2019; Hasson, Furman, et al., 2008; Piazza et al., 2021; Song et al., 2021; Stephens et al., 2010). Conversely, ISC is reduced when individuals do not attend to naturalistic materials (Cohen et al., 2018; Ki et al., 2016; Kuhlen et al., 2012; Rosenkranz et al., 2021) or when stimuli are unstructured or temporally scrambled (Dmochowski et al., 2012; Hasson, Yang, et al., 2008).…”
Section: Introduction (mentioning)
confidence: 99%
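
The ISC referenced in this excerpt is commonly computed leave-one-out: each subject's response time course is correlated with the average time course of all other subjects. Below is a minimal sketch of that common formulation; it is not necessarily the exact variant used in each cited study.

```python
import numpy as np

def leave_one_out_isc(timeseries):
    """Leave-one-out inter-subject correlation (ISC).

    timeseries: (n_subjects, n_timepoints) array of responses (e.g., one
    voxel's or one electrode's time course) to the same naturalistic stimulus.
    Returns one ISC value per subject: the Pearson correlation between that
    subject's time course and the mean time course of all other subjects.
    """
    n = timeseries.shape[0]
    isc = np.empty(n)
    for s in range(n):
        others = np.delete(timeseries, s, axis=0).mean(axis=0)
        isc[s] = np.corrcoef(timeseries[s], others)[0, 1]
    return isc
```

Higher mean ISC across subjects is the synchrony measure that, per the excerpt, tracks engagement and predicts recall.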