People perceive and conceive of activity in terms of discrete events. Here we propose a theory according to which the perception of boundaries between events arises from ongoing perceptual processing and regulates attention and memory. Perceptual systems continuously make predictions about what will happen next. When transient errors in predictions arise, an event boundary is perceived. According to the theory, the perception of events depends on both sensory cues and knowledge structures that represent previously learned information about event parts and inferences about actors' goals and plans. Neurological and neurophysiological data suggest that representations of events may be implemented by structures in the lateral prefrontal cortex and that perceptual prediction error is calculated and evaluated by a processing pathway including the anterior cingulate cortex and subcortical neuromodulatory systems.
Trial-to-trial variability in the blood oxygen level-dependent (BOLD) response of functional magnetic resonance imaging has been shown to be relevant to human perception and behavior, but the sources of this variability remain unknown. We demonstrate that coherent spontaneous fluctuations in human brain activity account for a significant fraction of the variability in measured event-related BOLD responses and that spontaneous and task-related activity are linearly superimposed in the human brain.
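The central claim here, that a measured trial is the linear sum of a stereotyped task-evoked response and ongoing spontaneous fluctuations, can be illustrated with a toy simulation. This is only a sketch: the bump-shaped "task" response, the random-walk "spontaneous" signal, and all amplitudes are invented for illustration and are not real BOLD parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_samples = 1000, 20
# Hypothetical task-evoked response: a simple gamma-like bump
# (not a real hemodynamic response function).
t = np.arange(n_samples, dtype=float)
task = t * np.exp(-t / 3.0)

# Spontaneous activity: slow, zero-mean fluctuations that differ from
# trial to trial and are independent of the task.
spontaneous = np.cumsum(rng.normal(0.0, 0.3, (n_trials, n_samples)), axis=1)

# Linear superposition: each measured trial is the same task response
# plus that trial's spontaneous activity plus measurement noise.
measured = task + spontaneous + rng.normal(0.0, 0.1, (n_trials, n_samples))

# Single trials vary widely, but averaging across trials cancels the
# zero-mean spontaneous component and recovers the task response.
avg = measured.mean(axis=0)
```

Under the superposition assumption, subtracting an estimate of the spontaneous component from each trial (as the study does using a distant, task-unrelated region) should likewise reduce trial-to-trial variability without distorting the evoked response.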
People make sense of continuous streams of observed behavior in part by segmenting them into events. Event segmentation seems to be an ongoing component of everyday perception. Events are segmented simultaneously at multiple timescales, and are grouped hierarchically. Activity in brain regions including the posterior temporal and parietal cortex and lateral frontal cortex increases transiently at event boundaries. The parsing of ongoing activity into events is related to the updating of working memory, to the contents of long-term memory, and to the learning of new procedures. Event segmentation might arise as a side effect of an adaptive mechanism that integrates information over the recent past to improve predictions about the near future.

Making sense by segmenting

Imagine walking with a friend to a coffee shop. If asked to describe this activity in more detail you might list a few of the events that make it up. The events listed could be broken up by changes in the physical features of the activity, such as location: 'We started out by going down to the laboratory. We grabbed our coats and put them on. Then we walked out of the building to the corner by the subway station…' Or, they could be broken up by changes in conceptual features, such as your goals: 'We started our walk talking about how much construction is going on. When the topic turned to the new building with the coffee shop we decided to head over there to give it a try…' Such descriptions are typical of how people talk about events, and they illustrate something important about perception: people make sense of a complex dynamic world in part by segmenting it into a modest number of meaningful units. Recent research on event perception reveals that, as an ongoing part of normal perception, people segment activity into events and subevents. This segmentation is related to core functions of cognitive control and memory encoding, and is subserved by isolable neural mechanisms.
Events and their boundaries

By 'event' we mean a segment of time at a given location that is conceived by an observer to have a beginning and an end [1]. In particular we focus on the events that make up everyday life on the timescale of a few seconds to tens of minutes: things like opening an envelope, pouring coffee into a cup, changing the diaper of a baby or calling a friend on the phone. Event Segmentation Theory (EST) [2] (see Glossary) proposes that perceptual systems spontaneously segment activity into events as a side effect of trying to anticipate upcoming information (see Box 1). When perceptual or conceptual features of the activity change, prediction becomes more difficult and errors in prediction increase transiently. At such points, people update memory representations of 'what is happening now'. The processing cascade of detecting a
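The prediction-and-error cascade EST describes can be caricatured in a few lines of code. This is a deliberately minimal sketch, not the authors' computational model: the moving-average predictor and the `window` and `threshold` parameters are all invented here purely to make the logic concrete.

```python
import numpy as np

def segment_by_prediction_error(signal, window=5, threshold=2.0):
    """Toy sketch of EST's core loop: predict the next observation from
    the recent past, and mark an event boundary when prediction error
    rises transiently above its running level."""
    boundaries = []
    errors = []
    for i in range(window, len(signal)):
        prediction = np.mean(signal[i - window:i])  # crude predictor
        error = abs(signal[i] - prediction)
        errors.append(error)
        # A transient spike in error relative to the running error level
        # triggers updating of the current event representation.
        baseline = np.mean(errors) + 1e-9
        if error > threshold * baseline:
            boundaries.append(i)
    return boundaries

# An activity stream whose features change abruptly halfway through.
stream = np.concatenate([np.zeros(50), np.full(50, 5.0)])
boundaries = segment_by_prediction_error(stream)  # clusters at the change near index 50
```

Note that while prediction error stays low (the featureless first half of the stream), no boundaries are produced; boundaries appear only where the stream's features change and prediction transiently fails, mirroring the theory's claim that segmentation falls out of prediction rather than being a separate goal.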
How do people perceive routine events, such as making a bed, as these events unfold in time? Research on knowledge structures suggests that people conceive of events as goal-directed partonomic hierarchies. Here, participants segmented videos of events into coarse and fine units on separate viewings; some described the activity of each unit as well. Both segmentation and descriptions support the hierarchical bias hypothesis in event perception: Observers spontaneously encoded the events in terms of partonomic hierarchies. Hierarchical organization was strengthened by simultaneous description and, to a weaker extent, by familiarity. Describing from memory rather than perception yielded fewer units but did not alter the qualitative nature of the descriptions. Although the descriptions were telegraphic and without communicative intent, their hierarchical structure was evident to naive readers. The data suggest that cognitive schemata mediate between perceptual and functional information about events and indicate that these knowledge structures may be organized around object/action units.
Events can be understood in terms of their temporal structure. The authors first draw on several bodies of research to construct an analysis of how people use event structure in perception, understanding, planning, and action. Philosophy provides a grounding for the basic units of events and actions. Perceptual psychology provides an analogy to object perception: Like objects, events belong to categories, and, like objects, events have parts. These relationships generate 2 hierarchical organizations for events: taxonomies and partonomies. Event partonomies have been studied by looking at how people segment activity as it happens. Structured representations of events can relate partonomy to goal relationships and causal structure; such representations have been shown to drive narrative comprehension, memory, and planning. Computational models provide insight into how mental representations might be organized and transformed. These different approaches to event structure converge on an explanation of how multiple sources of information interact in event perception and conception.
Temporal structure has a major role in human understanding of everyday events. Observers are able to segment ongoing activity into temporal parts and sub-parts that are reliable, meaningful and correlated with ecologically relevant features of the action. Here we present evidence that a network of brain regions is tuned to perceptually salient event boundaries, both during intentional event segmentation and during naive passive viewing of events. Activity within this network may provide a basis for parsing the temporally evolving environment into meaningful units.
Mental rotation is a hypothesized imagery process that has inspired controversy regarding the substrate of human spatial reasoning. Two central questions about mental rotation remain: Does mental rotation depend on analog spatial representations, and does mental rotation depend on motor simulation? A review and meta-analysis of neuroimaging studies help answer these questions. Mental rotation is accompanied by increased activity in the intraparietal sulcus and adjacent regions. These areas contain spatially mapped representations, and activity in these areas is modulated by parametric manipulations of mental rotation tasks, supporting the view that mental rotation depends on analog representations. Mental rotation also is accompanied by activity in the medial superior precentral cortex, particularly under conditions that favor motor simulation, supporting the view that mental rotation depends on motor simulation in some situations. The relationship between mental rotation and motor simulation can be understood in terms of how these two processes update spatial reference frames.
When reading a story or watching a film, comprehenders construct a series of representations in order to understand the events depicted. Discourse comprehension theories and a recent theory of perceptual event segmentation both suggest that comprehenders monitor situational features such as characters' goals, to update these representations at natural boundaries in activity. However, the converging predictions of these theories had previously not been tested directly. Two studies provided evidence that changes in situational features such as characters, their locations, their interactions with objects, and their goals are related to the segmentation of events in both narrative texts and films. A 3rd study indicated that clauses with event boundaries are read more slowly than are other clauses and that changes in situational features partially mediate this relation. A final study suggested that the predictability of incoming information influences reading rate and possibly event segmentation. Taken together, these results suggest that processing situational changes during comprehension is an important determinant of how one segments ongoing activity into events and that this segmentation is related to the control of processing during reading.