Much indirect evidence supports the hypothesis that transformations of mental images are at least in part guided by motor processes, even in the case of images of abstract objects rather than of body parts. For example, rotation may be guided by processes that also prime one to see the results of a specific motor action. We directly test this hypothesis by means of a dual-task paradigm in which subjects perform the Cooper-Shepard mental rotation task while executing an unseen motor rotation in a given direction and at a previously learned speed. Four results support the inference that mental rotation relies on motor processes. First, motor rotation that is compatible with mental rotation yields faster response times and fewer errors in the imagery task than incompatible motor rotation. Second, the angle through which subjects rotate their mental images and the angle through which they rotate the joystick handle are correlated, but only when the directions of the two rotations are compatible. Third, motor rotation modifies the classical inverted V-shaped mental rotation response time function, favoring the direction of the motor rotation; indeed, in some cases motor rotation even shifts the location of the minimum of this curve in the direction of the motor rotation. Fourth, the preceding effect is sensitive not only to the direction of the motor rotation but also to its speed: slowing or speeding the motor rotation correspondingly slows or speeds the mental rotation.
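As an illustration of the angle-correlation analysis described above, the following minimal Python sketch correlates, trial by trial, the mental rotation angle with the joystick rotation angle, separately for compatible and incompatible trials. The variable names and the placeholder data are hypothetical, not the authors' materials; only the analysis logic follows the abstract.

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n = 200  # placeholder number of trials

    # Placeholder per-trial data: angular disparity of the imagery stimulus
    # (degrees) and whether the concurrent motor rotation was compatible.
    mental_angle = rng.uniform(0, 180, n)
    compatible = rng.random(n) < 0.5

    # Toy generative assumption reflecting the reported result: the joystick
    # angle tracks the mental angle only when the rotations share a direction.
    motor_angle = np.where(
        compatible,
        mental_angle + rng.normal(0, 15, n),  # coupled
        rng.uniform(0, 180, n),               # decoupled
    )

    for label, mask in [("compatible", compatible), ("incompatible", ~compatible)]:
        r, p = pearsonr(mental_angle[mask], motor_angle[mask])
        print(f"{label}: r = {r:.2f}, p = {p:.3g}")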
An outstanding challenge for consciousness research is to characterize the neural signature of conscious access independently of any decisional processes. Here we present a model-based approach that uses inter-trial variability to identify the brain dynamics associated with stimulus processing. We demonstrate that, even in the absence of any task or behavior, the electroencephalographic response to auditory stimuli shows bifurcation dynamics around 250–300 milliseconds post-stimulus: the same stimulus gives rise to late sustained activity on some trials but not on others. This late neural activity is predictive of task-related reports, and also of reports of conscious contents randomly sampled during task-free listening. Source localization further suggests that task-free conscious access recruits the same neural networks as those associated with explicit report, except for frontal executive components. Studying brain dynamics through variability could thus play a key role in identifying the core signatures of conscious access, independent of report.
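One simple way to operationalize the bifurcation idea, sketched below under stated assumptions, is to summarize each trial by its mean amplitude in a late window (~250-300 ms) and test whether a two-state mixture model explains the trial distribution better than a single-state one. This is an illustration of the general approach, not the authors' actual pipeline; the epochs array, the window, and the simulated data are all assumptions.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def bifurcation_score(epochs, times, window=(0.25, 0.30)):
        """epochs: (n_trials, n_times) single-channel EEG; times in seconds."""
        mask = (times >= window[0]) & (times <= window[1])
        late = epochs[:, mask].mean(axis=1).reshape(-1, 1)  # one value per trial
        bics = []
        for k in (1, 2):
            gm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(late)
            bics.append(gm.bic(late))
        # Positive score: two latent trial states (late activity present vs.
        # absent) fit the inter-trial variability better than one state.
        return bics[0] - bics[1]

    # Toy usage on simulated trials, half of which carry late sustained activity.
    rng = np.random.default_rng(1)
    times = np.linspace(-0.1, 0.6, 141)
    epochs = rng.normal(0.0, 1.0, (100, times.size))
    epochs[:50, times > 0.2] += 2.0
    print(bifurcation_score(epochs, times))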
Perceptual aftereffects provide a sensitive tool to investigate the influence of eye and head position on visual processing. There have been recent indications that the tilt aftereffect (TAE) is remapped around the time of a saccade so as to remain aligned to the adapting location in the world. Here, we investigate the spatial frame of reference of the TAE by independently manipulating retinal position, gaze orientation, and head orientation between adaptation and test. The results show that the critical factor in the TAE is the correspondence between the adaptation and test locations in a retinotopic frame of reference, whereas world- and head-centric frames of reference play no significant role. Our results confirm that adaptation to orientation takes place at retinotopic levels of visual processing. We suggest that the remapping process involved in visual stability does not transfer feature gain information around the time of eye (or head) movements.
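The logic of independently manipulating retinal, head-centric, and world-centric coordinates can be made concrete with the hypothetical sketch below: under each candidate frame of reference, adaptation should transfer to the test only when the corresponding coordinates match. The angles and the example condition are illustrative, not the study's actual design.

    def predicted_transfer(adapt, test, frame):
        """adapt/test: dicts of 'world', 'head', and 'gaze' angles in degrees,
        with gaze measured relative to the head; retinal position is then
        world - (head + gaze)."""
        def retinal(c):
            return c["world"] - (c["head"] + c["gaze"])
        if frame == "retinotopic":
            return retinal(adapt) == retinal(test)
        if frame == "head-centric":
            return adapt["world"] - adapt["head"] == test["world"] - test["head"]
        if frame == "world-centric":
            return adapt["world"] == test["world"]
        raise ValueError(frame)

    # Example condition: a gaze shift keeps the retinal location of the test
    # identical to adaptation while its world location changes.
    adapt = {"world": 10, "head": 0, "gaze": 0}
    test = {"world": 20, "head": 0, "gaze": 10}
    for frame in ("retinotopic", "head-centric", "world-centric"):
        print(frame, predicted_transfer(adapt, test, frame))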