The human cortex exhibits intrinsic neural timescales that shape a temporal hierarchy. Whether this temporal hierarchy follows the spatial hierarchy of its topography, namely its core-periphery organization, remains an open question. Using magnetoencephalography data, we investigate intrinsic neural timescales during rest and task states, measuring the autocorrelation window over a short (ACW-50) and, introducing a novel variant, a long (ACW-0) window. We demonstrate longer ACW-50 and ACW-0 in networks located at the core compared with those at the periphery, with rest and task states showing a high ACW correlation. Calculating rest-task differences, i.e., subtracting the shared core-periphery organization, reveals task-specific ACW changes in distinct networks. Finally, employing kernel density estimation, machine learning, and simulation, we demonstrate that ACW-0 better predicts whether a region's time window is core or periphery. Overall, our findings provide fundamental insight into how the human cortex's temporal hierarchy converges with its spatial core-periphery hierarchy.
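The two autocorrelation-window measures can be sketched directly from their definitions: ACW-50 is the first lag at which the autocorrelation function (ACF) falls below 0.5, and ACW-0 (the long-window variant) is the first lag at which it reaches zero. The sketch below is a minimal illustration of those definitions, assuming a simple full-lag ACF estimate; the function name `acw` is hypothetical and this is not the authors' analysis pipeline.

```python
import numpy as np

def acw(signal, fs):
    """Estimate ACW-50 and ACW-0 (in seconds) from a 1-D time series.

    ACW-50: first lag where the ACF drops below 0.5.
    ACW-0:  first lag where the ACF crosses zero.
    Assumes both crossings occur within the signal length.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    # Full autocorrelation; keep non-negative lags, normalize so ACF(0) = 1
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    acf = acf / acf[0]
    lag50 = np.argmax(acf < 0.5)   # index of first lag below 0.5
    lag0 = np.argmax(acf < 0.0)    # index of first zero crossing
    return lag50 / fs, lag0 / fs
```

As a sanity check, a strongly autocorrelated signal (e.g., an AR(1) process with a coefficient near 1) should yield longer ACW-50 and ACW-0 values than white noise at the same sampling rate.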
We process and integrate multiple timescales into one meaningful whole. Recent evidence suggests that the brain displays a complex multiscale temporal organization. Different regions exhibit different timescales, as described by the concept of intrinsic neural timescales (INT); however, their function and neural mechanisms remain unclear. We review recent literature on INT and propose that they are key for input processing. Specifically, they are shared across different species, i.e., input sharing. This suggests a role of INT in encoding inputs by matching the inputs' stochastics with the ongoing temporal statistics of the brain's neural activity, i.e., input encoding. Drawing on simulation and empirical data, we point out input integration versus segregation and input sampling as key temporal mechanisms of input processing. This deeply grounds the brain within its environmental and evolutionary context. It carries major implications for understanding mental features and psychiatric disorders, as well as for going beyond the brain by integrating timescales into artificial intelligence.
The brain shows a topographical hierarchy along the lines of lower- and higher-order networks. However, the exact temporal-dynamic characterization of this lower-higher-order topography at rest, and its impact on task states, remains unclear. Using two functional magnetic resonance imaging data sets, we investigate lower- and higher-order networks in terms of signal compressibility, operationalized by Lempel–Ziv complexity (LZC). As we assume that this degree of complexity is related to the slow–fast frequency balance, we also compute the median frequency (MF), an estimate of the frequency distribution. We demonstrate (i) topographical differences at rest between higher- and lower-order networks, with lower LZC and MF in the former; (ii) task-related and task-specific changes in LZC and MF in both lower- and higher-order networks; (iii) a hierarchical relationship between LZC and MF, as MF at rest correlates with the LZC rest–task change along the lines of lower- and higher-order networks; and (iv) a causal and nonlinear relation between LZC at rest and LZC during task, with MF at rest acting as mediator. Together, these results show that the topographical hierarchy of lower- and higher-order networks converges with their temporal hierarchy, with neural dynamics at rest shaping the range of complexity during task states in a nonlinear way.
The brain exhibits both spatial and temporal hierarchy, with their relationship remaining an open question. We address this issue by investigating the brain's spatial hierarchy with complexity, i.e., Lempel-Ziv complexity (LZC), and temporal dynamics, i.e., median frequency (MF), in rest/task fMRI (including replication data). Our results are: (I) topographical differences at rest between higher-order networks (lower LZC and MF) and lower-order networks (higher LZC and MF); (II) task-specific increases and task-unspecific decreases in LZC and MF; (III) a non-linear topographical relationship, with low MF mediating higher LZC rest-task changes, as confirmed in various simulations. Together, we demonstrate convergence of the spatial (LZC) and temporal (MF) hierarchies in a non-linear topographical way along the lines of higher-order/slow-frequency and lower-order/fast-frequency networks.

The main aim of our study was to investigate the convergence of spatial and temporal topographies, including how that shapes rest and task states. For that purpose, we used the Human Connectome Project (HCP) 7 Tesla fMRI datasets (and the HCP 3T datasets for replication), applying measures like Lempel-Ziv complexity (LZC) and median frequency (MF) during different states, i.e., rest and two task states with completely different complexity and temporal structure (movie and retinotopy) (see below).

LZC measures the number of distinct patterns in a binary sequence, i.e., the regularity or repetitiveness of a signal 34,35, which reflects the amount of information (number of bits) required to reconstruct the signal 34. Recently, LZC has been applied in fMRI 36-39 as well as in other imaging modalities including MEG 40-43.

The second specific aim was to investigate regional temporal dynamics by using the median frequency (MF) of ISF in rest and task states.
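The LZC definition given above, the number of distinct patterns in a binary sequence, can be illustrated with a short parse. The sketch below binarizes a signal at its median (a common convention) and counts new substrings with a simple LZ78-style dictionary parse; this is a minimal illustration, not necessarily the exact LZ76 procedure used in the study, and the function name is hypothetical.

```python
import numpy as np

def lzc(signal):
    """Simplified Lempel-Ziv complexity of a signal binarized at its median.

    Counts the number of distinct substrings found by an LZ78-style
    parse; repetitive signals yield fewer distinct substrings (lower
    complexity) than irregular ones. A trailing incomplete word is
    not counted, which is acceptable for long sequences.
    """
    med = np.median(signal)
    bits = "".join("1" if v > med else "0" for v in signal)
    dictionary = set()
    word = ""
    count = 0
    for ch in bits:
        word += ch
        if word not in dictionary:   # new pattern: store it, start a new word
            dictionary.add(word)
            count += 1
            word = ""
    return count
```

A random signal should score markedly higher than a slow oscillation of the same length, since the oscillation binarizes into a few long, repetitive runs.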
Recent studies demonstrated that lower- and higher-order regions exhibit different intrinsic neural timescales, with short and long durations of their temporal receptive windows during task states 7-19,32,33. Based on these findings, and given that the duration of intrinsic neural timescales may be related to the length of cycle durations 57 as reflecting the frequency range 7,8, we hypothesized that lower-order networks would exhibit higher MF, i.e., stronger power in faster ISF ranges with shorter cycle durations, while higher-order networks would show lower MF, with stronger power in slower ISF ranges, i.e., longer cycle durations, in both rest and task states.

The third specific aim was to directly link the spatial, i.e., LZC, and temporal, i.e., MF, dimensions, including their respective topographies. For that purpose, we conducted correlation and regression models as well as simulation. Given the scale-free-driven discrepancy in power between slower (stronger power) and faster (weaker power) ISF ranges 7,13,58,59, we assumed non-linear...
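Median frequency, as used above, is the frequency that splits the power spectrum into two halves of equal total power, so a signal dominated by slow fluctuations yields a low MF and a fast-dominated signal a high MF. The sketch below estimates it from a plain periodogram; the spectral estimator is an assumption (the study may use a different method), and the function name is illustrative.

```python
import numpy as np

def median_frequency(signal, fs):
    """Median frequency of a 1-D signal sampled at fs Hz.

    Computes a periodogram via the real FFT and returns the frequency
    at which the cumulative power first reaches half the total power.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                      # remove DC offset
    psd = np.abs(np.fft.rfft(x)) ** 2     # one-sided power spectrum
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    cum = np.cumsum(psd)
    return freqs[np.searchsorted(cum, cum[-1] / 2.0)]
```

For a pure sinusoid, the estimate should sit at the oscillation frequency, and a slower oscillation should yield a lower MF than a faster one.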
A. Abstract

The standard approach in neuroscience research infers from the external stimulus (outside) to the brain (inside) through stimulus-evoked activity. Buzsáki has recently challenged this, advocating the reverse: an inside-out approach inferring from the brain's activity to the neural effects of the stimulus. If so, stimulus-evoked activity should be a hybrid of internal and external components. Providing direct evidence for this hybrid nature, we measured human intracranial stereo-electroencephalography (sEEG) to investigate how prestimulus variability, i.e., standard deviation, shapes poststimulus activity through trial-to-trial variability. We first observed greater poststimulus variability quenching in trials exhibiting high prestimulus variability. Next, we found that the relative effect of the stimulus was higher in the later (300-600 ms) than the earlier (0-300 ms) poststimulus period. These results were extended by our deep-learning LSTM network models at the single-trial level. The accuracy of classifying single trials (prestimulus low/high) increased greatly when the models were trained and tested with real trials compared to trials that exclude the effects of the prestimulus-related ongoing dynamics (corrected trials). Lastly, we replicated our findings by showing that trials with high prestimulus variability in the theta and alpha bands exhibit faster reaction times. Together, our results support the inside-out approach by demonstrating that stimulus-related activity is a hybrid of two factors: 1) the effects of the external stimulus itself, and 2) the effects of the ongoing dynamics spilling over from the prestimulus period, with the second, i.e., the inside, dwarfing the influence of the first, i.e., the outside.

B. Significance Statement

Our findings represent a significant conceptual advance in understanding the relationship between pre- and poststimulus dynamics in humans.
These findings are important because they show that we miss an essential component, the impact of the ongoing dynamics, when restricting our analyses to the effects of the external stimulus alone. Consequently, these findings may be crucial for fully understanding higher cognitive functions and their impairments, as seen in psychiatric illnesses. In addition, our deep-learning LSTM models demonstrate a second conceptual advance: high accuracy in classifying a single trial according to its prestimulus state. Finally, our replicated results in an independent dataset and task show that this relationship between pre- and poststimulus dynamics exists across tasks and is behaviorally relevant.
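The variability-quenching measure described above can be sketched as a simple across-trial standard-deviation contrast: variability is computed across trials at each time point, then averaged within prestimulus and poststimulus windows, and quenching is the drop from the former to the latter. The function name and window arguments below are hypothetical, and this is a minimal illustration of the measure, not the authors' sEEG pipeline.

```python
import numpy as np

def variability_quenching(trials, pre_idx, post_idx):
    """Trial-to-trial variability quenching.

    trials:   array of shape (n_trials, n_timepoints).
    pre_idx:  index or slice selecting the prestimulus window.
    post_idx: index or slice selecting the poststimulus window.
    Returns (pre_sd, post_sd, quenching), where quenching is the
    drop in across-trial standard deviation after stimulus onset.
    """
    sd = trials.std(axis=0)          # SD across trials at each time point
    pre_sd = sd[pre_idx].mean()
    post_sd = sd[post_idx].mean()
    return pre_sd, post_sd, pre_sd - post_sd
```

With simulated trials whose poststimulus period contains a shared evoked response plus reduced noise, the quenching value comes out positive, matching the qualitative pattern the abstract describes.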