2022
DOI: 10.1101/2022.10.31.514408
Preprint

Reconstructing Computational Dynamics from Neural Measurements with Recurrent Neural Networks

Abstract: Mechanistic and computational models in neuroscience usually take the form of systems of differential or time-recursive equations. The spatio-temporal behavior of such systems is the subject of dynamical systems theory (DST). DST provides a powerful mathematical toolbox for describing and analyzing neurobiological processes at any level, from molecules to behavior, and has been a mainstay of computational neuroscience for decades. Recently, recurrent neural networks (RNNs) became a popular machine learning too…

Cited by 10 publications (17 citation statements)
References 183 publications (272 reference statements)
“…We used trained recurrent neural networks [19, 21, 23]. RNNs can be trained to reproduce both neural activity and behavior [16, 22, 29, 32, 35].…”
Section: Discussion
confidence: 99%
“…Specifically, we seek to characterize dynamical systems that could underlie working memory relying on phase coding with neural oscillations. We do so by assuming that cognitive functions can be described by a low-dimensional dynamical system, implemented through populations of neurons (computation through dynamics) [18][19][20][21][22][23], in line with empirical observations [24,25].…”
Section: Introduction
confidence: 99%
“…In addition, this work also selects common models such as SVM, RNN, CNN (2D CNN), and LSTM for comparative analysis [40,41]. TP represents the number of samples correctly predicted as the positive class, TN is the number of samples correctly predicted as the negative class, FP is the number of samples actually belonging to the negative class but wrongly predicted as the positive class, and FN is the number of samples actually belonging to the positive class but wrongly predicted as the negative class.…”
Section: Convolutional Neural Network
confidence: 99%
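The quoted passage defines the four confusion-matrix counts (TP, TN, FP, FN) used to evaluate the compared classifiers. As a minimal illustration of those definitions (not code from the cited work; the function name `confusion_counts` and the toy label vectors are hypothetical), the counts can be computed for binary labels as follows:

```python
# Hypothetical sketch of the TP/TN/FP/FN definitions quoted above,
# for binary labels where 1 = positive class, 0 = negative class.
def confusion_counts(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 1, 1, 0]
tp, tn, fp, fn = confusion_counts(y_true, y_pred)  # → (2, 2, 1, 1)
accuracy = (tp + tn) / len(y_true)                 # (2 + 2) / 6
```

The four counts always sum to the number of samples, and standard metrics (accuracy, precision, recall) are simple ratios of them.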
“…We used these gradients to build a 5-d coordinate system that allows us to organize whole-brain maps in a ‘common space’, in which the relative locations within this space provide information regarding the balance of different macroscale systems in a particular context (see also 14–16). This analytic approach is focused on how different large-scale systems interact together and so provides a biologically relevant macroscale perspective on brain states 17 that complements perspectives that focus on parcels 18 and large-scale networks 19 (see Konu et al. 11 for a region-based analysis of the data used in the current study).…”
Section: Introduction
confidence: 99%