2020
DOI: 10.2139/ssrn.3757075

Primate Neuronal Connections are Sparse as Compared to Mouse

Cited by 1 publication (2 citation statements)
References 0 publications

“…The DMC task that the models were trained to perform was tailored to match that used in experiments: the task sequence and motion directions were the same as in the monkey experiments, and the networks were required to indicate whether the sample and test stimuli belonged to the same category. Network parameters (recurrent weights/biases and output weights) were optimized to minimize a loss function with three parts: (1) one related to performance of the DMC task (cross-entropy of the network’s generated outputs with respect to the correct outputs), (2) a metabolic cost on firing rates ( Harris et al, 2012 ), and (3) a metabolic cost on connectivity ( Wildenberg et al, 2020 ) (see details in Materials and methods). After training, 41 of 50 networks converged to perform the DMC task with high accuracy (99.9% ± 0.0009%), which were therefore included in the following analysis.…”
Section: Results
confidence: 99%
“…The output layer (fixation, match, and nonmatch units) used the softmax activation function, which scales output unit activities to generate a probability distribution over output values at each time point. Network parameters (recurrent weights/biases and output weights) were optimized to minimize a loss function with three parts: (1) a performance loss, given by the categorical cross-entropy between the desired vs. actual output activities, which pushes networks to perform the DMC task at a high level of accuracy; (2) a metabolic cost on firing rates (mean neuronal activity), which pushes networks to solve the task without using firing rates that are pathologically high ( Harris et al, 2012 ); and (3) a metabolic cost on connectivity (mean synaptic weight), to reflect the costliness of maintaining synaptic connections in vivo ( Wildenberg et al, 2020 ).…”
Section: Methods
confidence: 99%
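
To make the three-part objective described in these citation statements concrete, here is a minimal PyTorch sketch. Only the overall structure comes from the quoted text: a categorical cross-entropy performance loss on the softmax outputs, a metabolic cost on mean firing rates, and a metabolic cost on mean synaptic weight. The function name, coefficient values (rate_cost, weight_cost), and the exact form of each penalty are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def composite_loss(output_logits, target_classes, firing_rates, recurrent_weights,
                   rate_cost=1e-3, weight_cost=1e-3):
    """Sketch of a three-part loss: task performance plus two metabolic costs.
    Coefficient names and values are placeholders, not taken from the paper."""
    # (1) Performance loss: categorical cross-entropy between desired and actual
    #     output activities (softmax over output units is applied inside
    #     cross_entropy). Shapes assumed: [time * batch, n_outputs] for logits,
    #     [time * batch] for integer class targets.
    performance = F.cross_entropy(output_logits, target_classes)

    # (2) Metabolic cost on firing rates: penalize mean neuronal activity,
    #     assuming nonnegative rates (e.g., after a ReLU nonlinearity).
    rate_penalty = firing_rates.mean()

    # (3) Metabolic cost on connectivity: penalize mean synaptic weight magnitude.
    weight_penalty = recurrent_weights.abs().mean()

    return performance + rate_cost * rate_penalty + weight_cost * weight_penalty
```

In this reading, minimizing the summed objective trades task accuracy against low firing rates and sparse, weak connectivity, which is the rationale the citing paper attributes to the metabolic-cost terms.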