2020
DOI: 10.48550/arxiv.2006.01001
Preprint
Artificial neural networks for neuroscientists: A primer

Guangyu Robert Yang,
Xiao-Jing Wang

Abstract: Artificial neural networks (ANNs) are essential tools in machine learning that are increasingly used for building computational models in neuroscience. Besides being powerful techniques for data analysis, ANNs provide a new approach for neuroscientists to build models that capture complex behaviors, neural activity and connectivity, as well as to explore optimization in neural systems. In this pedagogical Primer, we introduce conventional ANNs and demonstrate how they have been deployed to study neuroscience q…


Cited by 2 publications (3 citation statements)
References 134 publications (172 reference statements)
“…1a,b). All other temporal epochs are unconstrained, thus obviating the need to generate an error signal continuously throughout trials, which may overly constrain the dynamics 34 (see also Methods for additional details and Supplementary Fig. 1).…”
Section: Results
confidence: 99%
“…Further, in the IFC mechanism, most neurons are lightly saturated (Fig. 4c), meaning that most neurons are within a linear regime, as thought to occur in actual neural circuits 34,38 .…”
Section: Discussion
confidence: 97%
“…LSTMs have proved to be very powerful models and are at the core of most state-of-the-art temporal models (e.g., sentiment analysis in artificial emotional intelligence [24]). Well-rounded guides to artificial neural networks are available [25,26]. More recently, attention-based RNNs were shown to be exceptionally relevant for applications ranging from handwriting synthesis to speech recognition [27].…”
Section: Deep Convolutional and Recurrent Neural Network
confidence: 99%