Biomimetics 2021
DOI: 10.5772/intechopen.93435
Brain-Inspired Spiking Neural Networks

Abstract: The brain is a very efficient computing system. It performs very complex tasks while occupying only about 2 liters of volume and consuming very little energy. Its computations are carried out by specialized cells called neurons, which compute using electrical pulses and exchange information with one another through chemicals called neurotransmitters. With this as inspiration, several compute models exist today that try to exploit the inherent efficiencies demonstrated by nature. The compute models …

Cited by 7 publications (8 citation statements)
References 59 publications
“…This was achievable using the Leaky Integrate‐and‐Fire (LIF) model, a mathematical representation of a neuron that was trained to perform the task. The LIF model is commonly used due to its simplicity and computational efficiency while attempting to mimic biology (see, e.g., Ahmed, 2020, for more on neuron models).…”
Section: Methods
confidence: 99%
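The LIF dynamics referenced above can be sketched in a few lines. This is a minimal illustrative simulation, not the cited authors' implementation; all parameter values (membrane time constant, threshold, reset) are assumptions chosen for clarity.

```python
# Minimal sketch of a Leaky Integrate-and-Fire (LIF) neuron.
# Euler integration of: dV/dt = (-(V - v_rest) + I) / tau_m
# Parameter values are illustrative assumptions, not taken from the paper.

def simulate_lif(input_current, dt=1.0, tau_m=20.0,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Return the time steps at which the neuron spikes."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest and integrates the input.
        v += dt * (-(v - v_rest) + i_in) / tau_m
        if v >= v_thresh:      # threshold crossing
            spikes.append(t)   # emit a spike ...
            v = v_reset        # ... and reset the membrane
    return spikes

# Constant suprathreshold input produces a regular spike train.
spike_times = simulate_lif([1.5] * 100)
```

The simplicity the citing authors point to is visible here: the whole neuron model is one state variable, one leak term, and one threshold test per time step.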
“…In order to implement bio-inspired memory models based on the hippocampus, third-generation neural networks, i.e., SNNs, were used. This kind of network consists of interconnected neuron models that are not only inspired by biology but also attempt to mimic their biological counterparts, in order to incorporate the neurocomputational capabilities found in nature [5].…”
Section: A Spiking Neural Network
confidence: 99%
“…This aspect makes SNNs very efficient from a computational point of view. Moreover, SNNs present a distributed learning process thanks to the use of spike-timing-dependent plasticity (STDP) [25], which obtains the necessary information from the electrical impulses exchanged between local neurons to define the weights of the synapses, i.e., to regulate the learning of the network [5].…”
Section: A Spiking Neural Network
confidence: 99%
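The pair-based STDP rule described above can be sketched as a single weight-update function. The time constant and learning rates below are illustrative assumptions; the key property is that the update depends only on the relative timing of local pre- and postsynaptic spikes, which is what makes the learning distributed.

```python
import math

# Minimal pair-based STDP weight update sketch.
# tau, a_plus, and a_minus are illustrative assumptions.

def stdp_dw(t_pre, t_post, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for one pre/post spike pair.

    Pre-before-post (causal) strengthens the synapse (LTP);
    post-before-pre (acausal) weakens it (LTD).
    """
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * math.exp(-dt / tau)    # potentiation
    elif dt < 0:
        return -a_minus * math.exp(dt / tau)   # depression
    return 0.0
```

Because the rule uses only the spike times of the two neurons a synapse connects, each weight can be updated locally, without a global error signal.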
“…SNNs can overcome the drawbacks of heuristic methods for data clustering [12]. SNNs are intriguing as a direct result of their capacity to learn in a distributed manner using a strategy called spike-timing-dependent plasticity (STDP) learning [11]. SNN structures can be directly mapped to specific kinds of spike-based neuromorphic hardware with little performance loss [18].…”
Section: Introduction
confidence: 99%