2020
DOI: 10.1002/adma.202003610
Neuromorphic Engineering: From Biological to Spike‐Based Hardware Nervous Systems

Abstract: The human brain is a sophisticated, high-performance biocomputer that processes multiple complex tasks in parallel with high efficiency and remarkably low power consumption. Scientists have long been pursuing an artificial intelligence (AI) that can rival the human brain. Spiking neural networks based on neuromorphic computing platforms simulate the architecture and information processing of the intelligent brain, providing new insights for building AIs. The rapid development of materials engineering, device p…
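The abstract's description of spiking neural networks centers on neurons that integrate inputs over time and communicate through discrete spikes. As a minimal, hedged illustration (not code from the paper), the Python sketch below implements a leaky integrate-and-fire (LIF) neuron; the time step, membrane time constant, threshold, and input current are assumed values chosen only for demonstration.

```python
import numpy as np

def lif_neuron(input_current, dt=1e-3, tau_m=20e-3, v_rest=0.0,
               v_threshold=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron (illustrative parameters, not from the paper)."""
    v = v_rest
    spikes = []
    for i_t in input_current:
        # Leaky integration: dv/dt = (-(v - v_rest) + i_t) / tau_m
        v += dt * (-(v - v_rest) + i_t) / tau_m
        if v >= v_threshold:      # threshold crossing emits a spike
            spikes.append(1)
            v = v_reset           # membrane potential resets after firing
        else:
            spikes.append(0)
    return np.array(spikes)

# A constant supra-threshold input produces a regular spike train.
spike_train = lif_neuron(np.full(200, 1.5))
print("spike count:", spike_train.sum())
```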

Cited by: 172 publications (141 citation statements)
References: 221 publications (355 reference statements)
“…[ 1–5 ] The neuron functions simultaneously as a processor and a memory unit, and exhibits high parallelism, low power consumption, and fault tolerance. [ 6–8 ] However, the internal processor and memory of the traditional von Neumann computer are physically separated from each other, and the resulting data transfer takes so much time that real-time processing efficiency for visual information perception and complex speech recognition is very low. [ 9–11 ] To solve this problem and to meet our growing demand for data and information, various synaptic devices have been developed, such as memristors, [ 12 ] resistive switching memories, [ 13 ] and field-effect synaptic transistors.…”
Section: Introduction
mentioning confidence: 99%
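The quoted passage contrasts the von Neumann separation of processing and memory with synaptic devices that store and update their weight in place. A hedged sketch of that idea, using a generic two-terminal memristive synapse whose conductance serves as the weight, is given below; the conductance bounds and pulse step are illustrative assumptions rather than measured device parameters.

```python
class MemristiveSynapse:
    """Generic memristive synapse sketch: conductance acts as the stored weight
    and is updated in place by voltage pulses (assumed, illustrative values)."""

    def __init__(self, g_min=1e-6, g_max=1e-4, step=2e-6):
        self.g = g_min                      # conductance = synaptic weight
        self.g_min, self.g_max, self.step = g_min, g_max, step

    def potentiate(self):
        # A potentiating pulse increases the conductance up to g_max.
        self.g = min(self.g + self.step, self.g_max)

    def depress(self):
        # A depressing pulse decreases the conductance down to g_min.
        self.g = max(self.g - self.step, self.g_min)

    def read(self, v_read=0.1):
        # Read current I = G * V; reading leaves the stored weight untouched.
        return self.g * v_read

syn = MemristiveSynapse()
for _ in range(10):
    syn.potentiate()                        # ten potentiating pulses
print(f"read current after potentiation: {syn.read():.2e} A")
```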
“…For the past decades, with the explosive growth in demand for big data processing, the conventional computing technology based on complementary silicon metal-oxide-semiconductor […] investigated. [5,18–20] Compared with two-terminal synaptic memristors, three-terminal synaptic transistors are believed to facilitate control of the synaptic weight because the training and testing inputs are not applied to the same terminal. [21–23] Typically, for synaptic transistors, the electrical input applied to the gate electrode (V_GS) or the optical input applied to the channel is regarded as the training signal.…”
Section: Introduction
mentioning confidence: 99%
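To make the terminal separation in the quote concrete, the hypothetical sketch below models a three-terminal synaptic transistor in which gate pulses (the training input, V_GS) update the channel conductance while a small drain bias (the testing input) reads the resulting post-synaptic current on a different terminal. The class name, update rule, and parameter values are assumptions for illustration only.

```python
class SynapticTransistor:
    """Hypothetical three-terminal synaptic transistor: training and testing
    signals use separate terminals (illustrative model, assumed parameters)."""

    def __init__(self, g_channel=1e-6):
        self.g_channel = g_channel            # channel conductance = synaptic weight

    def apply_gate_pulse(self, v_gs=1.0, delta_g=5e-7, g_max=1e-4):
        # Training terminal: a positive gate pulse potentiates the channel.
        if v_gs > 0:
            self.g_channel = min(self.g_channel + delta_g, g_max)

    def read_drain_current(self, v_ds=0.1):
        # Testing terminal: a small drain bias reads the post-synaptic current
        # without modifying the stored weight.
        return self.g_channel * v_ds

device = SynapticTransistor()
for _ in range(5):
    device.apply_gate_pulse(v_gs=1.0)         # five training pulses on the gate
print(f"EPSC at V_DS = 0.1 V: {device.read_drain_current():.2e} A")
```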
“…54 With enhanced stimulus intensity, both forward and reverse pH polarization generate growing output synaptic weights (Figure 6d), which is analogous to the potentiation of synaptic strength and obeys Hebbian learning theory. 55 Switching between positive and negative potentiation states can be realized by controlling the pH polarization. This feature differs from previous artificial synapses, in which both potentiation and depression are induced by electrical or optical stimuli.…”
Section: Results
mentioning confidence: 99%
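Because the quoted passage frames the pH-controlled potentiation in terms of Hebbian learning, a minimal hedged sketch of the basic Hebbian rule, Δw = η · x_pre · y_post, is shown below. Using the sign of a "polarization" factor to flip between positive and negative potentiation is only a loose analogy to the forward/reverse pH polarization described above, not the authors' model.

```python
import numpy as np

def hebbian_update(w, x_pre, y_post, eta=0.01, polarization=+1):
    """Basic Hebbian rule: delta_w = polarization * eta * x_pre * y_post.
    The polarization sign loosely mirrors forward/reverse pH polarization
    (an assumed analogy, not the model used in the cited work)."""
    return w + polarization * eta * x_pre * y_post

w = 0.0
rng = np.random.default_rng(0)
for _ in range(100):
    x_pre = rng.random()              # pre-synaptic activity (stimulus intensity)
    y_post = w * x_pre + 0.5          # simple post-synaptic response
    w = hebbian_update(w, x_pre, y_post, polarization=+1)
print(f"weight after forward-polarization training: {w:.3f}")
```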