2020
DOI: 10.1038/s41467-020-18098-0

Committee machines—a universal method to deal with non-idealities in memristor-based neural networks

Abstract: Artificial neural networks are notoriously power- and time-consuming when implemented on conventional von Neumann computing systems. Consequently, recent years have seen an emergence of research in machine learning hardware that strives to bring memory and computing closer together. A popular approach is to realise artificial neural networks in hardware by implementing their synaptic weights using memristive devices. However, various device- and system-level non-idealities usually prevent these physical implem…

Cited by 60 publications (41 citation statements). References 32 publications.
“…A common approach exploits resistive memory crossbar arrays to implement the vector-matrix multiplication in the analog domain in a single step [45]. However, reliability issues affecting RRAM devices, such as the large cycle-to-cycle variability, limit the number of bits that can be reliably stored in a single device [49], suggesting that low-bit-precision neural networks are a more suitable solution for current state-of-the-art RRAM devices. The extreme case of low-bit-precision neural networks is BNNs, which have been shown to retain high classification accuracy despite the use of single-bit neuron weights and activations [5,50,51].…”
Section: Binarized Neural Network Applications (mentioning)
confidence: 99%
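The crossbar-based analog vector-matrix multiplication and the cycle-to-cycle variability mentioned in the statement above can be illustrated with a short simulation. The sketch below is a minimal Python example, not taken from the cited works: it maps a small weight matrix onto differential pairs of conductances, perturbs them with multiplicative lognormal programming noise (an assumed noise model), and compares the resulting analog output with the ideal result. The array sizes, conductance window, and noise magnitude are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: conductance window and lognormal programming noise.
G_MIN, G_MAX = 1e-6, 1e-4     # siemens, assumed device conductance range
SIGMA = 0.3                   # assumed cycle-to-cycle variability (log-space std)

def weights_to_conductances(W):
    """Map signed weights onto differential conductance pairs (G_pos, G_neg)."""
    scale = (G_MAX - G_MIN) / np.max(np.abs(W))
    G_pos = G_MIN + np.maximum(W, 0) * scale
    G_neg = G_MIN + np.maximum(-W, 0) * scale
    return G_pos, G_neg, scale

def program_with_variability(G):
    """Apply multiplicative lognormal noise to model imperfect programming."""
    return np.clip(G * rng.lognormal(0.0, SIGMA, G.shape), G_MIN, G_MAX)

# Small example: 4 inputs (row voltages), 3 outputs (column currents).
W = rng.standard_normal((4, 3))
x = rng.standard_normal(4)

G_pos, G_neg, scale = weights_to_conductances(W)
G_pos_noisy = program_with_variability(G_pos)
G_neg_noisy = program_with_variability(G_neg)

# Ohm's law + Kirchhoff's current law: the crossbar computes x @ W in one step,
# but the programmed conductances deviate from their targets.
ideal = x @ W
analog = (x @ (G_pos_noisy - G_neg_noisy)) / scale

print("ideal  :", np.round(ideal, 3))
print("analog :", np.round(analog, 3))
```

Without the noise term the differential mapping reproduces x @ W exactly; with it, the printed outputs diverge, which is the effect that motivates low-bit-precision and committee-based mitigation schemes.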
“…Then we calculate the mean of the results. Joksas et al. did this by applying committee machine theory to in-memory computing devices [30]. Wan et al. optimized this process by running a single model on the same device and reading the memory cells multiple times [31].…”
Section: Related Work (mentioning)
confidence: 99%
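A minimal sketch of the averaging step described in that statement is given below, assuming the committee members are independent copies of one network whose weights are perturbed by device non-idealities. This is an illustration rather than the actual procedure of Joksas et al. or Wan et al.: the tiny network, the multiplicative weight-noise model, and the committee size are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def forward(x, W1, W2):
    """Tiny two-layer classifier standing in for the deployed model."""
    return softmax(np.maximum(x @ W1, 0) @ W2)

def nonideal(W, sigma=0.2):
    """Assumed model of device non-idealities: multiplicative weight noise."""
    return W * rng.lognormal(0.0, sigma, W.shape)

# Ideal (software) weights of an illustrative 8-16-4 classifier and 5 inputs.
W1 = rng.standard_normal((8, 16))
W2 = rng.standard_normal((16, 4))
x = rng.standard_normal((5, 8))

# Committee: each member is the same network mapped onto a different noisy
# set of devices; the committee output is the mean of the member outputs.
K = 5
members = [forward(x, nonideal(W1), nonideal(W2)) for _ in range(K)]
committee = np.mean(members, axis=0)

ideal_pred = forward(x, W1, W2).argmax(axis=1)
single_pred = members[0].argmax(axis=1)
committee_pred = committee.argmax(axis=1)
print("single member matches ideal on", (single_pred == ideal_pred).sum(), "of 5")
print("committee     matches ideal on", (committee_pred == ideal_pred).sum(), "of 5")
```

The same averaging loop can stand in for repeated reads of one physical array instead of several arrays; only the source of the per-member noise changes.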
“…Due to the limitations of the von Neumann architecture, the traditional approach of separating storage from computing can no longer meet the fast-growing demand for big-data computing [1][2][3]. Recently, attention has turned to storage-integrated bio-neural computing, because the human brain contains roughly 10^11 neurons and 10^15 synapses and integrates complex cognitive functions such as perception, calculation, logic, language, vision, and memory [4][5][6].…”
Section: Introduction (mentioning)
confidence: 99%