2021
DOI: 10.1088/1361-6641/ac251b
Training of quantized deep neural networks using a magnetic tunnel junction-based synapse

Abstract: Quantized neural networks (QNNs) are being actively researched as a solution for the computational complexity and memory intensity of deep neural networks. This has sparked efforts to develop algorithms that support both inference and training with quantized weight and activation values, without sacrificing accuracy. A recent example is the GXNOR framework for stochastic training of ternary and binary neural networks (TNNs and BNNs, respectively). In this paper, we show how magnetic tunnel junction (MTJ) devic…
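The GXNOR framework mentioned in the abstract keeps weights in a discrete space (ternary or binary) and applies gradient updates by stochastically projecting each continuous update onto a neighboring discrete state. Below is a minimal sketch of that stochastic-projection idea for a ternary weight space; the function name, learning rate, and single-state-step assumption are illustrative, not the paper's or GXNOR's reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_ternary_update(w, grad, lr=0.1):
    """One GXNOR-style stochastic update over the ternary space {-1, 0, +1}.

    Sketch only: the continuous update -lr * grad is assumed smaller than one
    state spacing, so it is applied as a single discrete transition taken
    with probability |delta|. Names and defaults here are assumptions.
    """
    delta = -lr * grad                        # desired continuous weight change
    step = np.sign(delta)                     # direction of the discrete transition
    prob = np.clip(np.abs(delta), 0.0, 1.0)   # transition probability per weight
    fire = rng.random(w.shape) < prob         # stochastic decision per weight
    return np.clip(w + step * fire, -1.0, 1.0)  # stay inside the ternary range

# usage: ternary weights remain ternary after the update
w = np.array([-1.0, 0.0, 1.0, 0.0])
g = np.array([0.3, -0.8, 0.5, 0.05])
print(stochastic_ternary_update(w, g))        # prints the updated ternary weights
```

Because the expected value of the stochastic transition equals the continuous update, the discrete weights track full-precision gradient descent on average while never leaving the quantized space.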

Citations: Cited by 3 publications (1 citation statement) | References: 34 publications
“…109 Magnetic tunnel junctions emulate binary weights by switching between two magnetization states.110-112 The intrinsic stochastic nature of magnetization switching in two-state magnetic tunnel junctions can also be leveraged for learning.113 Memristive behavior is obtained by modifying the magnetization texture to obtain gradual switching via spin-torque114,115 or spin-orbit torques.…”
Section: B. Magnetization Dynamics
confidence: 99%
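The citing statement points to the intrinsic stochasticity of MTJ switching as a resource for learning. As a hedged illustration of that idea (not the device model or circuit from the cited works), the sketch below uses a standard Néel-Arrhenius thermal-activation expression for the switching probability of a two-state MTJ under a sub-critical programming pulse, and lets that probability drive binary weight flips; all device parameters are assumed values chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mtj_switch_probability(i_pulse, t_pulse, i_c=100e-6, tau_0=1e-9, delta=40.0):
    """Switching probability of a two-state MTJ under a sub-critical pulse.

    Standard Neel-Arrhenius thermal-activation model (i_c, tau_0 and delta
    are illustrative device parameters, not values from the paper):
        tau  = tau_0 * exp(delta * (1 - i_pulse / i_c))
        P_sw = 1 - exp(-t_pulse / tau)
    """
    tau = tau_0 * np.exp(delta * (1.0 - i_pulse / i_c))
    return 1.0 - np.exp(-t_pulse / tau)

def stochastic_binary_update(w, p_switch):
    """Flip each binary weight in {-1, +1} with the pulse-dependent probability.

    This is how the intrinsic randomness of MTJ switching can replace the
    explicit random number generator of a stochastic training rule.
    """
    flip = rng.random(w.shape) < p_switch
    return np.where(flip, -w, w)

# usage: a stronger or longer pulse makes a weight flip more likely
w = np.array([1, -1, 1, -1], dtype=np.int8)
p = mtj_switch_probability(i_pulse=80e-6, t_pulse=10e-9)
print(p, stochastic_binary_update(w, p))
```

In this picture, the desired update probability of a stochastic learning rule is encoded directly in the amplitude or duration of the programming pulse, so the device physics performs the sampling that software implementations do with a pseudo-random number generator.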