This work presents 2-bit/cell operation in deeply scaled ferroelectric finFETs (Fe-finFETs) with a 1 µs write pulse of maximum ±5 V amplitude and write endurance above 10⁹ cycles. Fe-finFET devices with single and multiple fins were fabricated on an SOI wafer using a gate-first process, with gate lengths down to 70 nm and a fin width of 20 nm. Extrapolated retention above 10 years ensures stable inference operation for a decade without any need for retraining. Statistical models of device-to-device and cycle-to-cycle variation were built from measured data and applied to neural-network simulations on the CIMulator software platform. Stochastic device-to-device variation is largely compensated during online training and has virtually no impact on training accuracy. On the other hand, stochastic cycle-to-cycle threshold-voltage variation up to 400 mV can be tolerated for MNIST handwritten-digit recognition. A substantial inference-accuracy drop with systematic retention degradation was observed in analog neural networks. However, quaternary neural networks (QNNs) and binary neural networks (BNNs) with Fe-finFETs as synaptic devices demonstrated excellent immunity to the cumulative impact of stochastic and systematic variations.
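To make the variation-tolerance claim concrete, the sketch below models 2-bit/cell storage as four nominal threshold-voltage levels and injects Gaussian cycle-to-cycle variation before decoding each cell back to its nearest level. The V_T levels, their spacing, and the sigma values are illustrative assumptions for the sketch, not the measured values from the fabricated Fe-finFETs; the paper's actual statistical models are derived from measured data in CIMulator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nominal V_T levels (V) for the four 2-bit/cell states.
# These are illustrative placeholders, not the measured device levels.
vt_levels = np.array([0.2, 0.6, 1.0, 1.4])

def program_cells(states, sigma_c2c):
    """Programmed V_T = nominal level + Gaussian cycle-to-cycle noise."""
    return vt_levels[states] + rng.normal(0.0, sigma_c2c, size=states.shape)

def read_states(vt):
    """Decode each V_T reading to the nearest of the four states."""
    return np.argmin(np.abs(vt[:, None] - vt_levels[None, :]), axis=1)

# Sweep the cycle-to-cycle sigma and report the raw state-error rate.
states = rng.integers(0, 4, size=100_000)
for sigma in (0.05, 0.10, 0.20):
    err = np.mean(read_states(program_cells(states, sigma)) != states)
    print(f"sigma = {sigma * 1e3:.0f} mV -> raw state error rate {err:.4f}")
```

With the assumed 400 mV level spacing, small sigma leaves essentially every cell decodable, while larger sigma produces raw state errors; a network-level tolerance study (as in the paper) would then propagate such perturbed weights through training and inference rather than just counting decode errors.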