Stretchable devices can form intimate interfaces with the objects they are attached to, enabling widespread applications in wearable electronics, bioelectronics, and artificial bionics. Emerging 2D materials are considered ideal candidates for stretchable electronics owing to their ultrathin nature and excellent mechanical properties. However, previously demonstrated stretchable 2D semiconductor devices typically operate over a limited strain range with poor device performance, mostly due to mechanical failure. Here, the fabrication of buckled monolayer molybdenum disulfide (MoS2) field-effect transistors (FETs) on elastomeric substrates is reported. These stretchable MoS2 FETs show stable performance, with a mobility of ≈30 cm2 V−1 s−1, an on/off ratio of ≈108, and a subthreshold swing (SS) of ≈180 mV dec−1, after repeated stretching‐release cycles at more than 10% strain. In particular, the feasibility of applying these stretchable MoS2 transistors as optoelectronic synapses and in neural-network simulations of recognition tasks is demonstrated.
Recent years have witnessed a surge of interest in learning representations of graph-structured data, with applications from social networks to drug discovery. However, graph neural networks, the machine learning models for handling graph-structured data, face significant challenges when running on conventional digital hardware, including the slowdown of Moore’s law due to transistor scaling limits and the von Neumann bottleneck incurred by physically separated memory and processing units, as well as a high training cost. Here we present a hardware–software co-design to address these challenges, by designing an echo state graph neural network based on random resistive memory arrays, which are built from low-cost, nanoscale and stackable resistors for efficient in-memory computing. This approach leverages the intrinsic stochasticity of dielectric breakdown in resistive switching to implement random projections in hardware for an echo state network that effectively minimizes the training complexity thanks to its fixed and random weights. The system demonstrates state-of-the-art performance on both graph classification using the MUTAG and COLLAB datasets and node classification using the CORA dataset, achieving 2.16×, 35.42× and 40.37× improvements in energy efficiency for a projected random resistive memory-based hybrid analogue–digital system over a state-of-the-art graphics processing unit and 99.35%, 99.99% and 91.40% reductions of backward pass complexity compared with conventional graph learning. The results point to a promising direction for next-generation artificial intelligence systems for graph learning.
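The training-cost saving described above comes from the echo state principle: the input and recurrent projections stay fixed and random (here, the role played by the random resistive memory arrays), so only a linear readout is ever trained. A minimal NumPy sketch of that idea, applied to graph data via adjacency-based propagation, is shown below; the reservoir size, leak rate, propagation rule, and ridge-regression readout are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

D, H = 4, 32                                   # illustrative feature / reservoir sizes
W_in = rng.normal(size=(D, H))                 # fixed random input projection (never trained)
W_res = rng.normal(size=(H, H))                # fixed random recurrent weights (never trained)
# Scale recurrent weights so the spectral radius is below 1 (echo state property).
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))

def embed(adj, feats, steps=4, leak=0.5):
    """Propagate node features through the fixed random reservoir along graph edges,
    then mean-pool node states into a graph-level embedding."""
    h = np.zeros((adj.shape[0], H))
    for _ in range(steps):
        # Neighbour states are aggregated via the adjacency matrix before the update.
        h = (1 - leak) * h + leak * np.tanh(feats @ W_in + adj @ h @ W_res)
    return h.mean(axis=0)

def train_readout(X, y, lam=1e-2):
    """Ridge-regression readout: the only trained weights, solved in closed form,
    so no backward pass through the reservoir is ever needed."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
```

Because the reservoir weights are fixed, the backward pass reduces to fitting this linear readout, which is the source of the complexity reductions quoted above.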