Photonic neuromorphic computing is of particular interest due to its significant potential for ultrahigh computing speed and energy efficiency. The advantage of photonic computing hardware lies in its ultrawide bandwidth and inherent parallelism. Here, we demonstrate a scalable on-chip photonic implementation of a simplified recurrent neural network, called a reservoir computer, using an integrated coherent linear photonic processor. In contrast to previous approaches, both the input and recurrent weights are encoded in the spatiotemporal domain by photonic linear processing, which enables scalable, ultrafast computing beyond the input electrical bandwidth. Because the device can process multiple wavelength inputs across the telecom C-band simultaneously, we can exploit an ultrawide optical bandwidth (~5 terahertz) as a computational resource. Experiments on standard benchmarks showed good performance in chaotic time-series forecasting and image classification. The device can in principle perform 21.12 tera multiply–accumulate operations per second (MAC ∙ s−1) per wavelength and can reach petascale computing speed on a single photonic chip through wavelength-division multiplexing. Our results challenge conventional Turing–von Neumann machines and confirm the great potential of photonic neuromorphic processing toward petascale neuromorphic supercomputing on a photonic chip.
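The reservoir-computing scheme summarized above keeps the input and recurrent weights fixed and trains only a linear readout. A minimal software sketch of that principle (a toy numpy echo-state network, not the photonic implementation; the network sizes, spectral radius, ridge parameter, and the sine forecasting target are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not the paper's parameters)
n_in, n_res, n_steps = 1, 100, 500

# Fixed random input and recurrent weights: these are never trained
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W_res = rng.normal(0.0, 1.0, (n_res, n_res))
# Rescale to spectral radius < 1 for the echo-state property
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W_res @ x)
        states[t] = x
    return states

# One-step-ahead forecasting of a toy signal (a sine, for simplicity)
u = np.sin(0.3 * np.arange(n_steps))
X = run_reservoir(u[:-1])   # reservoir states
y = u[1:]                   # next-step targets

# Only the linear readout is trained, via ridge regression
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y)

pred = X @ W_out
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(rmse)  # small training error
```

Because only the readout is learned, the fixed random transformations can be delegated to a physical substrate such as the photonic processor, with the training reduced to a single linear regression.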
The ever-growing demand for artificial intelligence has motivated research on unconventional computation based on physical devices. Although such devices mimic brain-inspired analog information processing, their learning procedures still rely on methods optimized for digital processing, such as backpropagation, which is not suited to physical implementation. Here, we present physical deep learning by extending a biologically inspired training algorithm called direct feedback alignment. Unlike the original algorithm, the proposed method is based on random projection with an alternative nonlinear activation, so a physical neural network can be trained without knowledge of the physical system or its gradient. In addition, the computation for this training can be emulated on scalable physical hardware. We demonstrate a proof of concept using an optoelectronic recurrent neural network called a deep reservoir computer and confirm the potential for accelerated computation with competitive performance on benchmarks. Our results provide practical solutions for the training and acceleration of neuromorphic computation.
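Direct feedback alignment replaces the backpropagated error signal with a projection of the output error through a fixed random matrix, so no knowledge of the downstream weights (or, in the physical setting, of the system's gradient) is required. A minimal digital sketch of the idea on a toy regression task (the network sizes, learning rate, and target function are illustrative assumptions; this stands in for, and is not, the optoelectronic hardware):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative two-layer network (assumptions, not the paper's setup)
n_in, n_hid, n_out = 4, 32, 1
W1 = rng.normal(0.0, 0.5, (n_hid, n_in))
W2 = rng.normal(0.0, 0.5, (n_out, n_hid))
B = rng.normal(0.0, 0.5, (n_hid, n_out))  # fixed random feedback matrix (replaces W2.T)

def tanh_d(a):
    return 1.0 - np.tanh(a) ** 2

# Toy regression data
X = rng.normal(0.0, 1.0, (256, n_in))
y = np.sin(X @ rng.normal(0.0, 1.0, n_in))[:, None]

lr, losses = 0.05, []
for epoch in range(200):
    # Forward pass
    a1 = X @ W1.T
    h = np.tanh(a1)
    out = h @ W2.T
    e = out - y                      # output error
    losses.append(np.mean(e ** 2))
    # DFA: project the error through fixed random B instead of backpropagating W2.T
    dh = (e @ B.T) * tanh_d(a1)
    W2 -= lr * e.T @ h / len(X)
    W1 -= lr * dh.T @ X / len(X)

print(losses[0], losses[-1])  # training loss decreases
```

The key point is that the hidden-layer update uses only the output error and a fixed random projection, a computation that itself maps naturally onto physical random-projection hardware.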
We present a new family of neural networks based on the Schrödinger equation (SE-NET). In this analogy, the trainable weights of the neural network correspond to physical quantities in the Schrödinger equation, which can be trained using the complex-valued adjoint method. Since propagation through the SE-NET is described by the evolution of a physical system, its outputs can be computed with a physical solver, and the trained network is transferable to actual optical systems. As a demonstration, we implemented the SE-NET with the Crank–Nicolson finite-difference method in PyTorch. Numerical simulations showed that the performance of the SE-NET improves as the network becomes wider and deeper; however, training became unstable due to gradient explosions in deeper models. We therefore introduced phase-only training, which updates only the phase of the potential field (refractive index) in the Schrödinger equation. Because the unitarity of the system is preserved during training, this enables stable training even for deep SE-NET models. In addition, the SE-NET allows joint optimization of physical structures and digital neural networks: as a demonstration, we numerically performed end-to-end machine learning (ML) with an optical frontend toward a compact spectrometer. Our results extend the application field of ML to hybrid physical–digital optimization.
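The Crank–Nicolson scheme mentioned above discretizes the Schrödinger equation as (I + iΔt/2 H)ψ_{n+1} = (I − iΔt/2 H)ψ_n, an update that is exactly unitary for Hermitian H; this norm preservation is the property that makes phase-only training stable. A minimal numpy sketch for a 1D wave packet (grid sizes, units ħ = m = 1, zero potential, and the Gaussian initial condition are illustrative assumptions):

```python
import numpy as np

# Illustrative 1D grid and time step (assumptions; units with hbar = m = 1)
n_x, dx, dt = 200, 0.1, 0.01
x = dx * (np.arange(n_x) - n_x // 2)
V = np.zeros(n_x)  # potential field: the quantity trained in SE-NET (zero here)

# Discrete Hamiltonian H = -(1/2) d2/dx2 + V via central differences
lap = (np.diag(np.full(n_x - 1, 1.0), -1)
       - 2.0 * np.eye(n_x)
       + np.diag(np.full(n_x - 1, 1.0), 1)) / dx**2
H = -0.5 * lap + np.diag(V)

# Crank-Nicolson: (I + i dt/2 H) psi_next = (I - i dt/2 H) psi
A = np.eye(n_x) + 0.5j * dt * H
Bm = np.eye(n_x) - 0.5j * dt * H
step = np.linalg.solve(A, Bm)  # dense for clarity; tridiagonal solvers scale better

# Normalized Gaussian wave packet with momentum k = 2
psi = np.exp(-x**2) * np.exp(1j * 2.0 * x)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

for _ in range(100):
    psi = step @ psi

norm = np.sum(np.abs(psi)**2) * dx
print(norm)  # stays ~1: the Cayley-transform step is unitary for Hermitian H
```

In the SE-NET setting, V (or its phase) would be a trainable parameter and the step would be differentiated with the adjoint method; the sketch only illustrates why a unitarity-preserving scheme keeps the forward propagation, and hence the gradients, well behaved.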