2021
DOI: 10.1038/s41467-021-25427-4
Deep neural networks using a single neuron: folded-in-time architecture using feedback-modulated delay loops

Abstract: Deep neural networks are among the most widely applied machine learning tools showing outstanding performance in a broad range of tasks. We present a method for folding a deep neural network of arbitrary size into a single neuron with multiple time-delayed feedback loops. This single-neuron deep neural network comprises only a single nonlinearity and appropriately adjusted modulations of the feedback signals. The network states emerge in time as a temporal unfolding of the neuron’s dynamics. By adjusting the f…

Cited by 40 publications (23 citation statements)
References 44 publications
“…This multi-classification ANN model, after being given a training set, extracted and classified the dynamic behavioral features of multiple touches, and then calculated the output of each network node in the neural network through the feed-forward back-propagation (BPP) algorithm for each training set. In practical application, the feed-forward BPP algorithm was mainly used in supervised learning technology of neural network due to its ability of general approximation and simple design [42]. The supervised learning provided the neural network with input data and output result data, and the weight was updated iteratively to reduce the difference between the actual output value and the expected output value.…”
Section: Results (mentioning)
confidence: 99%
“…Although time delays have been introduced to neural networks in both discrete time (Waibel et al (1989); Zhang et al (2022)) and continuous time (Marcus and Westervelt (1989); Stelzer et al (2021)), these delays were fixed rather than treated as parameters to be learned. In this work, we consider a network with trainable time delays, a concept that was originally proposed in Ji et al (2020).…”
Section: Introduction (mentioning)
confidence: 99%
“…In this comment, we want to draw attention to recent studies published in Nature Communications 1 , 2 , which connect elements of this hierarchy of methods and facilitate the transfer of knowledge between research fields.…”
mentioning
confidence: 99%
“…In 2 the authors take this idea of using only a single physical node with delay and extend it to emulate deep neural networks, an approach which they have coined Folded-in-time Deep Neural Network (Fit-DNN). This is achieved by having multiple delay loops with adjustable feedback strengths, while the physical node supplies the nonlinearity (see Fig.…”
mentioning
confidence: 99%
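The delay-loop mechanism quoted above can be illustrated in discrete time: each feedback loop contributes a delayed (shifted) copy of the previous layer's virtual-node states, scaled by its modulation signal, and one shared nonlinearity is applied to the sum, so successive "layers" unfold sequentially from a single neuron. The following is a minimal sketch, not the paper's implementation: it assumes `tanh` as the nonlinearity, fixed random vectors standing in for the trainable feedback modulations, and circular shifts approximating the delay-induced coupling; the name `fit_dnn_layer` is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_dnn_layer(x, shifts, modulations, f=np.tanh):
    """One emulated layer of a folded-in-time network (sketch).

    x           : states of the N virtual nodes of the previous layer
    shifts      : one integer offset per delay loop, i.e. the loop's
                  delay relative to one layer interval
    modulations : one length-N weight vector per delay loop, playing
                  the role of the feedback-modulation signals
    f           : the single shared nonlinearity
    """
    # each delay loop contributes a shifted, modulated copy of x
    pre = sum(m * np.roll(x, s) for s, m in zip(shifts, modulations))
    return f(pre)

# emulate three hidden layers with 5 virtual nodes and 3 delay loops
n_nodes, shifts = 5, [-1, 0, 1]
modulations = [rng.normal(scale=0.5, size=n_nodes) for _ in shifts]
x = rng.normal(size=n_nodes)   # states produced by the input layer
for _ in range(3):             # layers unfold one after another in time
    x = fit_dnn_layer(x, shifts, modulations)
```

In the Fit-DNN itself a single physical neuron produces these states one at a time, and the modulation signals are the parameters adjusted during training; here they are fixed random vectors purely to show the data flow.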