Proceedings 13th International Symposium on System Synthesis
DOI: 10.1109/isss.2000.874054

Artificial neural network implementation on a single FPGA of a pipelined on-line backpropagation

Abstract: The paper describes the implementation of a systolic array for a multilayer perceptron on a Virtex XCV400 FPGA with a hardware-friendly learning algorithm. A pipelined adaptation of the on-line backpropagation algorithm is shown. Parallelism is better exploited because both forward and backward phases can be performed simultaneously. We can implement very large interconnection layers by using large Xilinx devices with embedded memories alongside the projection used in the systolic architecture. These physi…
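For orientation, here is a minimal NumPy sketch of the idea the abstract describes: the backward phase of one training pattern is applied while the next pattern is already in the forward phase. This is a software illustration only, not the paper's systolic/FPGA design; all names (forward, backward, in_flight) are invented for the example.

import numpy as np

# Minimal sketch of pipelined on-line backpropagation (hypothetical names,
# plain NumPy, not the paper's systolic architecture). In software the two
# stages run one after the other; the point is that the update for pattern
# t-1 lands only after pattern t's forward pass has already consumed the
# weights -- the one-step weight staleness that pipelined hardware incurs
# in exchange for keeping both phases busy every cycle.

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.5, (4, 2))   # hidden layer weights (4 units, 2 inputs)
W2 = rng.normal(0, 0.5, (1, 4))   # output layer weights
lr = 0.1

def forward(x):
    h = np.tanh(W1 @ x)
    y = W2 @ h
    return h, y

def backward(x, h, y, target):
    # Squared-error gradients for the stored (possibly stale) activations.
    e = y - target
    dW2 = np.outer(e, h)
    dh = (W2.T @ e) * (1 - h**2)
    dW1 = np.outer(dh, x)
    return dW1, dW2

# XOR-style toy stream of (input, target) patterns.
patterns = [(np.array([a, b], float), np.array([float(a ^ b)]))
            for a in (0, 1) for b in (0, 1)]

in_flight = None  # pattern sitting between the forward and backward stage
for step in range(2000):
    x, t = patterns[step % 4]
    h, y = forward(x)              # forward stage: current pattern
    if in_flight is not None:      # backward stage: previous pattern
        dW1, dW2 = backward(*in_flight)
        W1 -= lr * dW1
        W2 -= lr * dW2
    in_flight = (x, h, y, t)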

Cited by 63 publications (24 citation statements)
References 10 publications
“…This implies that in deep networks, the weight updates must be disabled for durations that scale with the number of layers. We note that this problem, also referred to as "update locking", is also faced in standard implementations, and can be solved by pipelining the forward and backward passes (Gadea et al., 2000) or by estimating error gradients before they are computed using the output layer (Jaderberg et al., 2016; Czarnecki et al., 2017). The latter method is compatible with eRBP in principle, as the authors demonstrated it using feedback alignment, and can provide a natural solution to this problem.…”
Section: eRBP Learning Dynamics
Mentioning confidence: 99%
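The excerpt names two remedies for update locking. The pipelining remedy is sketched under the abstract above; below is a minimal, hypothetical NumPy sketch of the other one, synthetic gradients in the spirit of Jaderberg et al. (2016): a small linear module M predicts the hidden layer's error gradient from its own activation, so that layer can update before the true gradient has propagated back to it. The variable names and the linear predictor are assumptions for illustration, not the cited paper's architecture.

import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(0, 0.5, (4, 2))
W2 = rng.normal(0, 0.5, (1, 4))
M  = np.zeros((4, 4))            # linear synthetic-gradient predictor
lr, lr_m = 0.05, 0.01

for step in range(3000):
    a, b = rng.integers(0, 2, 2)
    x = np.array([a, b], float)
    target = np.array([float(a ^ b)])

    h = np.tanh(W1 @ x)
    g_hat = M @ h                     # predicted dL/dh: no update locking
    W1 -= lr * np.outer(g_hat * (1 - h**2), x)

    y = W2 @ h                        # later, the true gradient arrives...
    e = y - target
    g_true = W2.T @ e
    W2 -= lr * np.outer(e, h)
    M  -= lr_m * np.outer(g_hat - g_true, h)  # ...and trains the predictor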
“…Most of the reported ones are custom ASIC implementations such as the GRD chip by Murakawa et al. [61], the on-chip backpropagation implementation of Ayala et al. [15], CNAPS by Hammerstrom [62], MY-NEUPOWER by Sato et al. [63], and FPNA by Farquhar et al. [66]. FPGA-based implementations of on-chip training algorithms have also been reported, such as the backpropagation algorithm implementations in [48,49,57,58]. An online trainable implementation of hyperbasis function networks has been reported in [60].…”
Section: Discussion
Mentioning confidence: 99%
“…Hadley et al. improved the approach of Eldredge by using partial reconfiguration of FPGAs instead of full-chip runtime reconfiguration [57]. Gadea et al. demonstrated a pipelined implementation of the backpropagation algorithm in which the forward and backward passes of the algorithm can be processed in parallel on different training patterns, thus increasing the throughput [58]. Ayala et al. demonstrated an ASIC implementation of MLPs with on-chip backpropagation training, using floating-point representation for real values and corresponding dedicated floating-point hardware [15].…”
Section: Design
Mentioning confidence: 99%
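As a rough illustration of the throughput claim, a back-of-envelope cycle model (assumptions: one cycle per pipeline stage, S stages covering the combined forward and backward path; the function name and numbers are hypothetical):

# Back-of-envelope throughput comparison for processing P training
# patterns through S one-cycle stages, sequentially vs. pipelined.
def cycles(num_patterns: int, stages: int, pipelined: bool) -> int:
    if pipelined:
        # Fill the pipeline once, then retire one pattern per cycle.
        return stages + num_patterns - 1
    # Sequential: each pattern occupies the whole datapath before the next.
    return stages * num_patterns

P, S = 10_000, 8
print(cycles(P, S, False))  # 80000 cycles sequentially
print(cycles(P, S, True))   # 10007 cycles pipelined: roughly 8x here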
“…ANN-based FPGA system designs specify the architecture of the ANN at a symbolic level. This level allows the use of VHDL, which stands for VHSIC (Very High Speed Integrated Circuit) Hardware Description Language [7]. VHDL allows many levels of abstraction and permits accurate description of electronic components ranging from simple logic gates to microprocessors.…”
Section: Field Programmable Gate Array and Very High Hardware De…
Mentioning confidence: 99%