2022
DOI: 10.3390/app12105216
Sigmoid Activation Implementation for Neural Networks Hardware Accelerators Based on Reconfigurable Computing Environments for Low-Power Intelligent Systems

Abstract: The remarkable results of applying machine learning algorithms to complex tasks are well known. They open wide opportunities in natural language processing, image recognition, and predictive analysis. However, their use in low-power intelligent systems is restricted because of high computational complexity and memory requirements. This group includes a wide variety of devices, from smartphones and Internet of Things (IoT) smart sensors to unmanned aerial vehicles (UAVs), self-driving cars, and nodes of Edge Com…

Cited by 3 publications (7 citation statements) · References 28 publications
“…The literature review allowed us to identify the following set of operations: "signal source" (SRC), "signal transfer" (TRS), "multiply and accumulate" (MAC), "parametric ReLU" (PRL), "maximum" (MAX), "minimum" (MIN), "gate" (GAT), "union" (U), "delay" (DEL), and "block" (BLK) (Figure 3). These operations are sufficient for implementing the key layers of neural networks (dense, convolution, pooling) as well as activations (sigmoid, tanh, ELU, and others) [21,23–25]. This paper focuses on the implementation of SoftMax activation.…”
Section: Implementation of Neural Network in RCE (citation type: mentioning, confidence: 99%)
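As a rough illustration of how such an operation set can cover a key layer, here is a minimal Python sketch of a dense layer expressed as chains of MAC operations finished by a PRL activation. The function names mac, prelu, and dense_layer are hypothetical, introduced only for this example, and the mapping shown is an assumption about how the primitives compose, not the paper's own implementation:

```python
def mac(acc, w, x):
    # "multiply and accumulate" (MAC): acc + w * x, the core primitive
    return acc + w * x

def prelu(x, alpha=0.01):
    # "parametric ReLU" (PRL) activation primitive
    return x if x >= 0 else alpha * x

def dense_layer(inputs, weights, biases, alpha=0.01):
    # Each output neuron is one chain of MAC operations over its
    # weight row, finished by a single PRL activation.
    outputs = []
    for w_row, b in zip(weights, biases):
        acc = b
        for w, x in zip(w_row, inputs):
            acc = mac(acc, w, x)
        outputs.append(prelu(acc, alpha))
    return outputs

# Example: a 2-neuron dense layer over 3 inputs
print(dense_layer([1.0, -2.0, 0.5],
                  [[0.1, 0.2, 0.3], [-0.4, 0.5, 0.6]],
                  [0.0, 0.1]))
```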
“…We briefly discuss the implementation of PLA (piecewise linear approximation) in the RCE using the example of the sigmoid activation described in detail in the paper [25]. This implementation is based on approximation with equal subranges: the entire approximation range is divided into several subranges of equal width.…”
Section: Implementation of the Exponential Function (citation type: mentioning, confidence: 99%)
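The equal-subrange scheme lends itself to a cheap lookup: because all subranges share the same width, the segment index follows from a single scale-and-truncate rather than a search. A minimal Python/NumPy sketch of this idea follows; build_pla_table and pla_eval are illustrative names, and the range [-8, 8] with 16 segments is an assumed configuration, not one taken from the paper:

```python
import numpy as np

def build_pla_table(f, lo, hi, n_segments):
    # Precompute slope/intercept pairs for a piecewise linear
    # approximation of f over [lo, hi] with equal-width subranges.
    xs = np.linspace(lo, hi, n_segments + 1)
    ys = f(xs)
    slopes = (ys[1:] - ys[:-1]) / (xs[1:] - xs[:-1])
    intercepts = ys[:-1] - slopes * xs[:-1]
    return slopes, intercepts

def pla_eval(x, lo, hi, slopes, intercepts):
    # Equal widths make segment selection a single scaled
    # subtraction plus truncation; inputs outside [lo, hi]
    # are clamped to the boundary segments.
    n = len(slopes)
    x = np.clip(x, lo, hi)
    idx = np.minimum(((x - lo) / (hi - lo) * n).astype(int), n - 1)
    return slopes[idx] * x + intercepts[idx]

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
slopes, intercepts = build_pla_table(sigmoid, -8.0, 8.0, 16)
x = np.linspace(-10.0, 10.0, 5)
print(pla_eval(x, -8.0, 8.0, slopes, intercepts))
print(sigmoid(x))  # compare against the exact sigmoid
```

The trade-off of the equal-width partition is that segment selection needs only one subtraction, one multiplication, and a truncation, which suits low-power hardware, at the cost of possibly requiring more segments than a non-uniform partition would for the same accuracy.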