2022
DOI: 10.1007/s10489-022-03783-y
Optimization of the structural complexity of artificial neural network for hardware-driven neuromorphic computing application

Abstract: This work focuses on the optimization of the structural complexity of a single-layer feedforward neural network (SLFN) for neuromorphic hardware implementation. The singular value decomposition (SVD) method is used for the determination of the effective number of neurons in the hidden layer for Modified National Institute of Standards and Technology (MNIST) dataset classification. The proposed method is also verified on a SLFN using weights derived from a synaptic transistor device. The effectiveness of this m…
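The abstract describes using SVD to find the effective number of hidden neurons. The paper's exact criterion is not given in this snippet, so the following is only an illustrative sketch: it estimates an "effective rank" from the singular-value spectrum of a hidden-layer weight matrix, using a 95% spectral-energy cutoff as an assumed threshold.

```python
import numpy as np

def effective_rank(W, energy=0.95):
    """Estimate the effective number of hidden neurons from the
    singular-value spectrum of a weight matrix W.
    The 0.95 energy threshold is an illustrative assumption,
    not a value taken from the paper."""
    s = np.linalg.svd(W, compute_uv=False)        # singular values, descending
    cum = np.cumsum(s**2) / np.sum(s**2)          # cumulative spectral energy
    return int(np.searchsorted(cum, energy) + 1)  # smallest k reaching the threshold

# Toy example: a 784x100 matrix constructed to have intrinsic rank 10
rng = np.random.default_rng(0)
W = rng.standard_normal((784, 10)) @ rng.standard_normal((10, 100))
print(effective_rank(W))  # at most the true rank of 10
```

In this sketch, hidden neurons beyond the effective rank contribute little to the layer's representational capacity and are candidates for pruning before hardware mapping.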

Cited by 12 publications (13 citation statements)
References 53 publications
“…The error in the output neuron prediction is then computed using the mean cross-entropy loss function, which is commonly used as the cost function in multiclass logistic regression problems. 51 The computed errors are then backpropagated to adjust the weight values using the stochastic gradient descent algorithm. 52 This sequence of events is repeated … the requirement of 6σ separation between conductance states is impractical from a device point of view.…”
Section: Results and Discussion
Confidence: 99%
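The citing passage above describes a standard training loop: mean cross-entropy loss at the output neurons, with errors backpropagated via stochastic gradient descent. A minimal sketch of one such update for a softmax output layer (the cited papers' exact hyperparameters and architecture are not given here, so this is generic):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def sgd_step(W, b, X, Y, lr=0.1):
    """One SGD update of a softmax output layer under mean cross-entropy.
    Shapes: X is (n, d), Y is one-hot (n, c), W is (d, c), b is (c,)."""
    P = softmax(X @ W + b)                                   # predicted probabilities
    loss = -np.mean(np.sum(Y * np.log(P + 1e-12), axis=1))   # mean cross-entropy
    G = (P - Y) / X.shape[0]                                 # gradient w.r.t. logits
    W -= lr * X.T @ G                                        # backpropagated weight update
    b -= lr * G.sum(axis=0)
    return loss
```

Repeating `sgd_step` over mini-batches is the "sequence of events" the passage refers to; on device-derived weights the same update rule applies, subject to the conductance-state constraints discussed above.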
“…The SVHN data set consists of color (RGB) images of house numbers extracted from Google Street View photos at 32 × 32 resolution. The SVHN data set is more complex and multichannel compared with popular greyscale image data sets such as the Modified National Institute of Standards and Technology (MNIST) data set and the Fashion-MNIST (FMNIST) data set, which have been used extensively in previous reports. We have specifically chosen the SVHN data set in order to evaluate the synaptic device performance on a more practical data set with complex input data representations, as observed in real-world applications.…”
Section: Results
Confidence: 99%
“…Apart from the pattern recognition accuracy, the capability of the device-based ANN model to differentiate the different output classes with high statistical confidence is equally important for pattern recognition tasks. [43] Hence, we have used a confusion matrix to evaluate the software-based (Figure 7d) and device-based (Figure 7e) class-distinction performance of the ANN. Confusion matrices are square matrices with the true output class labels along the rows and the model-predicted class labels along the columns.…”
Section: Results
Confidence: 99%
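The confusion-matrix convention described above (true labels on rows, predicted labels on columns) can be sketched directly:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Square matrix with true class labels along the rows and
    model-predicted labels along the columns, as described above."""
    M = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        M[t, p] += 1
    return M

# A perfect classifier produces a purely diagonal matrix
print(confusion_matrix([0, 1, 2, 1], [0, 1, 2, 1], 3))
```

Off-diagonal entries count misclassifications, so strong class distinction shows up as mass concentrated on the diagonal.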
“…We observe that the device-based FCN exhibits a high capability for output class separation, which is seen from the high values of the diagonal elements and the extremely low values of the off-diagonal elements. Further, we also investigated the effect of weight-level quantization [47,48] on the pattern recognition accuracy. Reducing the number of synaptic weight levels while maintaining similar performance can improve neuromorphic chip area and energy efficiency.…”
Section: Results
Confidence: 99%
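Weight-level quantization as discussed above maps continuous trained weights onto a small set of discrete conductance-like states. A minimal sketch, assuming uniform level spacing (the cited schemes [47,48] may place levels differently):

```python
import numpy as np

def quantize_weights(W, n_levels):
    """Uniformly quantize weights onto n_levels discrete states
    spanning [W.min(), W.max()]. Uniform spacing is an assumption,
    not necessarily the scheme used in the cited references."""
    lo, hi = W.min(), W.max()
    levels = np.linspace(lo, hi, n_levels)                       # discrete states
    idx = np.round((W - lo) / (hi - lo) * (n_levels - 1)).astype(int)
    return levels[idx]                                           # snap to nearest level

W = np.array([-1.0, -0.3, 0.2, 1.0])
print(quantize_weights(W, 5))  # each weight snaps to the nearest of 5 levels
```

Sweeping `n_levels` downward while monitoring test accuracy reproduces the kind of area/energy-versus-accuracy trade-off study the passage describes.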