2022
DOI: 10.3390/app122312478
FNNS: An Effective Feedforward Neural Network Scheme with Random Weights for Processing Large-Scale Datasets

Abstract: The size of datasets is growing exponentially as information technology advances, and it is becoming increasingly important to provide efficient learning algorithms that let neural networks handle massive amounts of data. Due to their potential for handling huge datasets, feed-forward neural networks with random weights (FNNRWs) have drawn a lot of attention. In this paper, we introduce an efficient feed-forward neural network scheme (FNNS) for processing massive datasets with random weights. The FNNS divides la…
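The abstract is truncated and does not spell out the algorithm, but the general FNNRW idea it builds on is well established: hidden-layer weights are drawn at random and frozen, and only the output weights are trained, typically by regularized least squares, which is what keeps training cheap on large data. The sketch below illustrates that generic pattern in NumPy; the class name, uniform initialization, sigmoid activation, and ridge regularization are illustrative assumptions, not the paper's specific FNNS scheme.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class RandomWeightFNN:
    """Feed-forward network with random, fixed hidden weights.

    Only the output weights are trained, via regularized least
    squares; the hidden layer is never updated (generic FNNRW/ELM
    pattern, not necessarily the paper's FNNS).
    """

    def __init__(self, n_inputs, n_hidden, reg=1e-3, seed=0):
        rng = np.random.default_rng(seed)
        # Hidden weights and biases are drawn once and frozen.
        self.W = rng.uniform(-1.0, 1.0, size=(n_inputs, n_hidden))
        self.b = rng.uniform(-1.0, 1.0, size=n_hidden)
        self.reg = reg
        self.beta = None  # output weights, solved in fit()

    def _hidden(self, X):
        return sigmoid(X @ self.W + self.b)

    def fit(self, X, y):
        H = self._hidden(X)  # (n_samples, n_hidden)
        # Ridge-regularized normal equations: (H^T H + reg*I) beta = H^T y
        A = H.T @ H + self.reg * np.eye(H.shape[1])
        self.beta = np.linalg.solve(A, H.T @ y)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Toy usage: fit a noisy 1-D regression problem.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
model = RandomWeightFNN(n_inputs=1, n_hidden=100).fit(X, y)
print("train MSE:", np.mean((model.predict(X) - y) ** 2))
```

Because the hidden layer is fixed, training reduces to one linear solve per data chunk, which is why such schemes scale to large datasets.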

Cited by 11 publications (10 citation statements) · References 35 publications
“…RNNs are designed specifically to handle sequential data, where the timing and order of the information are crucial. Recurrent connections, not present in FNNs or CNNs, enable information to pass from input to output and through loops that feed data from earlier time steps into the current step [65,83]. This qualifies them for jobs like time series analysis, speech recognition, and natural language processing.…”
Section: Recurrent Neural Network
confidence: 99%
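To make the recurrence the citing authors describe concrete, here is a minimal vanilla (Elman) RNN forward pass: the hidden state carries information from earlier time steps into the current step through the recurrent weight matrix, a connection plain feed-forward networks lack. Shapes and initialization below are illustrative assumptions.

```python
import numpy as np

def rnn_forward(X, W_xh, W_hh, b_h):
    """Run a vanilla RNN over a sequence X of shape (T, n_in).

    The hidden state h feeds back into the next step via W_hh --
    the recurrent connection absent from FNNs and CNNs.
    """
    n_hidden = b_h.shape[0]
    h = np.zeros(n_hidden)
    hs = []
    for t in range(X.shape[0]):
        # Current input AND previous hidden state drive the update.
        h = np.tanh(X[t] @ W_xh + h @ W_hh + b_h)
        hs.append(h)
    return np.stack(hs)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))          # 5 time steps, 3 features
W_xh = rng.standard_normal((3, 4)) * 0.5
W_hh = rng.standard_normal((4, 4)) * 0.5
b_h = np.zeros(4)
print(rnn_forward(X, W_xh, W_hh, b_h).shape)  # (5, 4)
```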
“…This qualifies them for jobs like time series analysis, speech recognition, and natural language processing. RNNs can recognize temporal connections and context in the data since they have a memory of past inputs [83]. However, due to problems with vanishing gradients in lengthy sequences, advanced variants, such as Long Short-Term Memory (LSTM) [84] and Gated Recurrent Unit (GRU) networks, have been developed.…”
Section: Recurrent Neural Network
confidence: 99%
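As a concrete companion to the gating idea in the statement above, below is a single GRU step in NumPy. The update gate z can keep the new state close to the previous one, giving a near-identity path through time that mitigates the vanishing-gradient problem of plain RNNs on long sequences. Biases are omitted for brevity, and all parameter names and scales are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x, h_prev, P):
    """One GRU update (biases omitted for brevity)."""
    z = sigmoid(x @ P["Wz"] + h_prev @ P["Uz"])               # update gate
    r = sigmoid(x @ P["Wr"] + h_prev @ P["Ur"])               # reset gate
    h_tilde = np.tanh(x @ P["Wh"] + (r * h_prev) @ P["Uh"])   # candidate state
    # z blends old state and candidate; z near 0 preserves h_prev.
    return (1.0 - z) * h_prev + z * h_tilde

rng = np.random.default_rng(0)
n_in, n_h = 3, 4
P = {k: rng.standard_normal((n_in if k[0] == "W" else n_h, n_h)) * 0.5
     for k in ["Wz", "Uz", "Wr", "Ur", "Wh", "Uh"]}
h = np.zeros(n_h)
for t in range(5):            # run over a short random sequence
    h = gru_step(rng.standard_normal(n_in), h, P)
print(h.shape)                # (4,)
```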