Volume 12, Issue 3, p. 117, 2021
DOI: 10.3390/info12030117
Uche Onyekpe, Vasile Palade, Stratis Kanarachos, Stavros-Richard Christopoulos

Abstract: Recurrent Neural Networks (RNNs) are known for their ability to learn relationships within temporal sequences. Gated Recurrent Unit (GRU) networks have found use in challenging time-dependent applications, such as Natural Language Processing (NLP), financial analysis and sensor fusion, owing to their capability to cope with the vanishing gradient problem. GRUs are also known to be more computationally efficient than the Long Short-Term Memory (LSTM) network, of which they are a simplified variant, due to their less complex structure…
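To illustrate the "less complex structure" claim, the sketch below implements a single GRU step from the standard update equations (Cho et al., 2014): the GRU maintains one hidden state and two gates (update and reset), whereas the LSTM carries a separate cell state and three gates. This is a minimal NumPy illustration, not code from the paper; all names (gru_cell, W_z, U_z, etc.) are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, W_z, U_z, b_z, W_r, U_r, b_r, W_h, U_h, b_h):
    """One GRU time step: two gates vs. the LSTM's three gates plus cell state."""
    z = sigmoid(W_z @ x + U_z @ h_prev + b_z)                # update gate
    r = sigmoid(W_r @ x + U_r @ h_prev + b_r)                # reset gate
    h_tilde = np.tanh(W_h @ x + U_h @ (r * h_prev) + b_h)   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde                  # interpolate old/new state

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_in, n_hid = 4, 8                                       # toy dimensions
    shapes = [(n_hid, n_in), (n_hid, n_hid), (n_hid,)] * 3   # Wz,Uz,bz, Wr,Ur,br, Wh,Uh,bh
    params = [rng.standard_normal(s) * 0.1 for s in shapes]
    h = np.zeros(n_hid)
    for _ in range(5):                                       # unroll over a short sequence
        h = gru_cell(rng.standard_normal(n_in), h, *params)
    print(h.shape)                                           # (8,)
```

Counting parameters makes the efficiency argument concrete: per hidden unit, the GRU trains three weight/bias groups against the LSTM's four, so roughly 25% fewer parameters for the same hidden size.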
