2018
DOI: 10.1109/tnnls.2017.2770179
Online Training of LSTM Networks in Distributed Systems for Variable Length Data Sequences

Abstract: In this brief, we investigate online training of long short-term memory (LSTM) architectures in a distributed network of nodes, where each node employs an LSTM-based structure for online regression. In particular, each node sequentially receives a variable-length data sequence with its label and can only exchange information with its neighbors to train the LSTM architecture. We first provide a generic LSTM-based regression structure for each node. In order to train this structure, we put the LSTM equations in …

Cited by 92 publications (47 citation statements)
References 18 publications
“…Within the RNN family, the LSTM is characterized by four key components (the forget gate f_g, the input gate I_g, the output gate O_g, and the cell state). It is widely applied to image, text, and audio analysis, and is used extensively in time-series analysis, transcription, speech recognition, and health records [22]. The major drawback of the RNN model was the vanishing-gradient problem; the LSTM extends the input and output capacity of the RNN to address this issue and uses gated memory to learn sequence vectors.…”
Section: Review of Relevant Work
confidence: 99%
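The four components named above combine into one forward step of an LSTM cell. The following is a minimal numpy sketch of that step, not the cited paper's implementation; the weight layout (one matrix per gate acting on the concatenated previous hidden state and input) and all sizes are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM forward step using the four components from the text:
    forget gate f_g, input gate i_g, output gate o_g, and cell state c_t.
    Each W[k] maps the concatenated [h_prev, x_t] to one gate pre-activation."""
    z = np.concatenate([h_prev, x_t])
    f_g = sigmoid(W["f"] @ z + b["f"])      # forget gate: what to keep of c_prev
    i_g = sigmoid(W["i"] @ z + b["i"])      # input gate: what to write
    o_g = sigmoid(W["o"] @ z + b["o"])      # output gate: what to expose
    c_tilde = np.tanh(W["c"] @ z + b["c"])  # candidate cell update
    c_t = f_g * c_prev + i_g * c_tilde      # new cell state
    h_t = o_g * np.tanh(c_t)                # new hidden state
    return h_t, c_t

# Usage with hypothetical sizes: 3 inputs, 4 hidden units.
rng = np.random.default_rng(0)
n_in, n_h = 3, 4
W = {k: rng.normal(size=(n_h, n_h + n_in)) * 0.1 for k in "fioc"}
b = {k: np.zeros(n_h) for k in "fioc"}
h, c = np.zeros(n_h), np.zeros(n_h)
h, c = lstm_step(rng.normal(size=n_in), h, c, W, b)
print(h.shape, c.shape)  # (4,) (4,)
```

Because h_t passes through tanh and is scaled by a sigmoid gate, every entry stays in (-1, 1), which is part of why the gated cell avoids the vanishing-gradient behavior of a plain RNN.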
“…The LSTM network is a recurrent neural network that enhances the plain RNN and is well suited to processing and predicting time series [13][14][15][16]. In this paper, the LSTM network model is used to correct the carrier trajectory.…”
Section: Track Correcting with LSTM Model
confidence: 99%
“…However, the carrier motion trajectory in the time window close to the current moment is highly correlated with the trajectory at the next moment, so part of the information is retained. The forget gate output is given by equation (15): f_t = σ(W_f · [h_{t-1}, x_t] + b_f).…”
Section: Fig. 4 LSTM Network Cell Structure
confidence: 99%
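The forget-gate expression reconstructed above (the standard LSTM form, since the original equation was garbled in extraction) can be computed directly. The sizes and zero-initialized parameters below are illustrative assumptions, not values from the cited paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forget_gate(W_f, h_prev, x_t, b_f):
    """f_t = sigmoid(W_f . [h_{t-1}, x_t] + b_f): each entry in (0, 1)
    scales how much of the previous cell state survives."""
    return sigmoid(W_f @ np.concatenate([h_prev, x_t]) + b_f)

# Hypothetical sizes: 2 hidden units, 3 inputs; zero parameters for clarity.
W_f = np.zeros((2, 5))
b_f = np.zeros(2)
f_t = forget_gate(W_f, np.zeros(2), np.zeros(3), b_f)
print(f_t)  # zero pre-activation, so sigmoid gives 0.5 per unit: [0.5 0.5]
```

An f_t near 1 preserves the old cell state (long memory); near 0 it discards it, which matches the text's point that recent, highly correlated trajectory information should be retained.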
“…In batch training, all of the data is available and is processed together. However, in big-data applications, keeping all of the data in one place causes various storage problems [4]. In addition, in many applications the data arrives sequentially, which precludes batch training.…”
Section: unclassified
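The batch-versus-sequential contrast above is the motivation for online training: each sample is used for an immediate update and then discarded, so the full dataset never has to reside in one place. Below is a generic least-mean-squares sketch of that idea for a linear regressor, purely illustrative; it is not the distributed LSTM algorithm of the cited paper, and the learning rate and data are arbitrary assumptions.

```python
import numpy as np

def online_sgd(stream, n_features, lr=0.1):
    """Online (sequential) training: one gradient step per arriving sample.
    Contrast with batch training, which would require the whole dataset at once."""
    w = np.zeros(n_features)
    for x_t, y_t in stream:        # samples arrive one at a time
        err = w @ x_t - y_t        # prediction error on the current sample
        w -= lr * err * x_t        # update immediately, then discard x_t
    return w

# Synthetic noiseless stream generated from a known weight vector.
rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])
X = rng.normal(size=(500, 2))
y = X @ true_w
w = online_sgd(zip(X, y), n_features=2)
print(np.round(w, 2))  # converges near true_w = [2, -1]
```

The same one-sample-at-a-time structure is what the surveyed work applies to LSTM parameters, with the added constraint that nodes may only exchange information with their neighbors.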