2022
DOI: 10.1109/access.2022.3199372
Network Traffic Prediction Based on LSTM and Transfer Learning

Abstract: The increasing amount of traffic in recent years has led to increasingly complex network problems. To be able to improve overall network performance and increase network utilization, it is valuable to take measures to capture future trends in network traffic. In traditional machine learning, to guarantee the accuracy and high reliability of the models obtained through training, there are two basic assumptions: (1) the training samples used for learning and the new test samples satisfy the condition of independ…

Cited by 18 publications (12 citation statements)
References 19 publications (17 reference statements)
“…GraphSAGE has also been used to model spatial dependency for inductive learning ( Liu et al, 2023 ; Liu, Ong & Chen, 2022 ). Temporal modules based on RNN and its LSTM, as well as GRU, have been introduced to learn temporal dependence ( Pan et al, 2022 ; Subramaniyan et al, 2023 ; Bao et al, 2022 ; Shu, Cai & Xiong, 2022 ; Wan et al, 2022 ). To improve computational efficiency, some studies employed CNN instead of RNN to model temporal correlation ( Ji, Yu & Lei, 2023 ; Zhang et al, 2022 ).…”
Section: Literature Review
confidence: 99%
“…At present, position prediction algorithms are relatively mature; see, for example, [38][39][40][41][42]. Considering both prediction accuracy and the algorithmic complexity affordable at edge nodes, we use a long short-term memory (LSTM) model to predict the position of u i from the historical positions of u i , and we call the predicted point N p (x p , y p ) .…”
Section: Location Prediction
confidence: 99%
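The history-to-next-point setup described in the statement above amounts to sliding a fixed-length window over a trajectory of (x, y) points and pairing each window with the point that follows it. As an illustrative sketch (not the cited authors' code; the function name `make_windows` and the toy straight-line track are assumptions for demonstration), the data preparation could look like this:

```python
import numpy as np

def make_windows(track, n_steps):
    """Turn a trajectory of (x, y) points into (history, next-point) pairs
    suitable for training a sequence model such as an LSTM."""
    X, y = [], []
    for t in range(len(track) - n_steps):
        X.append(track[t:t + n_steps])   # n_steps historical positions
        y.append(track[t + n_steps])     # the point N_p to be predicted
    return np.array(X), np.array(y)

# Toy trajectory along the line y = 2x, ten time steps.
track = np.array([[t, 2.0 * t] for t in range(10)])
X, y = make_windows(track, n_steps=3)
# X has shape (7, 3, 2): 7 windows, 3 historical points each, 2 coordinates.
# Each target y[k] is the position immediately following window X[k].
```

An LSTM trained on such pairs would take the 3-step history as its input sequence and regress the next (x p , y p ).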
“…According to Figure 4, the output gate regulates how cell activations exit into the remainder of the network. The forget gate was later added to the memory block [17]. The forget gate scales the internal state of the cell before it is fed back into the cell as input through the self-recurrent connection [18], [19].…”
Section: Recurrent Neural Network
confidence: 99%
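The gate behaviour described in the statement above can be made concrete with a single LSTM time step: the forget gate scales the previous internal state, and the output gate regulates how much of the cell activation leaves the block. The following is a minimal numpy sketch of the standard LSTM equations (parameter names and the stacked-weight layout are illustrative assumptions, not the cited implementation):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W (4H x D), U (4H x H), and b (4H,) hold the
    stacked parameters for the input, forget, candidate, and output gates."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])            # input gate
    f = sigmoid(z[H:2 * H])        # forget gate
    g = np.tanh(z[2 * H:3 * H])    # candidate cell update
    o = sigmoid(z[3 * H:4 * H])    # output gate
    c = f * c_prev + i * g         # forget gate scales the internal state
    h = o * np.tanh(c)             # output gate regulates what exits the cell
    return h, c

# Run one step with small random parameters (input size 3, hidden size 4).
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.standard_normal((4 * H, D))
U = rng.standard_normal((4 * H, H))
b = np.zeros(4 * H)
h, c = lstm_step(rng.standard_normal(D), np.zeros(H), np.zeros(H), W, U, b)
```

Because h = o * tanh(c) with o in (0, 1) and tanh(c) in (-1, 1), every component of the hidden state stays strictly inside (-1, 1), which is the "regulated exit" the quoted passage refers to.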