2020
DOI: 10.1109/access.2020.2995511
Deep Q-Learning for Routing Schemes in SDN-Based Data Center Networks

Abstract: To adapt to the rapid development of cloud computing, big data, and other technologies, combining data center networks with SDN has been proposed to make network management more convenient and flexible. Building on this advantage, routing strategies have been studied extensively. However, the strategies in the controller rely mainly on manual design, so optimal solutions are difficult to obtain in a dynamic network environment. Therefore, strategies based on artificial intelligence (AI) …
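The core idea in the abstract, an agent learning routing decisions by trial and error, can be sketched with minimal tabular Q-learning. Note that the paper itself uses deep Q-learning with a neural approximator; the toy topology, link delays, and hyperparameters below are purely illustrative assumptions, not values from the paper:

```python
import random

# Hypothetical 4-node topology as an adjacency list with link delays (ms).
# Node names and delay values are invented for illustration.
topology = {
    "s1": {"s2": 2.0, "s3": 5.0},
    "s2": {"s1": 2.0, "s4": 2.0},
    "s3": {"s1": 5.0, "s4": 1.0},
    "s4": {},  # destination switch
}
DEST = "s4"
ALPHA, GAMMA, EPSILON, EPISODES = 0.5, 0.9, 0.1, 2000

# Q[state][action]: learned value of forwarding from `state` to neighbour `action`.
Q = {node: {nbr: 0.0 for nbr in nbrs} for node, nbrs in topology.items()}

random.seed(0)
for _ in range(EPISODES):
    node = "s1"
    while node != DEST:
        nbrs = list(Q[node])
        if random.random() < EPSILON:                      # explore
            action = random.choice(nbrs)
        else:                                              # exploit
            action = max(nbrs, key=lambda a: Q[node][a])
        reward = -topology[node][action]                   # lower delay = higher reward
        future = max(Q[action].values(), default=0.0)
        Q[node][action] += ALPHA * (reward + GAMMA * future - Q[node][action])
        node = action

def greedy_path(src):
    """Follow the highest-valued action at each hop to the destination."""
    path, node = [src], src
    while node != DEST:
        node = max(Q[node], key=Q[node].get)
        path.append(node)
    return path

print(greedy_path("s1"))  # the low-delay route via s2 after training
```

In the deep variant the table `Q` is replaced by a neural network that maps a network-state observation to per-action values, which is what lets the approach scale beyond toy topologies.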

Cited by 54 publications (21 citation statements)
References 19 publications
“…To the best of our knowledge, our study is the first to directly represent flow routing in the state-action space of reinforcement learning. For completeness we note that deep reinforcement learning has been applied to a wide variety of other communication network problems, including distributed routing [58], [59], congestion control [60], data center networks [61], wireless network routing [62]–[71], vehicular ad hoc network routing [72], [73], optical networking [74]–[76], caching [77], and mobile edge computing [78], [79]. We also note that a preprocessing approach for efficiently representing virtual network embeddings for subsequent algorithm processing has been examined in [80].…”
Section: Review of Related Work
confidence: 99%
“…al. [38] presented a routing strategy based on deep Q-learning (DQL) to autonomously generate optimal routing paths for SDN-based data center networks. However, they aimed to provide different quality-of-service guarantees for the mice-flows and elephant-flows designated in a data center network.…”
Section: Related Work
confidence: 99%
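The mice-flow versus elephant-flow distinction mentioned in this citation is commonly implemented as a byte-count threshold followed by per-class path selection. The sketch below is a generic illustration of that heuristic, not the cited paper's mechanism; the 100 KB threshold, path names, and path metrics are assumed values:

```python
# Illustrative threshold split: flows at or above 100 KB are treated as
# elephant-flows. The cut-off is a common heuristic, not a value from the paper.
ELEPHANT_BYTES = 100 * 1024

def classify(flow_bytes):
    """Return 'elephant' for large, bandwidth-hungry flows, else 'mice'."""
    return "elephant" if flow_bytes >= ELEPHANT_BYTES else "mice"

# Hypothetical candidate paths with (latency_ms, free_bandwidth_mbps).
paths = {
    "p1": (1.0, 100.0),   # short, narrow path
    "p2": (4.0, 900.0),   # longer, wide path
}

def pick_path(flow_bytes):
    # Mice-flows prefer the lowest-latency path; elephant-flows the widest one,
    # giving each class a different quality-of-service treatment.
    if classify(flow_bytes) == "mice":
        return min(paths, key=lambda p: paths[p][0])
    return max(paths, key=lambda p: paths[p][1])

print(pick_path(2_000))      # 'p1' (mice-flow -> low latency)
print(pick_path(5_000_000))  # 'p2' (elephant-flow -> high bandwidth)
```

In an SDN setting this logic would live in the controller, which can read per-flow byte counters from the switches and install the chosen path as flow rules.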
“…In the control plane, the well-established Ryu controller [39] is adopted. There are three modules in the controller, topology discovery, period monitor, and an ANN model, which are used to explore … In recent years, several routing mechanisms [32]–[38] combined with AI technology have been proposed to enhance the ability to learn from past experience and make smart route decisions, thereby improving overall network performance. First of all, the works in [33]–[35] introduced AI technology into routing protocols for wireless sensor networks in order to improve the energy consumption of each node.…”
Section: System Architecture
confidence: 99%
“…Also, the use of RL has been reported in application areas such as energy-efficient routing in SDN-enabled networks. Other studies have considered the use of RL in addressing routing with QoS guarantees [122], [131]. However, such studies are based on traditional table-based agents, which cannot support the need for efficient solutions under unseen network topology states [109].…”
Section: Analysis of Selected Load Balancing and Energy-Efficient …
confidence: 99%