2009
DOI: 10.1002/net.20311

k‐Splittable delay constrained routing problem: A branch‐and‐price approach

Abstract: Routing problems that include QoS-based path control play a key role in broadband communication networks. We analyze here an algorithmic procedure, based on a branch-and-price algorithm and on the flow deviation method, to solve a nonlinear k-splittable flow problem. The model can support end-to-end delay bounds on each path, and we compare the behavior of the algorithm with and without these constraints. The trade-off between QoS guarantees and CPU time is clearly established, and we show that minimizing the …
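For orientation, the sketch below gives a generic path-based formulation of a delay-constrained k-splittable flow problem in the spirit of the abstract. It is a reconstruction, not the paper's exact model; all symbols (x_P, f_e, c_e, d_q, Δ_q, τ_e) are introduced here purely for illustration.

```latex
% Reconstruction of a delay-constrained k-splittable flow model (illustrative,
% not the paper's exact formulation). x_P: flow on path P; f_e: load on arc e;
% c_e: arc capacity; d_q: demand of commodity q; \Delta_q: end-to-end delay
% bound; \tau_e: per-arc delay function.
\begin{align*}
\min\; & \sum_{e \in E} \frac{f_e}{c_e - f_e}
  && \text{(Kleinrock-type nonlinear delay objective)}\\
\text{s.t.}\; & \sum_{P \in \mathcal{P}_q} x_P = d_q && \forall q\\
& f_e = \sum_{q} \sum_{P \in \mathcal{P}_q:\, e \in P} x_P < c_e && \forall e \in E\\
& \bigl|\{P \in \mathcal{P}_q : x_P > 0\}\bigr| \le k && \forall q
  \quad \text{($k$-splittability)}\\
& \sum_{e \in P} \tau_e(f_e) \le \Delta_q && \forall q,\ \forall P \in \mathcal{P}_q \text{ with } x_P > 0
\end{align*}
```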

Cited by 7 publications (3 citation statements, all mentioning), 2012–2020.
References 11 publications.
“…For what concerns RF, we used the Python scikit-learn implementation, and the following space of hyper-parameters was explored through grid search: number of estimators between 32 and 64, max depth between 32 and 48, and max number of features to consider when looking for the best split between 64 and 192. As for NN, we used the PyTorch framework and considered the following ranges for the hyper-parameters: number of hidden layers in [1, 20], number of neurons per layer in [192, 264] (we use the empirical rule of having a number of neurons approximately equal to the number of input features, or equal to the square root of the product between input features and output targets), learning rate in [10⁻³, 5 × 10⁻³], and regularization rate λ in [10⁻⁴, 5 × 10⁻³]. The neural network uses ReLU as activation function, Adam as optimizer, and a batch size of 512 observations.…”
Section: A. Machine Learning Models
Citation type: mentioning; confidence: 99%
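The search described in this quote maps naturally onto scikit-learn's GridSearchCV and a small PyTorch MLP. The sketch below is an illustrative reconstruction: the quote gives only the search bounds, so the concrete grid points, the feature count (200), and the synthetic data are assumptions, not code from the citing paper.

```python
# Illustrative sketch of the quoted hyper-parameter search; grid points,
# feature count, and data are assumptions (the quote only gives the bounds).
import numpy as np
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

X = np.random.rand(1000, 200).astype(np.float32)  # placeholder: 200 input features
y = np.random.rand(1000).astype(np.float32)

# --- Random forest: grid search over the quoted ranges ---
rf_grid = {
    "n_estimators": [32, 48, 64],    # "between 32 and 64"
    "max_depth": [32, 40, 48],       # "between 32 and 48"
    "max_features": [64, 128, 192],  # best-split features, "between 64 and 192"
}
search = GridSearchCV(RandomForestRegressor(random_state=0), rf_grid, cv=3)
search.fit(X, y)
print(search.best_params_)

# --- Neural network: ReLU MLP trained with Adam, batch size 512 ---
def make_mlp(n_in: int, n_out: int, hidden_layers: int = 3, width: int = 224):
    """ReLU MLP; hidden_layers drawn from [1, 20], width from [192, 264]."""
    layers, prev = [], n_in
    for _ in range(hidden_layers):
        layers += [nn.Linear(prev, width), nn.ReLU()]
        prev = width
    layers.append(nn.Linear(prev, n_out))
    return nn.Sequential(*layers)

model = make_mlp(200, 1)
# lr in [1e-3, 5e-3]; weight_decay plays the role of the regularization rate λ
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
loader = DataLoader(TensorDataset(torch.from_numpy(X), torch.from_numpy(y).unsqueeze(1)),
                    batch_size=512, shuffle=True)
loss_fn = nn.MSELoss()
for xb, yb in loader:  # one epoch shown for brevity
    optimizer.zero_grad()
    loss_fn(model(xb), yb).backward()
    optimizer.step()
```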
“…In the past, a number of analytical models to estimate the delay function (at link or end-to-end levels) have been proposed, such as the Kleinrock function [2], classic queuing models [3], [4], [5], [6], [7], [8], and polynomial functions [9], [10]. However, in realistic network scenarios, queuing models are either too simplistic (e.g., the M/M/1 link delay model) to capture non-Poisson traffic distributions, sophisticated queuing disciplines, etc., or intractable.…”
Section: Introduction
Citation type: mentioning; confidence: 99%
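For reference, the textbook delay expressions this quote alludes to are simple to state. The sketch below gives the standard M/M/1 link delay and the Kleinrock network-wide average delay under the usual unit-mean-message-length assumption; the variable names and example numbers are invented for illustration.

```python
def mm1_link_delay(flow: float, capacity: float) -> float:
    """Average M/M/1 delay on one link, 1/(C - f); diverges as f approaches C."""
    if flow >= capacity:
        return float("inf")  # saturated link: delay is unbounded
    return 1.0 / (capacity - flow)

def kleinrock_avg_delay(flows, capacities, total_traffic):
    """Kleinrock average delay, T = (1/gamma) * sum_i f_i / (C_i - f_i),
    assuming unit mean message length and Poisson arrivals on every link."""
    return sum(f / (c - f) for f, c in zip(flows, capacities)) / total_traffic

# Illustrative numbers only: two links with loads 3 and 4, capacities 10 and 6,
# total external traffic gamma = 5.
print(kleinrock_avg_delay([3.0, 4.0], [10.0, 6.0], 5.0))
```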
“…As such, this complex field continues to be challenging to study; however, there are some recent findings on relative bandwidth allocation techniques (Martens & Skutella, 2006; Pompili, Scoglio, & Shoniregun, 2007; Truffot, Duhamel, & Mahey, 2010; Rassaki & Nel, 2015).…”
Section: Review of Related Work
Citation type: mentioning; confidence: 99%