2021
DOI: 10.1007/s10732-021-09466-0
Weighted proximity search

Abstract: Proximity search is an iterative method to solve complex mathematical programming problems. At each iteration, the objective function of the problem at hand is replaced by the Hamming distance function to a given solution, and a cutoff constraint is added to impose that any newly obtained solution improves the objective function value. A mixed integer programming solver is used to find a feasible solution to this modified problem, yielding an improved solution to the original problem. This paper introduces the c…
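The abstract describes two ingredients of a proximity search iteration: a Hamming-distance objective measured from an incumbent 0-1 solution, and a cutoff constraint forcing improvement of the original objective. A minimal sketch of those two pieces, assuming minimization (all names here, such as `hamming_distance`, `improves_cutoff`, and `theta`, are hypothetical, not from the paper):

```python
# Illustrative sketch only, not the paper's implementation.

def hamming_distance(x, x_bar):
    """Number of binary variables flipped with respect to the incumbent x_bar."""
    return sum(1 for xj, xj_bar in zip(x, x_bar) if xj != xj_bar)

def improves_cutoff(obj_value, incumbent_value, theta=1.0):
    """Cutoff constraint: the new solution must improve the original
    objective by at least theta (assuming a minimization problem)."""
    return obj_value <= incumbent_value - theta

# A MIP solver would minimize hamming_distance(x, x_bar) subject to the
# original constraints plus the cutoff; here we only evaluate the two pieces.
x_bar = [1, 0, 1, 1]
x_new = [1, 1, 1, 0]
print(hamming_distance(x_new, x_bar))          # 2 flips
print(improves_cutoff(8.5, 10.0, theta=1.0))   # True
```

In the actual method the Hamming distance is the objective handed to the solver, so the search is biased toward feasible solutions close to the incumbent.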

Cited by 6 publications (6 citation statements)
References 26 publications
“…This metric was introduced in Rodrigues et al. (2021, 2022) and provides a global perspective on the number of best and worst solutions found by a given approach when compared to a reference approach (the traditional BLH in this case). To compute the value of this metric, we start by computing, for each instance $i \in \{1, \ldots, 10\}$ with a number of vessels $N \in \{20,\ 30,\ 40,\ 50,\ 60,\ 70\}$, the value
$$\pi_{i,N} = \begin{cases} 1, & \text{if } z_{i,N}^{X} < z_{i,N}^{BLH}, \\ 0, & \text{if } z_{i,N}^{X} = z_{i,N}^{BLH}, \\ -1, & \text{if } z_{i,N}^{X} > z_{i,N}^{BLH}, \end{cases}$$
where $z_{i,N}^{BLH}$ and $z_{i,N}^{X}$ are the total waiting time obtained, respectively, by the traditional BLH and by approach $X \in \{HA,\ RB,\ I\ldots$…”
Section: Computational Results
confidence: 99%
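The quoted metric assigns $\pi_{i,N} = 1$ when approach $X$ beats the reference BLH on instance $(i, N)$ (lower total waiting time is better), $-1$ when it is worse, and $0$ on a tie. A small sketch with made-up waiting times (the data values are illustrative only):

```python
# Per-instance comparison score, as in the quoted citation statement.
def pi(z_X, z_BLH):
    """+1 if approach X beats the reference BLH, -1 if worse, 0 on a tie."""
    if z_X < z_BLH:
        return 1
    if z_X > z_BLH:
        return -1
    return 0

# Total waiting times keyed by (instance i, number of vessels N):
# each value is a hypothetical pair (z_X, z_BLH).
results = {(1, 20): (105, 110), (1, 30): (98, 98), (2, 20): (120, 115)}
score = sum(pi(z_X, z_BLH) for z_X, z_BLH in results.values())
print(score)  # 1 + 0 - 1 = 0
```

Summing the per-instance values gives the global win/loss balance of $X$ against the reference, which is how the metric yields its "global perspective".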
“…Values of this metric greater than 100 indicate a better average performance of the WILB compared to the LBH. The second metric (Rodrigues et al. 2021) provides a global perspective on the number of best solutions found by the WILB compared to the LBH. To calculate this value, we start by computing for each instance h ∈ {1, 2, 3, 4} in class k ∈ K the value…”
Section: Results
confidence: 99%
“…However, this function also does not provide guidance to distinguish between the quality of equally distant solutions, nor does it incorporate available information regarding the preferred values of variables. In a recent paper (Rodrigues et al. 2021), a proximity search heuristic using different weights for individual variables was successfully applied to three different combinatorial optimization problems. This provided evidence that it can be useful, prior to exploring a neighborhood of solutions, to evaluate the possible consequences of flipping binary variables and to weight the variables accordingly when performing the neighborhood exploration.…”
Section: Introduction
confidence: 99%
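The weighting idea described above replaces the plain flip count with a weighted one: each binary variable carries a weight reflecting the estimated consequence of flipping it away from the incumbent. A hedged sketch (the weight values and function name are placeholders, not the paper's scheme):

```python
# Weighted Hamming distance from an incumbent 0-1 solution x_bar.
def weighted_hamming(x, x_bar, w):
    """Sum of the weights of the variables flipped relative to x_bar."""
    return sum(wj for xj, xj_bar, wj in zip(x, x_bar, w) if xj != xj_bar)

x_bar = [0, 1, 1]
x_new = [1, 1, 0]
w = [0.2, 1.0, 3.5]   # flipping x3 is judged costly, x1 cheap (arbitrary)
print(weighted_hamming(x_new, x_bar, w))  # 0.2 + 3.5 = 3.7
```

With uniform weights this reduces to the ordinary Hamming distance; non-uniform weights steer the solver away from flips expected to degrade the solution.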
“…For future work, we consider improving the parameter setting of our local branching algorithm and developing other algorithmic features, such as the inclusion of valid inequalities and new branching schemes. Moreover, we consider implementing improved variants of local branching, such as those of Fischetti and Monaci (2014) and Rodrigues et al. (2021). We also consider using a completely different model with tighter linear relaxation bounds.…”
Section: Discussion
confidence: 99%