Less is more: discrete starting solutions in the planar p-median problem

Top ◽  
2021 ◽  
Author(s):  
Pawel Kalczynski ◽  
Jack Brimberg ◽  
Zvi Drezner

In this paper we present two new approaches for finding good starting solutions to the planar p-median problem. Both rely on a discrete approximation of the continuous model that restricts the facility locations to the given set of demand points. The first method adapts the first phase of a greedy random construction algorithm proposed for the minimum sum-of-squares clustering problem. The second implements a simple descent procedure based on vertex exchange. The resulting solution is then used as a starting point in a local search heuristic that iterates between Cooper's well-known alternating locate-allocate method and a transfer follow-up step with a new, more effective selection rule. Extensive computational experiments show that (1) using good starting solutions can significantly improve the performance of local search, and (2) a hybrid algorithm that combines good starting solutions with a "deep" local search can be an effective strategy for solving a wide variety of planar p-median problems.
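To make the alternation concrete, here is a minimal pure-Python sketch of Cooper's locate-allocate scheme as the abstract describes it: assign each demand point to its nearest facility, then relocate each facility to the 1-median of its cluster (computed with Weiszfeld's iteration). Function names, tolerances, and iteration limits are our own illustrative choices, not the paper's implementation, and the transfer follow-up step is omitted.

```python
import math

def weiszfeld(points, iters=100, eps=1e-7):
    # 1-median (geometric median) of a point cluster via Weiszfeld's iteration.
    x = sum(p[0] for p in points) / len(points)
    y = sum(p[1] for p in points) / len(points)
    for _ in range(iters):
        num_x = num_y = denom = 0.0
        for px, py in points:
            d = math.hypot(px - x, py - y)
            if d < eps:          # iterate landed on a demand point
                return (px, py)
            w = 1.0 / d
            num_x += w * px
            num_y += w * py
            denom += w
        nx, ny = num_x / denom, num_y / denom
        if math.hypot(nx - x, ny - y) < eps:
            return (nx, ny)
        x, y = nx, ny
    return (x, y)

def cooper(points, starts, iters=50):
    # Cooper's alternating heuristic; p = len(starts) facilities.
    facs = list(starts)
    for _ in range(iters):
        # Allocate: assign each demand point to its nearest facility.
        clusters = [[] for _ in facs]
        for pt in points:
            j = min(range(len(facs)),
                    key=lambda k: math.hypot(pt[0] - facs[k][0],
                                             pt[1] - facs[k][1]))
            clusters[j].append(pt)
        # Locate: move each facility to the 1-median of its cluster.
        new = [weiszfeld(c) if c else facs[j] for j, c in enumerate(clusters)]
        if new == facs:
            break
        facs = new
    return facs
```

Both half-steps are non-increasing in total weighted distance, so the heuristic converges to a local optimum whose quality depends heavily on the starting facilities — which is exactly why the paper invests in good discrete starting solutions.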



2018 ◽  
Vol 28 (2) ◽  
pp. 153-169 ◽  
Author(s):  
Kayo Gonçalves-E-Silva ◽  
Daniel Aloise ◽  
Samuel Xavier-De-Souza ◽  
Nenad Mladenovic

The Nelder-Mead method (NM) for solving continuous non-linear optimization problems is probably the most cited and most used method in the optimization literature and in practical applications. It belongs to the class of direct search methods, which use neither first- nor second-order derivatives. The popularity of NM is based on its simplicity. In this paper we propose an even simpler algorithm for larger instances that follows the NM idea. We call it Simplified NM (SNM): instead of generating all n + 1 simplex points in R^n, we perform the search using just q + 1 vertices, where q is usually much smaller than n. Though a single run cannot be better than a full NM search over n + 1 points, the significant speed-up allows SNM to be run many times from different starting solutions, usually obtaining better results than NM within the same CPU time. Computational analysis is performed on 10 classical convex and non-convex instances, where the number of variables n can be arbitrarily large. The obtained results show that SNM is more effective than the original NM, confirming that the Less Is More Approach (LIMA) yields good results when solving a continuous optimization problem.
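The q-versus-n trade-off can be illustrated with a toy restart scheme: run a standard Nelder-Mead search restricted to a random q-dimensional coordinate subspace, fix the result, and repeat from another random subspace. This is only a sketch of the general idea under our own assumptions (coordinate subspaces, fixed restart count); it is not the authors' SNM algorithm.

```python
import random

def nelder_mead(f, x0, step=0.5, iters=200,
                alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    # Plain Nelder-Mead on an n-dimensional simplex (n + 1 vertices).
    n = len(x0)
    simplex = [list(x0)]
    for i in range(n):
        v = list(x0)
        v[i] += step
        simplex.append(v)
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        centroid = [sum(v[i] for v in simplex[:-1]) / n for i in range(n)]
        refl = [centroid[i] + alpha * (centroid[i] - worst[i]) for i in range(n)]
        if f(refl) < f(best):
            # Expansion: push further along the reflection direction.
            exp = [centroid[i] + gamma * (refl[i] - centroid[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            # Contraction toward the worst vertex; shrink if it fails.
            contr = [centroid[i] + rho * (worst[i] - centroid[i]) for i in range(n)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:
                simplex = [best] + [[best[i] + sigma * (v[i] - best[i])
                                     for i in range(n)] for v in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]

def subspace_restarts(f, x0, q, restarts=10, seed=0):
    # SNM-style idea (our simplification): each restart optimizes only a
    # random q-subset of coordinates, so the simplex has q + 1 vertices.
    rng = random.Random(seed)
    x = list(x0)
    for _ in range(restarts):
        idx = rng.sample(range(len(x)), q)
        def g(sub):
            y = list(x)
            for i, j in enumerate(idx):
                y[j] = sub[i]
            return f(y)
        sub_best = nelder_mead(g, [x[j] for j in idx])
        for i, j in enumerate(idx):
            x[j] = sub_best[i]
    return x
```

Each restart evaluates a (q + 1)-vertex simplex instead of an (n + 1)-vertex one, which is the source of the speed-up that lets many restarts fit in the same CPU budget as one full-dimensional NM run.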


