The FastMap Algorithm for Shortest Path Computations

Author(s):  
Liron Cohen ◽  
Tansel Uras ◽  
Shiva Jahangiri ◽  
Aliyah Arunasalam ◽  
Sven Koenig ◽  
...  

We present a new preprocessing algorithm for embedding the nodes of a given edge-weighted undirected graph into a Euclidean space. The Euclidean distance between any two nodes in this space approximates the length of the shortest path between them in the given graph. Later, at runtime, a shortest path between any two nodes can be computed with an A* search using the Euclidean distances as the heuristic. Our preprocessing algorithm, called FastMap, is inspired by the data-mining algorithm of the same name and runs in near-linear time. Hence, FastMap is orders of magnitude faster than competing approaches that produce a Euclidean embedding using Semidefinite Programming. FastMap also produces admissible and consistent heuristics and therefore guarantees the generation of shortest paths. Moreover, FastMap applies to general undirected graphs for which many traditional heuristics, such as the Manhattan Distance heuristic, are not well defined. Empirically, we demonstrate that A* search using the FastMap heuristic is competitive with A* search using other state-of-the-art heuristics, such as the Differential heuristic.
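
For concreteness, here is a minimal Python sketch of the embedding idea, a simplification rather than the authors' implementation: pivots are picked with a farthest-pair heuristic, each dimension receives the FastMap coordinate (d(a,v) + d(a,b) - d(b,v))/2, and the part of the distance already explained by earlier coordinates is subtracted before the next round. The function names, the dict-of-dicts graph representation, and the assumption of a connected graph are illustrative choices, not from the paper.

```python
# Hedged sketch of a FastMap-style graph embedding (not the authors' exact code).
import heapq
import math

def dijkstra(graph, source):
    """Shortest-path distances from `source` in a connected, edge-weighted graph
    given as {u: {v: weight, ...}, ...}."""
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, math.inf):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, math.inf):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def fastmap_embed(graph, dims=4):
    """Assign each node a `dims`-dimensional coordinate whose Euclidean distance
    approximates the graph distance (simplified sketch)."""
    coords = {v: [] for v in graph}

    def residual(dist_from_pivot, pivot, v):
        # Graph distance minus the part already explained by earlier coordinates.
        explained = sum(abs(x - y) for x, y in zip(coords[pivot], coords[v]))
        return max(dist_from_pivot[v] - explained, 0.0)

    for _ in range(dims):
        # Farthest-pair heuristic: start anywhere, jump to the farthest node twice.
        a = next(iter(graph))
        da = dijkstra(graph, a)
        b = max(graph, key=lambda v: residual(da, a, v))
        db = dijkstra(graph, b)
        a = max(graph, key=lambda v: residual(db, b, v))
        da = dijkstra(graph, a)
        dab = residual(da, a, b)
        if dab <= 0.0:
            break
        new_coord = {v: (residual(da, a, v) + dab - residual(db, b, v)) / 2.0
                     for v in graph}
        for v in graph:
            coords[v].append(new_coord[v])
    return coords

def fastmap_heuristic(coords, v, goal):
    """Euclidean distance in the embedding, used as the A* heuristic at runtime."""
    return math.dist(coords[v], coords[goal])
```

At runtime the precomputed coordinates are looked up and fastmap_heuristic supplies the h-value inside an otherwise standard A* search.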

Mathematics ◽  
2021 ◽  
Vol 9 (14) ◽  
pp. 1592
Author(s):  
Iztok Peterin ◽  
Gabriel Semanišin

A shortest path P of a graph G is maximal if P is not contained as a subpath in any other shortest path. A set S⊆V(G) is a maximal shortest paths cover if every maximal shortest path of G contains a vertex of S. The minimum cardinality of a maximal shortest paths cover is called the maximal shortest paths cover number and is denoted by ξ(G). We show that it is NP-hard to determine ξ(G). We establish a connection between ξ(G) and several other graph parameters. We present a linear-time algorithm that computes the exact value of ξ(T) for a tree T.
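
For intuition, ξ(G) can be computed by brute force on small unweighted graphs directly from these definitions, as in the Python sketch below (using networkx; exponential, for illustration only, and unrelated to the paper's linear-time tree algorithm). It uses the fact that a shortest path is maximal exactly when it cannot be extended by a single vertex at either end and remain a shortest path, since every subpath of a shortest path is itself a shortest path.

```python
# Brute-force sketch of xi(G) for small unweighted graphs.
from itertools import combinations
import networkx as nx

def maximal_shortest_paths(G):
    """All maximal shortest paths of G, each returned as a frozenset of vertices."""
    dist = dict(nx.all_pairs_shortest_path_length(G))
    maximal = set()
    for u in G:
        for v in G:
            if v not in dist[u]:
                continue  # u and v lie in different components
            for P in nx.all_shortest_paths(G, u, v):
                # Maximal iff not extendable by one vertex at either end
                # while remaining a shortest path.
                extendable = (any(dist[w][v] == dist[u][v] + 1 for w in G[u]) or
                              any(dist[u][w] == dist[u][v] + 1 for w in G[v]))
                if not extendable:
                    maximal.add(frozenset(P))
    return maximal

def xi(G):
    """Minimum size of a vertex set that hits every maximal shortest path."""
    paths = maximal_shortest_paths(G)
    for k in range(len(G) + 1):
        for S in combinations(G, k):
            if all(p & set(S) for p in paths):
                return k
```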


10.37236/727 ◽  
2008 ◽  
Vol 15 (1) ◽  
Author(s):  
Iiro Honkala ◽  
Tero Laihonen

Assume that $G = (V, E)$ is an undirected graph, and $C \subseteq V$. For every $v \in V$, we denote $I_r(G;v) = \{ u \in C: d(u,v) \leq r\}$, where $d(u,v)$ denotes the number of edges on any shortest path from $u$ to $v$. If all the sets $I_r(G;v)$ for $v \in V$ are pairwise different, and none of them is the empty set, the code $C$ is called $r$-identifying. If $C$ is $r$-identifying in all graphs $G'$ that can be obtained from $G$ by deleting at most $t$ edges, we say that $C$ is robust against $t$ known edge deletions. Codes that are robust against $t$ unknown edge deletions form a related class. We study these two classes of codes in the king grid with the vertex set ${\Bbb Z}^2$ where two different vertices are adjacent if their Euclidean distance is at most $\sqrt{2}$.
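
For a finite graph given as an adjacency dict, the $r$-identifying condition can be checked directly from the definition, as in the following Python sketch (an illustration, not the paper's method). For the king grid one would restrict ${\Bbb Z}^2$ to a finite window, with two vertices adjacent exactly when they differ by a king move; robustness against $t$ known edge deletions amounts to re-running the same check on every graph obtained by deleting at most $t$ edges.

```python
# Direct check of the r-identifying condition on a finite unweighted graph.
from collections import deque

def ball(graph, v, r):
    """All vertices within graph distance r of v (BFS)."""
    dist = {v: 0}
    queue = deque([v])
    while queue:
        u = queue.popleft()
        if dist[u] == r:
            continue
        for w in graph[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return set(dist)

def is_r_identifying(graph, code, r):
    """True iff the sets I_r(G; v) = {u in code : d(u, v) <= r} are all
    nonempty and pairwise distinct."""
    code = set(code)
    seen = set()
    for v in graph:
        I = frozenset(ball(graph, v, r) & code)
        if not I or I in seen:
            return False
        seen.add(I)
    return True
```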


2016 ◽  
Vol 25 (05) ◽  
pp. 1640002 ◽  
Author(s):  
Jan Gaura ◽  
Eduard Sojka

Measuring the distance is an important task in many clustering and image-segmentation algorithms. The value of the distance decides whether two image points belong to the same image segment or to two different segments. The Euclidean distance is used quite often. In more complicated cases, measuring the distances along the surface that is defined by the image function may be more appropriate. The geodesic distance, i.e. the shortest path in the corresponding graph, has become popular in this context. The problem is that it is determined by only one path, which can be viewed as infinitely thin and which may arise accidentally as a result of imperfections in the image. Considering the k shortest paths can be regarded as an attempt to measure the distance more reliably. The drawback remains that measuring the distance along several paths suffers from the same problems as the original geodesic distance, so it does not guarantee significantly better results; in addition, the approach is computationally demanding. This paper introduces the resistance-geodesic distance with the goal of reducing the possibility that a false, accidental path is used for measurement. In contrast to the geodesic distance, the approach seeks a path of a certain chosen width. Firstly, the effective conductance is computed for each pair of neighbouring nodes to determine the local width of the path that could possibly run through the arc connecting them. The width computed in this way is then used to determine the costs of the arcs; arcs whose use would lead to a small width of the final path are penalised. The usual methods for computing the shortest path in a graph are then used to compute the final distances. The corresponding theory and experimental results are presented in this paper.
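
As a point of reference, below is a minimal Python sketch of the plain geodesic distance that the paper improves upon: Dijkstra on a pixel graph. The 4-connectivity and the particular arc cost (a unit spatial step combined with the change of the image function, so that length is measured along the image surface) are illustrative assumptions, not the paper's exact construction.

```python
# Sketch of the baseline geodesic (shortest-path) distance on an image graph.
import heapq
import math

def geodesic_distance(image, start, goal, alpha=1.0):
    """`image` is a 2-D list of intensities; `start`/`goal` are (row, col) pixels."""
    rows, cols = len(image), len(image[0])
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist[(r, c)]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # Arc cost: unit spatial step plus the intensity change,
                # i.e. a path length measured "along the image surface".
                cost = math.hypot(1.0, alpha * (image[nr][nc] - image[r][c]))
                nd = d + cost
                if nd < dist.get((nr, nc), math.inf):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return math.inf
```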


2021 ◽  
Vol 82 (1-2) ◽  
Author(s):  
Lena Collienne ◽  
Alex Gavryushkin

Many popular algorithms for searching the space of leaf-labelled (phylogenetic) trees are based on tree rearrangement operations. Under any such operation, the problem is reduced to searching a graph where vertices are trees and (undirected) edges are given by pairs of trees connected by one rearrangement operation (sometimes called a move). Most popular are the classical nearest neighbour interchange, subtree prune and regraft, and tree bisection and reconnection moves. The problem of computing distances, however, is NP-hard in each of these graphs, making tree inference and comparison algorithms challenging to design in practice. Although ranked phylogenetic trees are one of the central objects of interest in applications such as cancer research, immunology, and epidemiology, the computational complexity of the shortest path problem for these trees remained unsolved for decades. In this paper, we settle this problem for the ranked nearest neighbour interchange operation by establishing that the complexity depends on the weight difference between the two types of tree rearrangements (rank moves and edge moves), and varies from quadratic, which is the lowest possible complexity for this problem, to NP-hard, which is the highest. In particular, our result provides the first example of a phylogenetic tree rearrangement operation for which shortest paths, and hence the distance, can be computed efficiently. Specifically, our algorithm scales to trees with tens of thousands of leaves (and likely hundreds of thousands if implemented efficiently).


Algorithms ◽  
2021 ◽  
Vol 14 (1) ◽  
pp. 21
Author(s):  
Christoph Hansknecht ◽  
Imke Joormann ◽  
Sebastian Stiller

The time-dependent traveling salesman problem (TDTSP) asks for a shortest Hamiltonian tour in a directed graph where (asymmetric) arc costs depend on the time the arc is entered. With traffic data abundantly available, methods to optimize routes with respect to time-dependent travel times are widely desired. This holds in particular for the traveling salesman problem, which is a cornerstone of logistic planning. In this paper, we devise column-generation-based IP methods to solve the TDTSP in full generality, both for arc- and path-based formulations. The algorithmic key is a time-dependent shortest path problem, which arises from the pricing problem of the column generation and is of independent interest—namely, to find paths in a time-expanded graph that are acyclic in the underlying (non-expanded) graph. As this problem is computationally too costly, we price over the set of paths that contain no cycles of length k. In addition, we devise—tailored for the TDTSP—several families of valid inequalities, primal heuristics, a propagation method, and a branching rule. Combining these with the time-dependent shortest path pricing, we provide—to our knowledge—the first elaborate method to solve the TDTSP in general and with fully general time-dependence. We also provide results on the complexity and approximability of the TDTSP. In computational experiments on randomly generated instances, we are able to solve the large majority of small instances (20 nodes) to optimality, while closing about two thirds of the remaining gap of the large instances (40 nodes) after one hour of computation.
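
To make the core subproblem concrete, here is a hedged Python sketch of a basic time-dependent shortest path computation in which the arc cost depends on the entry time. It assumes FIFO travel times (leaving an arc later never lets you arrive earlier), under which a Dijkstra-style label-setting search is valid; the paper's actual pricing problem additionally restricts the admissible paths in the time-expanded graph, which is not reproduced here.

```python
# Sketch of a time-dependent shortest path under the FIFO assumption.
import heapq
import math

def time_dependent_shortest_path(arcs, travel_time, source, target, t0):
    """`arcs[u]` lists the heads of arcs leaving u; `travel_time(u, v, t)`
    returns the traversal time of arc (u, v) when it is entered at time t."""
    arrival = {source: t0}
    pq = [(t0, source)]
    while pq:
        t, u = heapq.heappop(pq)
        if u == target:
            return t - t0                      # total travel time from source
        if t > arrival.get(u, math.inf):
            continue
        for v in arcs.get(u, ()):
            ta = t + travel_time(u, v, t)      # arrival at v when entering (u, v) now
            if ta < arrival.get(v, math.inf):
                arrival[v] = ta
                heapq.heappush(pq, (ta, v))
    return math.inf
```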


1986 ◽  
Vol 9 (1) ◽  
pp. 85-94
Author(s):  
Robert Endre Tarjan

Many linear-time graph algorithms using depth-first search have been invented. We propose simplified versions of two such algorithms, for computing a bipolar orientation or st-numbering of an undirected graph and for finding all feedback vertices of a directed graph.


2002 ◽  
Vol 12 (03) ◽  
pp. 249-261 ◽  
Author(s):  
XUEHOU TAN

Let π(a,b) denote the shortest path between two points a and b inside a simple polygon P, i.e. the shortest connecting path that lies entirely within P. The geodesic distance between a and b in P is defined as the length of π(a,b), denoted by gd(a,b), in contrast with the Euclidean distance between a and b in the plane, denoted by d(a,b). Given two disjoint polygons P and Q in the plane, the bridge problem asks for a line segment (optimal bridge) that connects a point p on the boundary of P and a point q on the boundary of Q such that the sum of the three distances gd(p′,p), d(p,q) and gd(q,q′), with any p′ ∈ P and any q′ ∈ Q, is minimized. We present an O(n log³ n) time algorithm for finding an optimal bridge between two simple polygons. This significantly improves upon the previous O(n²) time bound. Our result is obtained by making substantial use of a hierarchical structure that consists of segment trees, range trees and persistent search trees, and a structure that supports dynamic ray shooting and shortest path queries as well.


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Shumpei Haginoya ◽  
Aiko Hanayama ◽  
Tamae Koike

Purpose: The purpose of this paper was to compare the accuracy of linking crimes using geographical proximity across three distance measures: Euclidean (distance measured by the length of a straight line between two locations), Manhattan (distance obtained by summing the north-south and east-west distances) and the shortest route distance.
Design/methodology/approach: A total of 194 cases committed by 97 serial residential burglars in Aomori Prefecture in Japan between 2004 and 2015 were used in the present study. The Mann–Whitney U test was used to compare linked (two offenses committed by the same offender) and unlinked (two offenses committed by different offenders) pairs for each distance measure. Discrimination accuracy between linked and unlinked crime pairs was evaluated using the area under the receiver operating characteristic curve (AUC).
Findings: The Mann–Whitney U test showed that the distances of the linked pairs were significantly shorter than those of the unlinked pairs for all distance measures. Comparison of the AUCs showed that the shortest route distance achieved significantly higher accuracy than the Euclidean distance, whereas there was no significant difference between the Euclidean and the Manhattan distance or between the Manhattan and the shortest route distance. These findings give partial support to the idea that distance measures that take environmental factors into account may identify a crime series more accurately than the Euclidean distance.
Research limitations/implications: Although the results suggested a difference between the Euclidean and the shortest route distance, it was small, and all distance measures yielded outstanding AUC values, probably because of ceiling effects. Further investigation that makes the same comparison in a narrower area is needed to avoid this potential inflation of discrimination accuracy.
Practical implications: The shortest route distance might contribute to improving the accuracy of crime linkage based on geographical proximity, but further investigation is needed before recommending its use in practice. Given that the targeted area in the present study was relatively large, the findings may contribute especially to improving the accuracy of proactive comparative case analysis, which estimates the overall distribution of serial crimes in a region, by selecting a more effective distance measure.
Social implications: Improving the accuracy of crime linkage may assist criminal investigations and lead to the earlier arrest of offenders.
Originality/value: The results of the present study provide an initial indication of the efficacy of distance measures that take environmental factors into account.
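
For illustration, the evaluation described above can be sketched in a few lines of Python (not the authors' code): the linked and unlinked pair distances for one measure are compared with a Mann–Whitney U test, and discrimination accuracy is summarised with the AUC. The distance values in the example are made up, and the function name is hypothetical.

```python
# Hedged sketch of the linked-vs-unlinked evaluation for one distance measure.
from scipy.stats import mannwhitneyu
from sklearn.metrics import roc_auc_score

def evaluate_measure(linked_distances, unlinked_distances):
    # One-sided test: are linked pairs closer together than unlinked pairs?
    u_stat, p_value = mannwhitneyu(linked_distances, unlinked_distances,
                                   alternative="less")
    # AUC: a shorter distance should predict "linked", so negate the distances.
    y_true = [1] * len(linked_distances) + [0] * len(unlinked_distances)
    y_score = [-d for d in linked_distances + unlinked_distances]
    return p_value, roc_auc_score(y_true, y_score)

# Example with placeholder distances (km) for a single distance measure.
p, auc = evaluate_measure([0.4, 1.2, 0.8, 2.0], [5.3, 7.1, 3.9, 8.4])
```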


Author(s):  
Qibin Zhou ◽  
Qinggang Su ◽  
Peng Xiong

Assisted download is an effective way to compensate for the limited coverage of Wi-Fi access in VANETs. To address the low utilization of time-space resources within blind areas and the unbalanced download services in VANETs, this paper proposes an approximately globally optimal, WebGIS-based scheme for selecting vehicles for assisted download. The scheme uses two-dimensional matrices to define the time-space resources and the vehicle-selection behavior, applies a Markov Decision Process to allocate time-space resources within the blind area, and exploits the communication features of VANETs to shrink the vehicle-selection behavior space and thus reduce the computational complexity. The Euclidean and Manhattan distances serve as the basis of vehicle selection, so that the target vehicles can effectively increase the total amount of user downloads while keeping the assisted download services balanced. Experimental results show that, owing to the wider access range and platform independence of WebGIS, the total amount of downloads increases by more than 20% while download services remain relatively balanced. Moreover, WebGIS usually requires only a Web browser (occasionally with plug-ins) on the client side, so the system cost is greatly reduced.
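
As a small illustration of the two distance criteria mentioned above (and only of those; the MDP-based resource allocation is not reproduced here), the following hypothetical Python snippet ranks candidate vehicles by Euclidean or Manhattan distance to a reference point, such as the centre of the blind area.

```python
# Placeholder ranking of candidate assisting vehicles by distance (illustrative only).
import math

def euclidean(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def manhattan(p, q):
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def rank_vehicles(reference_point, vehicle_positions, metric=euclidean):
    """Return candidate vehicle positions ordered by distance to the reference
    point -- a stand-in selection rule, not the paper's scheme."""
    return sorted(vehicle_positions, key=lambda pos: metric(reference_point, pos))
```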


2019 ◽  
Author(s):  
Nate Wessel ◽  
Steven Farber

Estimates of travel time by public transit often rely on the calculation of a shortest path between two points for a given departure time. Such shortest paths are time-dependent and not always stable from one moment to the next. Given that actual transit passengers necessarily have imperfect information about the system, their route selection strategies are heuristic and cannot be expected to achieve optimal travel times for all possible departures. Thus an algorithm that returns optimal travel times at all moments will tend to underestimate real travel times, all else being equal. While several researchers have noted this issue, none have yet measured the extent of the problem. This study observes and measures this effect by contrasting two alternative heuristic routing strategies with a standard shortest-path calculation. The Toronto Transit Commission is used as a case study, and we model actual transit operations for the agency over the course of a normal week with archived AVL data transformed into a retrospective GTFS dataset. Travel times are estimated using two alternative route-choice assumptions: 1) habitual selection of the itinerary with the best average travel time and 2) dynamic choice of the next-departing route in a predefined choice set. It is shown that most trips present passengers with a complex choice among competing itineraries and that the choice of itinerary at any given moment of departure may entail substantial travel time risk relative to the optimal outcome. In the context of accessibility modelling, where travel times are typically considered as a distribution, the optimal path method is observed in aggregate to underestimate travel time by about 3-4 minutes at the median and 6-7 minutes at the 90th percentile for a typical trip.
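
The gap being measured can be illustrated with a short hedged Python sketch: for a set of sampled departure times, compare the per-departure optimal travel time with the "habitual" strategy that always takes the itinerary with the best average travel time. The travel-time lists are placeholders; the study derives them from the retrospective GTFS dataset.

```python
# Sketch: optimal per-departure travel time vs. the habitual-itinerary strategy.
import statistics

def optimal_vs_habitual(itinerary_times):
    """`itinerary_times` maps itinerary id -> list of travel times (minutes),
    one entry per sampled departure time (all lists equally long)."""
    habitual = min(itinerary_times,
                   key=lambda k: statistics.mean(itinerary_times[k]))
    optimal = [min(times) for times in zip(*itinerary_times.values())]
    gap = [h - o for h, o in zip(itinerary_times[habitual], optimal)]
    # Median gap and an approximate 90th-percentile gap.
    return statistics.median(gap), statistics.quantiles(gap, n=10)[-1]
```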

