On the complexity of the balanced vertex ordering problem

2007 ◽  
Vol. 9 no. 1 (Graph and Algorithms) ◽
Author(s):  
Jan Kára ◽  
Jan Kratochvíl ◽
David R. Wood

We consider the problem of finding a balanced ordering of the vertices of a graph. More precisely, we want to minimise the sum, taken over all vertices v, of the difference between the number of neighbours to the left and right of v. This problem, which has applications in graph drawing, was recently introduced by Biedl et al. [Discrete Applied Math. 148:27-48, 2005]. They proved that the problem is solvable in polynomial time for graphs with maximum degree three, but NP-hard for graphs with maximum degree six. One of our main results is to close the gap in these results, by proving NP-hardness for graphs with maximum degree four. Furthermore, we prove that the problem remains NP-hard for planar graphs with maximum degree four and for 5-regular graphs. On the other hand, we introduce a polynomial time algorithm that determines whether there is a vertex ordering with total imbalance smaller than a fixed constant, and a polynomial time algorithm that determines whether a given multigraph with even degrees has an 'almost balanced' ordering.
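
To make the objective concrete, here is a minimal Python sketch (not the authors' algorithm) that evaluates the total imbalance of a given vertex ordering; the adjacency-list layout and the example graph are illustrative.

```python
# A minimal sketch: total imbalance of a vertex ordering, i.e. the sum over
# all vertices v of |#neighbours left of v - #neighbours right of v|.

def total_imbalance(order, adj):
    pos = {v: i for i, v in enumerate(order)}
    total = 0
    for v in order:
        left = sum(1 for u in adj[v] if pos[u] < pos[v])
        right = sum(1 for u in adj[v] if pos[u] > pos[v])
        total += abs(left - right)
    return total

# Example: the 4-cycle a-b-c-d. The first and last vertices of any ordering
# have all their neighbours on one side, so C4's total imbalance is always >= 4.
adj = {"a": ["b", "d"], "b": ["a", "c"], "c": ["b", "d"], "d": ["a", "c"]}
print(total_imbalance(["a", "b", "c", "d"], adj))  # 4
```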

2021 ◽  
Vol. 23 no. 1 (Graph Theory) ◽
Author(s):  
Niels Grüttemeier ◽  
Christian Komusiewicz ◽  
Jannik Schestag ◽  
Frank Sommer

We introduce and study the Bicolored $P_3$ Deletion problem, defined as follows. The input is a graph $G=(V,E)$ where the edge set $E$ is partitioned into a set $E_r$ of red edges and a set $E_b$ of blue edges. The question is whether we can delete at most $k$ edges such that $G$ does not contain a bicolored $P_3$ as an induced subgraph. Here, a bicolored $P_3$ is a path on three vertices with one blue and one red edge. We show that Bicolored $P_3$ Deletion is NP-hard and cannot be solved in $2^{o(|V|+|E|)}$ time on bounded-degree graphs if the ETH is true. Then, we show that Bicolored $P_3$ Deletion is polynomial-time solvable when $G$ does not contain a bicolored $K_3$, that is, a triangle with edges of both colors. Moreover, we provide a polynomial-time algorithm for the case that $G$ contains no blue $P_3$, red $P_3$, blue $K_3$, or red $K_3$. Finally, we show that Bicolored $P_3$ Deletion can be solved in $O(1.84^k \cdot |V| \cdot |E|)$ time and that it admits a kernel with $O(k\Delta\min(k,\Delta))$ vertices, where $\Delta$ is the maximum degree of $G$.
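
As a concrete illustration of the forbidden pattern, the following sketch checks whether an edge-bicolored graph contains an induced bicolored $P_3$; the data layout (red and blue edge lists) is an assumption made for illustration, not taken from the paper.

```python
# A minimal sketch: does the graph contain vertices u-v-w with one red and
# one blue edge through v, and u, w non-adjacent (so the P3 is induced)?
from itertools import combinations

def has_bicolored_p3(vertices, red, blue):
    red, blue = set(map(frozenset, red)), set(map(frozenset, blue))
    edges = red | blue
    for v in vertices:
        nbrs = [u for u in vertices if frozenset((u, v)) in edges]
        for u, w in combinations(nbrs, 2):
            if frozenset((u, w)) in edges:
                continue  # u-v-w is not induced
            e1, e2 = frozenset((u, v)), frozenset((v, w))
            if (e1 in red) != (e2 in red):  # exactly one of the two is red
                return True
    return False

# Example: path a-b-c with ab red and bc blue is a bicolored P3.
print(has_bicolored_p3(["a", "b", "c"], red=[("a", "b")], blue=[("b", "c")]))  # True
```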


10.29007/v68w ◽  
2018 ◽  
Author(s):  
Ying Zhu ◽  
Mirek Truszczynski

We study the problem of learning the importance of preferences in preference profiles in two important cases: when individual preferences are aggregated by the ranked Pareto rule, and when they are aggregated by positional scoring rules. For the ranked Pareto rule, we provide a polynomial-time algorithm that finds a ranking of preferences such that the ranked profile correctly decides all the examples, whenever such a ranking exists. We also show that the problem of learning a ranking that maximizes the number of correctly decided examples (again under the ranked Pareto rule) is NP-hard. We obtain similar results for the case of weighted profiles when positional scoring rules are used for aggregation.
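
For the positional-scoring side, here is a small illustrative sketch of weighted aggregation with a scoring vector (Borda in the example); the weights stand in for the learned importances, and all values are made up.

```python
# A minimal sketch of a weighted positional scoring rule: each ranking
# contributes weight * score[position] to every candidate it ranks.

def scoring_winner(profile, weights, scores):
    """profile: rankings (best first); scores: e.g. Borda = [m-1, ..., 0]."""
    totals = {}
    for ranking, w in zip(profile, weights):
        for pos, cand in enumerate(ranking):
            totals[cand] = totals.get(cand, 0.0) + w * scores[pos]
    return max(totals, key=totals.get)

profile = [["a", "b", "c"], ["b", "c", "a"], ["b", "a", "c"]]
print(scoring_winner(profile, weights=[1, 1, 1], scores=[2, 1, 0]))  # 'b'
print(scoring_winner(profile, weights=[5, 1, 1], scores=[2, 1, 0]))  # 'a'
```

Upweighting the first voter flips the outcome, which is exactly the degree of freedom the learning problem exploits.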


2014 ◽  
Vol 24 (03) ◽  
pp. 225-236 ◽  
Author(s):  
DAVID KIRKPATRICK ◽  
BOTING YANG ◽  
SANDRA ZILLES

Given an arrangement A of n sensors and two points s and t in the plane, the barrier resilience of A with respect to s and t is the minimum number of sensors whose removal permits a path from s to t such that the path does not intersect the coverage region of any sensor in A. When the surveillance domain is the entire plane and sensor coverage regions are unit line segments, even with restricted orientations, the problem of determining the barrier resilience is known to be NP-hard. On the other hand, if sensor coverage regions are arbitrary lines, the problem has a trivial linear time solution. In this paper, we study the case where each sensor coverage region is an arbitrary ray, and give an O(n²m) time algorithm for computing the barrier resilience when there are m ⩾ 1 sensor intersections.
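
The O(n²m) algorithm itself is not reproduced here, but the following sketch computes a trivial upper bound on barrier resilience in the ray case: remove every sensor whose ray crosses the straight segment from s to t. Clipping each ray to a long segment and the geometry helpers are illustrative simplifications.

```python
# A hedged sketch, not the paper's algorithm: count the rays crossed by the
# straight segment s-t; removing those sensors clears that particular path,
# so the count upper-bounds the barrier resilience.

def orient(p, q, r):
    """Sign of the cross product (q - p) x (r - p)."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def on_segment(p, q, r):
    """True if collinear point r lies on segment pq."""
    return (orient(p, q, r) == 0
            and min(p[0], q[0]) <= r[0] <= max(p[0], q[0])
            and min(p[1], q[1]) <= r[1] <= max(p[1], q[1]))

def segments_intersect(a, b, c, d):
    d1, d2 = orient(c, d, a), orient(c, d, b)
    d3, d4 = orient(a, b, c), orient(a, b, d)
    if ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0)):
        return True
    return (on_segment(c, d, a) or on_segment(c, d, b)
            or on_segment(a, b, c) or on_segment(a, b, d))

def resilience_upper_bound(s, t, rays, far=1e9):
    """rays: (origin, direction) pairs; 'far' clips each ray to a segment."""
    count = 0
    for (ox, oy), (dx, dy) in rays:
        end = (ox + far * dx, oy + far * dy)
        if segments_intersect(s, t, (ox, oy), end):
            count += 1
    return count

s, t = (0.0, 0.0), (10.0, 0.0)
rays = [((5.0, -1.0), (0.0, 1.0)),   # crosses the segment s-t
        ((5.0, 1.0), (0.0, 1.0))]    # points away from it
print(resilience_upper_bound(s, t, rays))  # 1
```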


2020 ◽  
Vol 34 (02) ◽  
pp. 2070-2078
Author(s):  
Yasushi Kawase ◽  
Hanna Sumita

We study the problem of fairly allocating a set of indivisible goods to risk-neutral agents in a stochastic setting. We propose an (approximation) algorithm to find a stochastic allocation that maximizes the minimum utility among the agents. The algorithm runs by repeatedly finding an (approximate) allocation that maximizes the total virtual utility of the agents. This implies that the problem is solvable in polynomial time when the utilities are gross-substitutes (a subclass of submodular). When the utilities are submodular, we can find a (1 − 1/e)-approximate solution, and this is best possible unless P=NP. We also extend the problem to the setting where a stochastic allocation must satisfy (ex ante) envy-freeness. Under this condition, we demonstrate that the problem is NP-hard even when every agent has an additive utility with a matroid constraint (a subclass of gross-substitutes). Furthermore, we propose a polynomial-time algorithm for the setting in which the matroid constraint is common to all agents.
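
To illustrate the egalitarian objective, here is a brute-force sketch for a toy deterministic instance with additive utilities; the paper's setting is stochastic (lotteries over such allocations can only raise the minimum expected utility), and this is not the proposed algorithm.

```python
# A small brute-force sketch of the maximin objective: try every assignment
# of goods to agents and keep the one whose worst-off agent is best off.
from itertools import product

def maximin_allocation(n_agents, utils):
    """utils[i][g] = agent i's (additive) value for good g."""
    goods = range(len(utils[0]))
    best_value, best_assignment = -1, None
    for assignment in product(range(n_agents), repeat=len(utils[0])):
        value = min(
            sum(utils[i][g] for g in goods if assignment[g] == i)
            for i in range(n_agents)
        )
        if value > best_value:
            best_value, best_assignment = value, assignment
    return best_value, best_assignment

utils = [[8, 1, 3], [2, 6, 4]]  # two agents, three goods
print(maximin_allocation(2, utils))  # (8, (0, 1, 1)): agent 0 gets good 0
```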


2021 ◽  
Vol 13 (4) ◽  
pp. 1-24
Author(s):  
Jessica Chen ◽  
Henry Milner ◽  
Ion Stoica ◽  
Jibin Zhan

HTTP adaptive streaming copes with fluctuating network conditions during playback by dynamically adjusting the size of the chunks to be downloaded. The bitrate selection in this adjustment inevitably involves predicting the future throughput of a video session, for which various heuristic solutions have been explored. The goal of the present work is to establish theoretical upper bounds on the QoE that any ABR algorithm can possibly reach, an essential step towards benchmarking the performance of ABR algorithms. In our setting, the QoE is defined as a linear combination of the average perceptual quality and the buffering ratio. The optimization problem is proven to be NP-hard when the perceptual quality is determined by chunk size, and conditions are given under which the problem becomes polynomially solvable. Aided by a global lower bound, a pseudo-polynomial time algorithm based on dynamic programming is presented. When minimum buffering is given priority over higher perceptual quality, the problem remains NP-hard, and the above algorithm is simplified and accelerated by a sequence of lower bounds on the completion time of chunk downloading, which, in our experiments, reduces computation time by 36.0%. To handle large inputs more efficiently, a polynomial-time approximation algorithm is also introduced for the case where minimum buffering is prioritized. Besides its performance guarantee, this algorithm attains 99.938% of the optimal value while taking only 0.024% of the computation time of the exact dynamic programming algorithm.
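
A minimal sketch of the QoE objective as defined above, a linear combination of average perceptual quality and buffering ratio; the weight and the session values below are made-up placeholders.

```python
# A minimal sketch: QoE = average perceptual quality - lam * buffering ratio.

def qoe(chunk_qualities, buffering_time, total_time, lam=1.0):
    avg_quality = sum(chunk_qualities) / len(chunk_qualities)
    buffering_ratio = buffering_time / total_time
    return avg_quality - lam * buffering_ratio

# A session of five chunks with 2s of stalls over a 60s playback:
print(qoe([0.7, 0.8, 0.8, 0.9, 0.6], buffering_time=2.0, total_time=60.0, lam=3.0))
```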


1996 ◽  
Vol 07 (01) ◽  
pp. 23-41
Author(s):  
MARTIN FÜRER ◽  
WEBB MILLER

An alignment of k given sequences is a k-rowed matrix frequently used by molecular biologists to display correspondences between entries from each sequence. Under one approach, an alignment is represented by a matrix of 'x' and '-' characters, where each x in row r indicates the position of an entry of sequence r. It is sometimes efficient to store only the run-length encoding of each row of this bit-matrix. A natural class of commands for editing one such row into another consists of operations of the form: "Move the d dashes that begin at position i of row r to position j of that row," for relevant values of r, d, i and j. We show that the problem of determining a shortest sequence of such operations that converts one given alignment to another is NP-hard and give a polynomial-time algorithm that always comes within a factor 5/4 of optimality. An application of these ideas to alignments of long DNA sequences is discussed.
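
The following sketch illustrates the row representation and one plausible reading of the move operation: run-length encoding a row of 'x' and '-' characters, and moving a block of d dashes from position i to position j. The exact indexing convention is an assumption.

```python
# A minimal sketch: run-length encode an alignment row, and apply one
# "move d dashes from position i to position j" edit to the plain string.
from itertools import groupby

def run_length_encode(row):
    return [(ch, sum(1 for _ in grp)) for ch, grp in groupby(row)]

def move_dashes(row, d, i, j):
    """Remove the d dashes beginning at 0-based position i, reinsert at j."""
    assert row[i:i + d] == "-" * d
    without = row[:i] + row[i + d:]
    return without[:j] + "-" * d + without[j:]

row = "xx--xxx-x"
print(run_length_encode(row))  # [('x', 2), ('-', 2), ('x', 3), ('-', 1), ('x', 1)]
print(move_dashes(row, d=2, i=2, j=5))  # 'xxxxx---x'
```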


1998 ◽  
Vol 7 (4) ◽  
pp. 375-386 ◽  
Author(s):  
THOMAS EMDEN-WEINERT ◽  
STEFAN HOUGARDY ◽  
BERND KREUTER

For any integer k, we prove the existence of a uniquely k-colourable graph of girth at least g on at most $k^{12(g+1)}$ vertices whose maximal degree is at most $5k^{13}$. From this we deduce that, unless NP=RP, no polynomial time algorithm for k-Colourability on graphs $G$ of girth $g(G) \ge \log|G| / (13 \log k)$ and maximum degree $\Delta(G) \le 6k^{13}$ can exist. We also study several related problems.


2008 ◽  
Vol 17 (2) ◽  
pp. 265-270 ◽  
Author(s):  
H. A. KIERSTEAD ◽  
A. V. KOSTOCHKA

A proper vertex colouring of a graph is equitable if the sizes of colour classes differ by at most one. We present a new shorter proof of the celebrated Hajnal–Szemerédi theorem: for every positive integer r, every graph with maximum degree at most r has an equitable colouring with r+1 colours. The proof yields a polynomial time algorithm for such colourings.
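
The theorem's proof and the algorithm it yields are not reproduced here, but the definition is easy to check; a minimal sketch verifying that a given colouring is proper and equitable:

```python
# A minimal sketch: a colouring is equitable if it is proper and the sizes
# of its colour classes differ by at most one.
from collections import Counter

def is_equitable(colouring, adj):
    proper = all(colouring[u] != colouring[v] for u in adj for v in adj[u])
    sizes = Counter(colouring.values()).values()
    return proper and max(sizes) - min(sizes) <= 1

# A path a-b-c-d with 2 colours: classes {a, c} and {b, d} have equal size.
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
print(is_equitable({"a": 0, "b": 1, "c": 0, "d": 1}, adj))  # True
```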


2019 ◽  
Vol 28 (1) ◽  
pp. 1-13
Author(s):  
Abra Brisbin ◽  
Manda Riehl ◽  
Noah Williams

Permutations are frequently used in solving the genome rearrangement problem, whose goal is finding the shortest sequence of mutations transforming one genome into another. We introduce the Deletion-Insertion model (DI) to model small-scale mutations in species with linear chromosomes, such as humans. Applying one restriction to this model, we obtain the transposition model for genome rearrangement, which was shown to be NP-hard in [4]. We use combinatorial reasoning and permutation statistics to develop a polynomial-time algorithm to approximate the minimum number of transpositions required in the transposition model and to analyze the sharpness of several bounds on transpositions between genomes.
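
For concreteness, a hedged sketch of the basic operation in the transposition model: excising a block of a permutation and reinserting it elsewhere. Indexing conventions differ across the literature, so this one is illustrative.

```python
# A minimal sketch: a transposition moves the block pi[i:j] so that it is
# reinserted before (original) position k, with i < j <= k.

def transpose(pi, i, j, k):
    assert 0 <= i < j <= k <= len(pi)
    return pi[:i] + pi[j:k] + pi[i:j] + pi[k:]

genome = [3, 1, 4, 2, 5]
print(transpose(genome, 1, 3, 5))   # [3, 2, 5, 1, 4]
print(transpose([2, 1, 3], 0, 1, 2))  # [1, 2, 3]: one transposition sorts it
```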


Author(s):  
Tobias Harks ◽  
Veerle Timmermans

We study the equilibrium computation problem for two classical resource allocation games: atomic splittable congestion games and multimarket Cournot oligopolies. For atomic splittable congestion games with singleton strategies and player-specific affine cost functions, we devise the first polynomial time algorithm computing a pure Nash equilibrium. Our algorithm is combinatorial and computes the exact equilibrium assuming rational input. The idea is to compute an equilibrium for an associated integrally-splittable singleton congestion game, in which the players can only split their demands in integral multiples of a common packet size. While integral games have been considered in the literature before, no polynomial time algorithm computing an equilibrium was known. Also for this class, we devise the first polynomial time algorithm and use it as a building block for our main algorithm. We then develop a polynomial time computable transformation mapping a multimarket Cournot competition game with firm-specific affine price functions and quadratic costs to an associated atomic splittable congestion game as described above. The transformation preserves equilibria in either game and thus leads, via our first algorithm, to a polynomial time algorithm computing Cournot equilibria. Finally, our analysis for integrally-splittable games implies new bounds on the difference between real and integral Cournot equilibria. The bounds can be seen as a generalization of the recent bounds for single-market oligopolies obtained by Todd (Math. Oper. Res. 41(3):1125–1134, 2016, doi:10.1287/moor.2015.0771).
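
The paper's polynomial-time algorithm is more refined, but the integrally-splittable model can be illustrated with simple unit-move improvement dynamics: each step moves one packet of some player to a resource that strictly lowers that player's cost. All parameters below are made up, and the step cap guards against non-termination in general.

```python
# An illustrative sketch (not the paper's algorithm): unit-move improvement
# dynamics for an integrally-splittable singleton congestion game with
# player-specific affine costs c_i_r(load) = a[i][r] * load + b[i][r].

def player_cost(i, x, a, b):
    """Cost of player i: own load on each resource times its unit cost."""
    m = len(a[0])
    loads = [sum(xp[r] for xp in x) for r in range(m)]
    return sum(x[i][r] * (a[i][r] * loads[r] + b[i][r]) for r in range(m))

def improving_move(x, a, b):
    """Find (i, r, s) such that player i gains by moving one packet r -> s."""
    n, m = len(x), len(a[0])
    for i in range(n):
        base = player_cost(i, x, a, b)
        for r in range(m):
            if x[i][r] == 0:
                continue
            for s in range(m):
                if s == r:
                    continue
                x[i][r] -= 1; x[i][s] += 1          # tentative move
                better = player_cost(i, x, a, b) < base - 1e-12
                x[i][s] -= 1; x[i][r] += 1          # undo
                if better:
                    return i, r, s
    return None

def unit_dynamics(x, a, b, max_steps=10_000):
    for _ in range(max_steps):
        move = improving_move(x, a, b)
        if move is None:
            return x  # no player gains by moving a single packet
        i, r, s = move
        x[i][r] -= 1; x[i][s] += 1
    return x

# Two players, two resources; player 0 starts with all 4 packets on resource 0.
x = [[4, 0], [2, 0]]
a = [[2.0, 1.0], [1.0, 3.0]]
b = [[0.0, 0.0], [0.0, 1.0]]
print(unit_dynamics(x, a, b))
```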

