complexity dichotomy
Recently Published Documents

Total documents: 32 (five years: 13)
H-index: 6 (five years: 1)

2022 ◽ Volume 18, Issue 1
Author(s): Antoine Amarilli, İsmail İlkan Ceylan

We study the problem of query evaluation on probabilistic graphs, namely, tuple-independent probabilistic databases over signatures of arity two. We focus on the class of queries closed under homomorphisms, or, equivalently, the infinite unions of conjunctive queries. Our main result states that the probabilistic query evaluation problem is #P-hard for all unbounded queries from this class. As bounded queries from this class are equivalent to a union of conjunctive queries, they are already classified by the dichotomy of Dalvi and Suciu (2012). Hence, our result and theirs imply a complete data complexity dichotomy, between polynomial time and #P-hardness, on evaluating homomorphism-closed queries over probabilistic graphs. This dichotomy covers in particular all fragments of infinite unions of conjunctive queries over arity-two signatures, such as negation-free (disjunctive) Datalog, regular path queries, and a large class of ontology-mediated queries. The dichotomy also applies to a restricted case of probabilistic query evaluation called generalized model counting, where fact probabilities must be 0, 0.5, or 1. We show the main result by reducing from the problem of counting the valuations of positive partitioned 2-DNF formulae, or from the source-to-target reliability problem in an undirected graph, depending on properties of minimal models for the query.
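For intuition on why exact evaluation is hard, here is a minimal brute-force sketch of probabilistic query evaluation on a tuple-independent probabilistic graph. Everything in it is an assumption for illustration: the edge encoding, the example query (source-to-target reachability, a homomorphism-closed query matching the reliability problem mentioned above), and the 0.5 probabilities (the generalized model counting regime). It enumerates all 2^|E| possible worlds, which is exactly the blow-up that the dichotomy separates from the polynomial-time cases.

```python
from itertools import product

def prob_query_eval(prob_edges, query_holds):
    """Exact probabilistic query evaluation over a tuple-independent
    probabilistic graph: sum the probabilities of the possible worlds
    (edge subsets) in which the query holds. Exponential in the number
    of edges; for illustration only."""
    edges = list(prob_edges)
    total = 0.0
    for keep in product([False, True], repeat=len(edges)):
        world = {e for e, k in zip(edges, keep) if k}
        weight = 1.0
        for e, k in zip(edges, keep):
            weight *= prob_edges[e] if k else 1.0 - prob_edges[e]
        if query_holds(world):
            total += weight
    return total

def reaches(world, source, target):
    """Example homomorphism-closed query: is target reachable from source?"""
    frontier, seen = {source}, {source}
    while frontier:
        frontier = {v for (u, v) in world if u in frontier} - seen
        seen |= frontier
    return target in seen

# Two parallel length-2 paths from s to t, every edge present with probability
# 0.5 (the generalized model counting regime mentioned in the abstract).
edges = {("s", "a"): 0.5, ("a", "t"): 0.5, ("s", "b"): 0.5, ("b", "t"): 0.5}
print(prob_query_eval(edges, lambda w: reaches(w, "s", "t")))  # -> 0.4375
```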


2021 ◽ Vol 11 (1)
Author(s): Xiangyan Liu, Jianhong Zheng, Meng Zhang, Yang Li, Rui Wang, ...

Device-to-device (D2D) communications and mobile edge computing (MEC) are increasingly used to relieve traffic overload in cellular networks. By jointly considering computation capability and the maximum tolerable delay, resource-constrained terminals offload parts of their computation-intensive tasks either to a nearby device via a D2D connection or to an edge server deployed at a base station via a cellular connection. In this paper, a novel cellular D2D–MEC scheme is proposed that enables task offloading and resource allocation while improving the execution efficiency of each device at low latency. We consider a partial offloading strategy and divide each task into local and remote computing parts, which can be executed in parallel under different computational modes. Instead of allocating system resources from a macroscopic view, we study both the task offloading strategy and the computing efficiency of each device from a microscopic perspective. Taking both the task offloading policy and the computation resource allocation into account, the optimization problem is formulated as maximizing computing efficiency. Since the formulated problem is a mixed-integer non-linear program, we propose a two-phase heuristic algorithm that jointly considers helper selection and computation resource allocation. In the first phase, we obtain a suboptimal helper selection policy; in the second phase, the MEC computation resource allocation strategy is derived. The proposed low-complexity dichotomy algorithm (LCDA) is used to match subtask-helper pairs. Simulation results demonstrate the superiority of the proposed D2D-enhanced MEC system over some traditional D2D–MEC algorithms.
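To make the two-phase structure concrete, here is a simplified sketch. The efficiency model, the greedy score, and all parameter names are assumptions for illustration; this is not the paper's actual LCDA or system model. Phase 1 greedily picks a helper per subtask; phase 2 splits the MEC server's cycle budget by bisection on a Lagrange multiplier, i.e., a dichotomy search, which works because the allocation is monotone in the multiplier.

```python
import math

def phase1_select_helpers(subtasks, helpers):
    """Phase 1 (toy): greedily assign each subtask to the helper that
    minimizes a completion-time estimate (link delay plus compute time)."""
    return {tid: min(helpers,
                     key=lambda h: helpers[h]["delay"] + t["c"] / helpers[h]["cpu"])
            for tid, t in subtasks.items()}

def phase2_allocate_mec(subtasks, total_cycles):
    """Phase 2 (toy): split the MEC cycle budget F across offloaded subtasks,
    maximizing sum_i b_i / (a_i + c_i / f_i) subject to sum_i f_i <= F,
    by bisection on the Lagrange multiplier lam."""
    def alloc(lam):
        # KKT condition b*c / (a*f + c)^2 = lam, solved for f and clipped at 0.
        return {tid: max((math.sqrt(t["b"] * t["c"] / lam) - t["c"]) / t["a"], 0.0)
                for tid, t in subtasks.items()}

    lo, hi = 1e-12, 1e12          # bracket for lam; allocation shrinks as lam grows
    for _ in range(200):          # dichotomy: shrink the bracket geometrically
        lam = math.sqrt(lo * hi)
        if sum(alloc(lam).values()) > total_cycles:
            lo = lam              # over budget -> raise the price
        else:
            hi = lam              # within budget -> try a lower price
    return alloc(hi)              # feasible allocation at the converged price

if __name__ == "__main__":
    # Toy data: b = task value, c = required cycles, a = fixed latency term.
    subtasks = {1: {"b": 8.0, "c": 2.0, "a": 0.5},
                2: {"b": 3.0, "c": 1.0, "a": 0.2}}
    helpers = {"d2d_peer": {"cpu": 1.0, "delay": 0.01},
               "edge":     {"cpu": 4.0, "delay": 0.05}}
    print(phase1_select_helpers(subtasks, helpers))
    print(phase2_allocate_mec(subtasks, total_cycles=5.0))
```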


2021 ◽ Vol 13 (2), pp. 1-25
Author(s): Jin-yi Cai, Artem Govorov

Graph homomorphism has been an important research topic since its introduction [20]. Stated in the language of binary relational structures in that paper [20], Lovász proved a fundamental theorem: for a graph H given by its 0-1 valued adjacency matrix, the graph homomorphism function G ↦ hom(G, H) determines the isomorphism type of H. In the past 50 years, various extensions have been proved by many researchers [1, 15, 21, 24, 26]. These extend the basic 0-1 case to admit vertex and edge weights, but they all carry restrictions, such as requiring all vertex weights to be positive. In this article, we prove a general form of this theorem in which H can have arbitrary vertex and edge weights. A noteworthy aspect is that we prove this by a surprisingly simple and unified argument, which bypasses various technical obstacles and unifies and extends all previously known versions of this theorem on graphs. The constructive proof of our theorem can be used to make various complexity dichotomy theorems for graph homomorphism effective in the following sense: it provides an algorithm that, for any H, either outputs a P-time algorithm solving hom(⋅, H) or a P-time reduction from a canonical #P-hard problem to hom(⋅, H).
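For concreteness, here is a minimal sketch of the weighted quantity hom(G, H) discussed above. The graph encoding and weight conventions are assumptions for illustration, and the brute force is exponential in |V(G)|.

```python
from itertools import product

def hom(G_edges, n_G, alpha, beta):
    """Brute-force weighted homomorphism count:
    hom(G, H) = sum over all maps phi: V(G) -> V(H) of
    prod_v alpha[phi(v)] * prod_{(u, v) in E(G)} beta[phi(u)][phi(v)].
    With unit vertex weights and a 0-1 matrix beta this is the plain
    homomorphism count. For illustration only."""
    n_H = len(alpha)
    total = 0.0
    for phi in product(range(n_H), repeat=n_G):   # every vertex map G -> H
        term = 1.0
        for v in range(n_G):
            term *= alpha[phi[v]]                 # vertex weights of the images
        for (u, v) in G_edges:
            term *= beta[phi[u]][phi[v]]          # edge weights of the images
        total += term
    return total

# Example: H = K_2 with unit vertex weights, G = a single edge.
alpha = [1.0, 1.0]
beta = [[0.0, 1.0], [1.0, 0.0]]
print(hom([(0, 1)], n_G=2, alpha=alpha, beta=beta))   # -> 2.0
```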


2021 ◽ pp. 2150020
Author(s): Manuel Bodirsky, Thomas Quinn-Gregson

We study the computational complexity of deciding whether a given set of term equalities and inequalities has a solution in an ω-categorical algebra A. There are ω-categorical groups where this problem is undecidable. We show that if A is an ω-categorical semilattice or an abelian group, then the problem is in P or NP-hard. The hard cases are precisely those where A has a uniformly continuous minor-preserving map to the clone of projections on a two-element set. The results provide information about algebras A that do not satisfy this condition, and they are of independent interest in universal algebra. In our proofs we rely on the Barto–Pinsker theorem about the existence of pseudo-Siggers polymorphisms. To the best of our knowledge, this is the first time that the pseudo-Siggers identity has been used to prove a complexity dichotomy.
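For reference, the pseudo-Siggers condition mentioned above, in the form used by the Barto–Pinsker theorem (the exact formulation below is recalled as an assumption, not quoted from the paper): a 6-ary polymorphism s is pseudo-Siggers if there exist unary operations e_1 and e_2 such that

\[
e_1 \circ s(x, y, x, z, y, z) \;=\; e_2 \circ s(y, x, z, x, z, y) \qquad \text{for all } x, y, z.
\]

Dropping e_1 and e_2 yields the classical six-variable Siggers identity from finite universal algebra.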


2020 ◽ Vol 34 (02), pp. 1846-1853
Author(s): Robert Bredereck, Andrzej Kaczmarczyk, Rolf Niedermeier

We introduce successive committee elections. The point is that our new model additionally takes into account that "committee members" should serve only a short term of office, possibly over a consecutive time period (e.g., to limit the influence of elitist power cartels or to keep the social costs of overloading committees as small as possible), while at the same time overly frequent elections are to be avoided (e.g., for the sake of long-term planning). Thus, given voter preferences over a set of candidates, a desired committee size, a number of committees to be elected, and an upper bound on the number of committees that each candidate can participate in, the goal is to find a "best possible" series of committees representing the electorate. We show a sharp complexity dichotomy between computing series of committees of size at most two (mostly polynomial-time solvable) and of size at least three (mostly NP-hard). Depending on the voting rule, however, even for larger committee sizes we can spot some tractable cases.
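As a toy illustration of the search space, the sketch below enumerates every series of tau committees of size k in which no candidate serves on more than ell committees and keeps the series with the highest total approval score. The approval-based scoring rule and all encodings are assumptions for illustration, not the paper's model or voting rules.

```python
from itertools import combinations, product
from collections import Counter

def best_committee_series(candidates, approvals, k, tau, ell):
    """Brute-force search over all series of tau size-k committees in which
    no candidate serves on more than ell committees, maximizing the total
    approval score. Exponential; for illustration only."""
    committees = list(combinations(candidates, k))
    best_score, best_series = None, None
    for series in product(committees, repeat=tau):
        load = Counter(c for committee in series for c in committee)
        if any(count > ell for count in load.values()):
            continue                      # someone exceeds the term cap
        score = sum(len(set(committee) & ballot)
                    for committee in series for ballot in approvals)
        if best_score is None or score > best_score:
            best_score, best_series = score, series
    return best_series, best_score

# Example: 4 candidates, 3 approval ballots, committees of size 2, a series of
# 2 committees, and each candidate may serve on at most one committee.
cands = ["a", "b", "c", "d"]
ballots = [{"a", "b"}, {"b", "c"}, {"a", "d"}]
print(best_committee_series(cands, ballots, k=2, tau=2, ell=1))
```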

