Weighted Envy-freeness in Indivisible Item Allocation

2021, Vol 9 (3), pp. 1-39
Author(s): Mithun Chakraborty, Ayumi Igarashi, Warut Suksompong, Yair Zick

We introduce and analyze new envy-based fairness concepts for agents with weights that quantify their entitlements in the allocation of indivisible items. We propose two variants of weighted envy-freeness up to one item (WEF1): strong, where envy can be eliminated by removing an item from the envied agent's bundle, and weak, where envy can be eliminated either by removing an item (as in the strong version) or by replicating an item from the envied agent's bundle in the envying agent's bundle. We show that for additive valuations, an allocation that is both Pareto optimal and strongly WEF1 always exists and can be computed in pseudo-polynomial time; moreover, an allocation that maximizes the weighted Nash social welfare may not be strongly WEF1, but it always satisfies the weak version of the property. We also establish that a generalization of the round-robin picking sequence algorithm produces a strongly WEF1 allocation in polynomial time for an arbitrary number of agents; for two agents, we can efficiently achieve both strong WEF1 and Pareto optimality by adapting the adjusted winner procedure. Our work highlights several aspects in which weighted fair division is richer and more challenging than its unweighted counterpart.
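As a sketch of what a weighted generalization of round-robin can look like, the snippet below uses a divisor-style rule in which the agent with the smallest (picks so far + 1) / weight ratio picks next; this rule, the tie-breaking, and all names here are illustrative assumptions rather than the authors' exact algorithm.

```python
from typing import Dict, List

def weighted_picking_sequence(weights: Dict[str, float],
                              valuations: Dict[str, Dict[str, float]],
                              items: List[str]) -> Dict[str, List[str]]:
    """Allocate items one at a time; at each step the agent whose
    (picks so far + 1) / weight ratio is smallest picks its favourite
    remaining item.  One natural weighted generalization of round-robin,
    not necessarily the exact rule from the paper."""
    bundles = {a: [] for a in weights}
    remaining = set(items)
    while remaining:
        # Divisor-style priority: lower ratio = picks next.
        picker = min(weights, key=lambda a: (len(bundles[a]) + 1) / weights[a])
        best = max(remaining, key=lambda g: valuations[picker][g])
        bundles[picker].append(best)
        remaining.remove(best)
    return bundles

# Toy usage: agent "a" has twice the entitlement of agent "b".
weights = {"a": 2.0, "b": 1.0}
valuations = {"a": {"g1": 5, "g2": 3, "g3": 1},
              "b": {"g1": 4, "g2": 4, "g3": 2}}
print(weighted_picking_sequence(weights, valuations, ["g1", "g2", "g3"]))
```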

2020, Vol 68, pp. 225-245
Author(s): Peter McGlaughlin, Jugal Garg

We consider the problem of fairly allocating a set of indivisible goods among n agents. Various fairness notions have been proposed within the rapidly growing field of fair division, but the Nash social welfare (NSW) serves as a focal point. In part, this follows from the 'unreasonable' fairness guarantees it provides, in the sense that a max NSW allocation meets multiple other fairness metrics simultaneously, all while satisfying a standard economic notion of efficiency, Pareto optimality. However, existing approximation algorithms fail to satisfy all of the remarkable fairness guarantees offered by a max NSW allocation, instead targeting only the specific NSW objective. We address this issue by presenting, in strongly polynomial time, an allocation that is simultaneously a 2-approximation of the max NSW, Prop-1, 1/(2n)-MMS, and Pareto optimal. Our techniques are based on a market interpretation of a fractional max NSW allocation. We present novel definitions of fairness concepts in terms of market prices, and design a new scheme to round a market equilibrium into an integral allocation in a way that preserves most of the fairness properties of an integral max NSW allocation.
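To make the stacked guarantees concrete, the brute-force sketch below finds a max NSW allocation for a tiny additive instance and checks Prop-1; it is purely illustrative (exponential search, hypothetical names), not the market-based strongly polynomial algorithm described in the abstract.

```python
from itertools import product
from math import prod

# Additive valuations: vals[i][j] = agent i's value for item j.
vals = [[4, 1, 2], [2, 3, 3]]
n, m = len(vals), len(vals[0])

def bundles(assignment):
    """assignment[j] = index of the agent receiving item j."""
    return [[j for j in range(m) if assignment[j] == i] for i in range(n)]

def utility(i, bundle):
    return sum(vals[i][j] for j in bundle)

# Exhaustive search for a max Nash-social-welfare allocation.
best = max(product(range(n), repeat=m),
           key=lambda a: prod(utility(i, b) for i, b in enumerate(bundles(a))))
alloc = bundles(best)

def prop1(i, bundle):
    """Prop-1: agent i reaches a 1/n share of its value for all items,
    possibly after adding one single item it does not own."""
    fair_share = sum(vals[i]) / n
    outside = [j for j in range(m) if j not in bundle]
    return utility(i, bundle) >= fair_share or \
        any(utility(i, bundle) + vals[i][j] >= fair_share for j in outside)

print(alloc, [prop1(i, b) for i, b in enumerate(alloc)])
```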


Author(s): Nawal Benabbou, Mithun Chakraborty, Edith Elkind, Yair Zick

In this paper, we study the problem of matching a set of items to a set of agents partitioned into types, so as to balance fairness towards the types against overall utility/efficiency. We extend multiple desirable properties of indivisible goods allocation to our model and investigate the possibility and hardness of achieving combinations of these properties; e.g., we prove that maximizing utilitarian social welfare under constraints of typewise envy-freeness up to one item (TEF1) is computationally intractable. We also define a new concept of waste for this setting, show experimentally that augmenting an existing algorithm with a marginal utility maximization heuristic can produce a TEF1 solution with reduced waste, and provide a polynomial-time algorithm for computing a non-wasteful TEF1 allocation for binary agent-item utilities.
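For intuition, the sketch below checks TEF1 for a given matching under one common reading of the model, in which a type's utility for a bundle is the value of an optimal re-matching of its member agents to those items; the exact definitions in the paper may differ, and all names here are illustrative.

```python
from itertools import permutations

def type_utility(agent_vals, bundle):
    """Best total value a type can get from `bundle` by re-matching its
    member agents to those items, at most one item per agent.
    agent_vals: one dict per member agent, mapping item -> value.
    Brute force; for tiny illustrative instances only."""
    n = len(agent_vals)
    padded = list(bundle) + [None] * max(0, n - len(bundle))
    best = 0
    for assignment in permutations(padded, n):
        best = max(best, sum(agent_vals[i].get(g, 0)
                             for i, g in enumerate(assignment) if g is not None))
    return best

def is_tef1(types, allocation):
    """types[t]: value dicts of type t's member agents.
    allocation[t]: items currently matched to type t's members.
    TEF1: for every ordered pair of types, any envy disappears after
    removing some single item from the envied type's bundle."""
    for g, vals_g in enumerate(types):
        own = type_utility(vals_g, allocation[g])
        for h, items_h in enumerate(allocation):
            if h == g:
                continue
            no_envy = type_utility(vals_g, items_h) <= own
            up_to_one = any(
                type_utility(vals_g, [x for x in items_h if x != d]) <= own
                for d in items_h)
            if not (no_envy or up_to_one):
                return False
    return True

# Two types with binary-utility agents and four items g1..g4.
types = [[{"g1": 1, "g2": 1}, {"g3": 1}],
         [{"g2": 1, "g4": 1}, {"g1": 1, "g4": 1}]]
print(is_tef1(types, [["g1", "g3"], ["g2", "g4"]]))
```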


1986, Vol 9 (3), pp. 323-342
Author(s): Joseph Y.-T. Leung, Burkhard Monien

We consider the computational complexity of finding an optimal deadlock recovery. It is known that for an arbitrary number of resource types the problem is NP-hard even when the total cost of deadlocked jobs and the total number of resource units are “small” relative to the number of deadlocked jobs. It is also known that for one resource type the problem is NP-hard when the total cost of deadlocked jobs and the total number of resource units are “large” relative to the number of deadlocked jobs. In this paper we show that for one resource type the problem is solvable in polynomial time when the total cost of deadlocked jobs or the total number of resource units is “small” relative to the number of deadlocked jobs. For fixed m ⩾ 2 resource types, we show that the problem is solvable in polynomial time when the total number of resource units is “small” relative to the number of deadlocked jobs. On the other hand, when the total number of resource units is “large”, the problem becomes NP-hard even when the total cost of deadlocked jobs is “small” relative to the number of deadlocked jobs. The results in this paper, together with previously known ones, give a complete delineation of the complexity of this problem under various assumptions on the input parameters.


2021, Vol 9 (2), pp. 1-19
Author(s): Z. Li, A. Vetta

We consider the fair division of indivisible items using the maximin shares measure. Recent work on the topic has focused on extending results beyond the class of additive valuation functions. In this spirit, we study the case where the items form a hereditary set system. We present a simple algorithm that allocates each agent a bundle of items whose value is at least 0.3666 times the maximin share of the agent. This improves upon the current best known guarantee of 0.2 due to Ghodsi et al. The analysis of the algorithm is almost tight; we present an instance where the algorithm provides a guarantee of at most 0.3738. We also show that the algorithm can be implemented in polynomial time given a valuation oracle for each agent.
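For reference, an agent's maximin share is the best value it can guarantee by partitioning all items into n bundles and receiving the worst one; the brute-force sketch below computes it for a small additive instance (the hereditary set system constraint studied in the paper is ignored here, and the names are illustrative).

```python
from itertools import product

def maximin_share(values, n):
    """values[j] = the agent's value for item j (additive valuation).
    Returns the max over partitions into n bundles of the minimum
    bundle value.  Exponential in the number of items; illustration only."""
    m = len(values)
    best = 0
    for assignment in product(range(n), repeat=m):
        bundle_vals = [sum(values[j] for j in range(m) if assignment[j] == b)
                       for b in range(n)]
        best = max(best, min(bundle_vals))
    return best

# With 3 agents, this agent can guarantee itself a bundle worth 7
# (e.g. the partition {6,1}, {5,2}, {4,3}).
print(maximin_share([6, 5, 4, 3, 2, 1], 3))
```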


2020, Vol 55 (3), pp. 523-545
Author(s): Xiaohui Bei, Guangda Huzhang, Warut Suksompong

We study the problem of fairly dividing a heterogeneous resource, commonly known as cake cutting and chore division, in the presence of strategic agents. While a number of results in this setting have been established in previous works, they rely crucially on the free disposal assumption, meaning that the mechanism is allowed to throw away part of the resource at no cost. In the present work, we remove this assumption and focus on mechanisms that always allocate the entire resource. We exhibit a truthful and envy-free mechanism for cake cutting and chore division for two agents with piecewise uniform valuations, and we complement our result by showing that such a mechanism does not exist when certain additional constraints are imposed on the mechanisms. Moreover, we provide bounds on the efficiency of mechanisms satisfying various properties, and give truthful mechanisms for multiple agents with restricted classes of valuations.
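As a small illustration of the valuation class mentioned at the end, a piecewise uniform agent values a piece of the cake in proportion to its overlap with a fixed set of desired subintervals; the sketch below evaluates such a valuation (the interval representation and normalization are assumptions made for illustration).

```python
def piecewise_uniform_value(desired, piece):
    """desired, piece: lists of disjoint (start, end) subintervals of [0, 1].
    The value of `piece` is its total overlap with `desired`, normalized
    so that the whole cake is worth 1 to the agent."""
    def overlap(a, b):
        return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    total_desired = sum(e - s for s, e in desired)
    got = sum(overlap(d, p) for d in desired for p in piece)
    return got / total_desired if total_desired > 0 else 0.0

# The agent desires [0, 0.2] and [0.5, 0.9]; the piece [0.1, 0.6]
# overlaps the desired set in length 0.2, i.e. a third of its value.
print(piecewise_uniform_value([(0.0, 0.2), (0.5, 0.9)], [(0.1, 0.6)]))
```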


1973, Vol 95 (4), pp. 356-361
Author(s): G. Leitmann, W. Schmitendorf

We consider the optimal control problem with vector-valued criterion (including cooperative games) and seek Pareto-optimal (noninferior) solutions. Scalarization results, together with modified sufficiency theorems from optimal control theory, are used to deduce sufficient conditions for Pareto-optimality. The utilization of these conditions is illustrated by various examples.
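The textbook scalarization fact underlying this approach, stated here in general form (not the authors' precise theorem), is that any minimizer of a positively weighted sum of the criteria is Pareto optimal:

```latex
% Weighted-sum scalarization: a sufficient condition for Pareto optimality.
\[
  u^{*} \in \arg\min_{u \in \mathcal{U}} \; \sum_{i=1}^{k} \lambda_{i}\, J_{i}(u),
  \qquad \lambda_{i} > 0 \;\; (i = 1, \dots, k),
\]
\[
  \text{implies that no admissible } u \text{ satisfies }
  J_{i}(u) \le J_{i}(u^{*}) \text{ for all } i
  \text{ and } J_{j}(u) < J_{j}(u^{*}) \text{ for some } j.
\]
```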


2019
Author(s): Aba Szollosi, David Kellen, Danielle Navarro, Rich Shiffrin, Iris van Rooij, ...

Proponents of preregistration argue that, among other benefits, it improves the diagnosticity of statistical tests [1]. In the strong version of this argument, preregistration does this by solving statistical problems, such as family-wise error rates. In the weak version, it nudges people to think more deeply about their theories, methods, and analyses. We argue against both: the diagnosticity of statistical tests depends entirely on how well statistical models map onto underlying theories, and so improving statistical techniques does little to improve theories when the mapping is weak. There is also little reason to expect that preregistration will spontaneously help researchers to develop better theories (and, hence, better methods and analyses).


Author(s): Carleilton Severino Silva

Since 1742, the year in which the Prussian mathematician Christian Goldbach wrote a letter to Leonhard Euler stating his conjecture in its weak version, mathematicians have been working on the problem, and the tools of number theory have grown increasingly sophisticated through attempts at its resolution. Euler himself said he was unable to prove it. The weak conjecture, in its modern version, states the following: any odd number greater than 5 can be written as the sum of 3 primes. In response to Goldbach's letter, Euler reminded him of a conversation in which he had proposed what is now known as Goldbach's strong conjecture: any even number greater than 2 can be written as the sum of 2 prime numbers. The most notable result came in 2013, with a proof of the weak version by the Peruvian mathematician Harald Helfgott; the strong version, however, remains without a definitive proof. The weak version will not be treated in this article, since it follows without major difficulty as a corollary of the strong version. Despite the enormous intellectual resources that great mathematicians have brought to bear over the centuries, the conjecture in question has been neither proved nor refuted to this day.
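Both versions are easy to check numerically for small inputs; the sketch below verifies the strong form on a small range of even numbers, which of course is no substitute for a proof.

```python
def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def goldbach_pair(n):
    """Return a pair of primes summing to the even number n, if one exists."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return p, n - p
    return None

# The strong conjecture holds for every even number in this small range.
assert all(goldbach_pair(n) is not None for n in range(4, 1000, 2))
print(goldbach_pair(100))  # e.g. (3, 97)
```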


1970, Vol 18 (1), pp. 73-92
Author(s): Yishai A. Cohen

In this paper I articulate and defend a new anti-theodicy challenge to Skeptical Theism. More specifically, I defend the Threshold Problem, according to which there is a threshold to the kinds of evils that are in principle justifiable for God to permit, and certain instances of evil are beyond that threshold. I further argue that Skeptical Theism does not have the resources to adequately rebut the Threshold Problem. I argue for this claim by drawing a distinction between a weak and a strong version of Skeptical Theism, such that the strong version must be defended in order to rebut the Threshold Problem. However, the skeptical theist's appeal to our limited cognitive faculties only supports the weak version.

