Implementation by Vote-Buying Mechanisms

2021, Vol. 111 (9), pp. 2811-2828
Author(s): Jon X. Eguia, Dimitrios Xefteris

Vote-buying mechanisms allow agents to express any level of support for their preferred alternative at an increasing cost. Focusing on large societies with wealth inequality, we prove that the family of binary social choice rules implemented by well-behaved vote-buying mechanisms is indexed by a single parameter, which determines the importance assigned to the agents’ willingness to pay to affect outcomes and to the number of supporters for each alternative. This parameter depends solely on the elasticity of the cost function near its origin: as this elasticity decreases, the intensities of support matter relatively more for outcomes than the supporters’ count. (JEL D63, D71, D72)
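
To make the mechanism concrete, here is a minimal sketch (hypothetical, not the authors' model: the power cost |v|^alpha, the use of alpha for the near-origin cost elasticity, and the quasilinear best response are illustrative assumptions) in which each agent buys v votes for their preferred alternative at cost |v|^alpha and the alternative with the larger vote total wins:

```python
import numpy as np

def vote_buying_outcome(willingness_to_pay, alpha=2.0):
    """Hypothetical sketch of a binary vote-buying mechanism.

    Each agent i has a signed willingness to pay w_i (> 0 favors A,
    < 0 favors B) and buys v_i votes at cost |v_i|**alpha.  Maximizing
    w_i * v_i - |v_i|**alpha gives |v_i| = (|w_i| / alpha)**(1 / (alpha - 1))
    for alpha > 1.  The alternative with the larger vote total wins.
    """
    w = np.asarray(willingness_to_pay, dtype=float)
    votes = np.sign(w) * (np.abs(w) / alpha) ** (1.0 / (alpha - 1.0))
    return ("A" if votes.sum() > 0 else "B"), votes

# A mild majority for B (60 agents) against an intense minority for A (40 agents).
winner, votes = vote_buying_outcome([-1.0] * 60 + [10.0] * 40, alpha=2.0)
print(winner, votes.sum())
```

With alpha = 2 this is quadratic voting and aggregate willingness to pay decides; as alpha grows, individual purchases flatten toward one vote each, so the supporters' count dominates. Under the power cost the elasticity near the origin is exactly alpha, which matches the comparative statics stated in the abstract.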

2019, Vol. 20 (01), pp. 1950014
Author(s): Noam Greenberg, Joseph S. Miller, André Nies

We study the sets that are computable from both halves of some (Martin–Löf) random sequence, which we call 1/2-bases. We show that the collection of such sets forms an ideal in the Turing degrees that is generated by its c.e. elements. It is a proper subideal of the K-trivial sets. We characterize 1/2-bases as the sets computable from both halves of Chaitin's Ω, and as the sets that obey the cost function c(x, s) = √(Ω_s − Ω_x). Generalizing these results yields a dense hierarchy of subideals in the K-trivial degrees: for positive integers k < n, let the k/n-bases be the collection of sets that are below any k out of n columns of some random sequence. As before, this is an ideal generated by its c.e. elements and the random sequence in the definition can always be taken to be Ω. Furthermore, the corresponding cost function characterization reveals that the class of k/n-bases is independent of the particular representation of the rational k/n, and that the class of p-bases is properly contained in the class of q-bases for rational numbers p < q. These results are proved using a generalization of the Loomis–Whitney inequality, which bounds the measure of an open set in terms of the measures of its projections. The generality allows us to analyze arbitrary families of orthogonal projections. As it turns out, these do not give us new subideals of the K-trivial sets; we can calculate from the family which class of p-bases it characterizes. We finish by studying the union of the p-bases for p < 1; we prove that this ideal consists of the sets that are robustly computable from some random sequence. This class was previously studied by Hirschfeldt et al. [D. R. Hirschfeldt, C. G. Jockusch, R. Kuyper and P. E. Schupp, Coarse reducibility and algorithmic randomness, J. Symbolic Logic 81(3) (2016) 1028–1046], who showed that it is a proper subclass of the K-trivial sets. We prove that all such sets are robustly computable from Ω, and that they form a proper subideal of the sets computable from every (weakly) LR-hard random sequence. We also show that the ideal cannot be characterized by a cost function, giving the first such example of a Σ^0_3 subideal of the K-trivial sets.
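
For orientation on the "obeys a cost function" terminology used above (this is the standard definition from the algorithmic randomness literature, not a formula taken from the paper): a Δ^0_2 set A obeys a cost function c if some computable approximation (A_s) of A incurs only a finite total cost of changes,

```latex
\sum_{s>0} c(x_s, s) < \infty,
\qquad\text{where } x_s \text{ is the least } x < s \text{ with } A_s(x) \neq A_{s-1}(x)
\text{ (summing only over stages at which such an } x \text{ exists).}
```

For comparison, K-triviality itself is equivalent to obeying the standard cost function c_K(x, s) = Σ_{x<w≤s} 2^{-K_s(w)}, where K_s is the stage-s approximation of prefix-free Kolmogorov complexity.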


2020, Vol. 34 (02), pp. 2079-2086
Author(s): David Kempe

Distortion-based analysis has established itself as a fruitful framework for comparing voting mechanisms. In this framework, m voters and n candidates are jointly embedded in an (unknown) metric space, and the voters submit rankings of candidates by non-decreasing distance from themselves. Based on the submitted rankings, the social choice rule chooses a winning candidate; the quality of the winner is the sum of the (unknown) distances to the voters. The rule's choice will in general be suboptimal, and the worst-case ratio between the cost of its chosen candidate and that of the optimal candidate is called the rule's distortion. Prior work showed that every deterministic rule has distortion at least 3, while the Copeland rule and related rules guarantee distortion at most 5; a very recent result gave a rule with distortion 2 + √5 ≈ 4.236. We provide a framework based on LP duality and flow interpretations of the dual which gives a simpler and more unified way of proving upper bounds on the distortion of social choice rules. We illustrate the utility of this approach with three examples. First, we show that the Ranked Pairs and Schulze rules have distortion Θ(√n). Second, we give a fairly simple proof of a strong generalization of the upper bound of 5 on the distortion of Copeland, to social choice rules with short paths from the winning candidate to the optimal candidate in generalized weak preference graphs. A special case of this result recovers the recent 2 + √5 guarantee. Finally, our framework naturally suggests a combinatorial rule that is a strong candidate for achieving distortion 3, which had also been proposed in recent work. We prove that the distortion bound of 3 would follow from any of three combinatorial conjectures we formulate.
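
The cost and distortion notions above translate directly into code (a minimal sketch with hypothetical names; the social choice rule itself, which sees only the rankings and not the distances, is left as a parameter):

```python
import numpy as np

def candidate_costs(dist):
    """dist[v][c] = metric distance from voter v to candidate c;
    the cost of a candidate is its summed distance to all voters."""
    return np.asarray(dist, dtype=float).sum(axis=0)

def distortion_on_instance(dist, chosen):
    """Ratio of the chosen candidate's cost to the optimal cost on one
    metric instance; a rule's distortion is the worst case of this ratio
    over all metrics consistent with the submitted rankings."""
    costs = candidate_costs(dist)
    return costs[chosen] / costs.min()

# Example: 3 voters, 2 candidates.
dist = [[1.0, 2.0],
        [1.0, 2.0],
        [4.0, 1.0]]
print(distortion_on_instance(dist, chosen=0))  # 6.0 / 5.0 = 1.2
```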


Author(s): Pham Thi Thu Ha, Phan Dieu Huong

Underground power grid projects in Hanoi are so urgent that they require immediate implementation. To implement the underground power grid projects synchronously and quickly, those in charge should not follow the outdated perspective of involving only the power industry, but should also call for support and cost-sharing from consumers. This paper approaches the subject from both the producers' and the consumers' perspectives, so that the cost of putting the power grid underground can be shared, not only in Hanoi but in other large cities in Vietnam as well. A field study of 104 households in Hoan Kiem District, Hanoi, combined with a cost-benefit analysis (CBA), was used to investigate consumers' willingness to pay (WTP) to share the cost of the underground power grid projects with the power industry. The results show that the cost of the underground power grid in Hoan Kiem District ranges from 30,000 VND/household/month to 46,000 VND/household/month. In contrast, the willingness to pay of a typical four-person household in Hoan Kiem District ranges from 17,000 VND/month to 24,000 VND/month, with annual payment on a detailed timeline being the most favored payment method.
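
A minimal arithmetic sketch of the gap implied by these figures (the ranges are taken from the abstract; comparing midpoints is an illustrative simplification, not the paper's CBA procedure):

```python
# Monthly cost per household vs. stated willingness to pay, in VND.
cost_range = (30_000, 46_000)
wtp_range = (17_000, 24_000)

cost_mid = sum(cost_range) / 2   # 38,000 VND/household/month
wtp_mid = sum(wtp_range) / 2     # 20,500 VND/household/month

# Rough share of the undergrounding cost a typical household would cover.
print(f"WTP covers about {wtp_mid / cost_mid:.0%} of the estimated cost")
```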


SLEEP, 2021, Vol. 44 (Supplement_2), pp. A177-A177
Author(s): Jaejin An, Dennis Hwang, Jiaxiao Shi, Amy Sawyer, Aiyu Chen, ...

Abstract
Introduction: Trial-based tele-obstructive sleep apnea (OSA) cost-effectiveness analyses have often been inconclusive due to small sample sizes and short follow-up. In this study, we report the cost-effectiveness of Tele-OSA using a larger sample from a 3-month trial that was augmented with 2.75 additional years of epidemiologic follow-up.
Methods: The Tele-OSA study was a 3-month randomized trial conducted in Kaiser Permanente Southern California that demonstrated improved adherence in patients receiving automated feedback messaging regarding their positive airway pressure (PAP) use compared to usual care. At the end of the 3 months, participants in the intervention group pseudo-randomly either stopped or continued receiving messaging. This analysis included the participants who had moderate-severe OSA (Apnea Hypopnea Index >= 15) and compared the cost-effectiveness of 3 groups: 1) no messaging, 2) messaging for 3 months only, and 3) messaging for 3 years. Costs were derived by multiplying medical service use from electronic medical records by costs from Federal fee schedules. Effects were average nightly hours of PAP use. We report the incremental cost per incremental hour of PAP use as well as the fraction acceptable (the probability of being cost-effective at a given willingness-to-pay).
Results: We included 256 patients with moderate-severe OSA (Group 1, n=132; Group 2, n=79; Group 3, n=45). Group 2, which received the intervention for 3 months only, had the highest costs and fewest hours of use and was dominated by the other two groups. Average 1-year costs for Groups 1 and 3 were $6035 (SE, $477) and $6154 (SE, $575), respectively; average nightly hours of PAP use were 3.07 (SE, 0.23) and 4.09 (SE, 0.42). Compared to no messaging, messaging for 3 years had an incremental cost ($119, p=0.86) per incremental hour of use (1.02, p=0.03) of $117. For a willingness-to-pay (WTP) of $500 per year ($1.37/night), 3-year messaging has a 70% chance of being acceptable.
Conclusion: Long-term Tele-OSA messaging was more effective than no messaging for PAP use outcomes and was also highly likely to be cost-effective at an acceptable willingness-to-pay threshold. Epidemiologic evidence suggests that this greater use will yield both clinical and additional economic benefits.
Support (if any): The Tele-OSA study was supported by AASM Foundation SRA Grant #104-SR-13.
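
The incremental cost-effectiveness arithmetic in the Results can be reproduced from the reported point estimates (a sketch only; the uncertainty analysis behind the 70% acceptability figure is not reproduced here):

```python
# Point estimates from the abstract: messaging for 3 years vs. no messaging.
incremental_cost = 119.0    # dollars per year
incremental_hours = 1.02    # additional nightly hours of PAP use

icer = incremental_cost / incremental_hours
print(f"ICER: ${icer:.0f} per incremental nightly hour of PAP use")  # ~$117

wtp_per_year = 500.0        # willingness-to-pay threshold used in the abstract
print("acceptable at this WTP" if icer <= wtp_per_year else "not acceptable")
```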


2021, Vol. 11 (2), pp. 850
Author(s): Dokkyun Yi, Sangmin Ji, Jieun Park

Artificial intelligence (AI) is achieved by optimizing a cost function constructed from learning data. Changing the parameters of the cost function is the AI learning process (AI learning, for convenience). If AI learning is performed well, the value of the cost function reaches its global minimum. For the learning to be well completed, the parameter should no longer change once the cost function attains its global minimum. One useful optimization method is the momentum method; however, the momentum method has difficulty stopping the parameter update when the value of the cost function reaches the global minimum (the non-stop problem). The proposed method is based on the momentum method. In order to solve the non-stop problem of the momentum method, we incorporate the value of the cost function into the update rule. Therefore, as learning proceeds, the mechanism in our method reduces the amount of change in the parameter according to the value of the cost function. We verify that the method learns well through a proof of convergence and through numerical experiments comparing it with existing methods.
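
The abstract does not state the exact update rule, but the mechanism it describes, shrinking the parameter change according to the current value of the cost function, can be sketched as follows (a hypothetical illustration only: the multiplicative scaling by f(theta), the hyperparameter values, and the quadratic test function are assumptions, and the sketch presumes a non-negative cost whose global minimum value is zero):

```python
import numpy as np

def cost_scaled_momentum(grad, f, theta0, lr=0.01, beta=0.9, steps=500):
    """Momentum-style descent in which the parameter update is additionally
    scaled by the current cost value f(theta).  When f is non-negative and
    approaches 0 at the global minimum, the scaling shrinks the step there,
    which illustrates the 'stopping' effect described in the abstract."""
    theta = np.asarray(theta0, dtype=float)
    velocity = np.zeros_like(theta)
    for _ in range(steps):
        velocity = beta * velocity + lr * grad(theta)
        theta = theta - f(theta) * velocity  # damp the step by the cost value
        # classical momentum would instead use: theta = theta - velocity
    return theta

# Toy quadratic cost with global minimum value 0 at theta = (1, -2).
target = np.array([1.0, -2.0])
f = lambda th: 0.5 * np.sum((th - target) ** 2)
grad = lambda th: th - target
print(cost_scaled_momentum(grad, f, theta0=[5.0, 5.0]))  # approaches (1, -2)
```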


2020, Vol. 18 (02), pp. 2050006
Author(s): Alexsandro Oliveira Alexandrino, Carla Negri Lintzmayer, Zanoni Dias

One of the main problems in Computational Biology is to find the evolutionary distance among species. In most approaches, this distance only involves rearrangements, which are mutations that alter large pieces of a species' genome. When genomes are represented as permutations, the problem of transforming one genome into another is equivalent to the problem of Sorting Permutations by Rearrangement Operations. The traditional approach is to consider that any rearrangement is equally likely to happen, and so the goal is to find a minimum-length sequence of operations that sorts the permutation. However, studies have shown that some rearrangements are more likely to happen than others, so a weighted approach is more realistic. In a weighted approach, the goal is to find a sequence that sorts the permutation such that the cost of that sequence is minimum. This work introduces a new type of cost function, which is related to the amount of fragmentation caused by a rearrangement. We present lower and upper bounds for the fragmentation-weighted problems and relate the unweighted and the fragmentation-weighted approaches. Our main results are 2-approximation algorithms for five versions of this problem involving reversals and transpositions. We also give bounds for the diameters of these problems and provide an improved approximation factor for simple permutations considering transpositions.
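
To make the objects concrete, the sketch below applies a reversal to a permutation and counts breakpoints as an illustrative proxy for how fragmented the permutation is (the fragmentation-weighted cost actually studied in the paper is defined there and may differ):

```python
def reversal(perm, i, j):
    """Return the permutation obtained by reversing the segment perm[i..j]."""
    return perm[:i] + perm[i:j + 1][::-1] + perm[j + 1:]

def breakpoints(perm):
    """Count adjacent pairs that are not consecutive integers, with
    sentinels 0 and n+1 at the ends (a common measure of disorder)."""
    ext = [0] + list(perm) + [len(perm) + 1]
    return sum(1 for a, b in zip(ext, ext[1:]) if b - a != 1)

perm = [3, 1, 2, 5, 4]
print(breakpoints(perm))             # 5 breakpoints before
rev = reversal(perm, 3, 4)           # reverse the segment (5, 4)
print(rev, breakpoints(rev))         # [3, 1, 2, 4, 5] has 3 breakpoints
```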


2005, Vol. 133 (6), pp. 1710-1726
Author(s): Milija Zupanski

Abstract
A new ensemble-based data assimilation method, named the maximum likelihood ensemble filter (MLEF), is presented. The analysis solution maximizes the likelihood of the posterior probability distribution, obtained by minimization of a cost function that depends on a general nonlinear observation operator. The MLEF belongs to the class of deterministic ensemble filters, since no perturbed observations are employed. As in variational and ensemble data assimilation methods, the cost function is derived using a Gaussian probability density function framework. Like other ensemble data assimilation algorithms, the MLEF produces an estimate of the analysis uncertainty (e.g., analysis error covariance). In addition to the common use of ensembles in the calculation of the forecast error covariance, the ensembles in the MLEF are exploited to efficiently calculate the Hessian preconditioning and the gradient of the cost function. Because of the superior Hessian preconditioning, 2–3 iterative minimization steps are sufficient. The MLEF method is well suited for use with highly nonlinear observation operators, at a small additional computational cost for the minimization. The consistent treatment of nonlinear observation operators through optimization is an advantage of the MLEF over other ensemble data assimilation algorithms. The cost of the MLEF is comparable to the cost of existing ensemble Kalman filter algorithms. The method is directly applicable to most complex forecast models and observation operators. In this paper, the MLEF method is applied to data assimilation with the one-dimensional Korteweg–de Vries–Burgers equation. The tested observation operator is quadratic, in order to make the assimilation problem more challenging. The results illustrate the stability of the MLEF performance, as well as the benefit of the cost function minimization. The improvement is noted in terms of the rms error, as well as the analysis error covariance. The statistics of innovation vectors (observation minus forecast) also indicate a stable performance of the MLEF algorithm. Additional experiments suggest the amplified benefit of targeted observations in ensemble data assimilation.
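
The cost function minimized in this framework has the standard variational form shown below (generic notation, stated here for orientation: background/forecast state x_b, forecast error covariance P_f, observations y, nonlinear observation operator H, observation error covariance R; the ensemble-spanned change of variable used by the MLEF for preconditioning is described in the paper and is not reproduced here):

```latex
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\,\mathbf{P}_f^{-1}\,(\mathbf{x}-\mathbf{x}_b)
              + \tfrac{1}{2}\,\bigl[\mathbf{y}-H(\mathbf{x})\bigr]^{\mathsf T}\,\mathbf{R}^{-1}\,\bigl[\mathbf{y}-H(\mathbf{x})\bigr]
```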


2013, Vol. 70 (3), pp. 279-312
Author(s): Rosa Camps, Xavier Mora, Laia Saumell
