A smooth social choice method of preference aggregation

Author(s):  
Norman Schofield


Author(s):  
G. Bingham Powell

This article discusses the aggregation and representation of political preferences. It presents the challenge posed by social choice analysis and identifies the conditions for representative democracy and preference aggregation. It also discusses the three major approaches to the comparative study of preference aggregation: multiple issue congruence, vote correspondence, and single-dimensional issue congruence.


Author(s):  
Nicholas R. Miller

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Politics.

Narrowly understood, social choice theory is a specialized branch of applied logic and mathematics that analyzes abstract objects called preference aggregation functions, social welfare functions, and social choice functions. More broadly, social choice theory identifies, analyzes, and evaluates rules that may be used to make collective decisions. So understood, social choice is a subfield of the social sciences that examines what may be called “voting rules” of various sorts. While social choice theory typically assumes a finite set of alternatives over which voter preferences are unrestricted, the spatial model of social choice assumes that policy alternatives can be represented by points in a space of one or more dimensions, and that voters have preferences that are plausibly shaped by this spatial structure.

Social choice theory has considerable relevance for the study of legislative (as well as electoral) institutions. The concepts and tools of social choice theory make possible formal descriptions of legislative institutions such as bicameralism, parliamentary voting procedures, effects of decision rules (e.g., supramajority vs. simple majority rule and executive veto rules), sincere vs. strategic voting by legislators, agenda control, and other parliamentary maneuvers. Spatial models of social choice further enrich this analysis and raise additional questions regarding policy stability and change. Spatial models are used increasingly to guide empirical research on legislative institutions and processes.
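To make the spatial intuition concrete, here is a minimal sketch of the one-dimensional case (the ideal points are hypothetical; the median-voter logic it illustrates is standard, not specific to this article):

```python
# One-dimensional spatial model: each voter has an ideal point on a
# policy line and prefers alternatives closer to it (single-peaked
# preferences). Under pairwise majority rule, the median ideal point
# defeats every other alternative (the median voter result).

def majority_winner(a: float, b: float, ideals: list[float]) -> float:
    """Return whichever of a, b a majority prefers (ties go to a)."""
    votes_a = sum(1 for x in ideals if abs(x - a) < abs(x - b))
    votes_b = sum(1 for x in ideals if abs(x - b) < abs(x - a))
    return a if votes_a >= votes_b else b

ideals = [0.1, 0.35, 0.5, 0.7, 0.9]       # hypothetical voter ideal points
median = sorted(ideals)[len(ideals) // 2]

# The median beats any challenger in a pairwise majority vote.
for challenger in [0.0, 0.25, 0.6, 1.0]:
    assert majority_winner(median, challenger, ideals) == median
print(f"median ideal point {median} beats every sampled challenger")
```

In two or more dimensions this stability generally disappears, which is what makes the questions about policy stability and agenda control raised in the abstract above so pointed.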


Politik ◽  
2012 ◽  
Vol 15 (2) ◽  
Author(s):  
Malthe Munkøe

Social choice research has shown that collective preference-aggregation mechanisms will, under some conditions, produce arbitrary results, and are prone to endless cycles or strategic manipulation. This prompted Tullock (1981) to ask the question “Why so much stability?” That is to say, what explains the discrepancy between these results, which imply that politics is chaotic and random, and the general understanding of how politics works in practice? The literature has identified a number of mechanisms, including “structure-inducing” institutions that have a stabilizing effect on the political system. As such, it is ultimately an empirical question to what extent a political system is stable or not, and what institutions, norms, and arrangements engender stability. This article considers the Danish political system from the point of view of social choice theory and discusses which institutions and arrangements work to stabilize it.
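The instability result behind Tullock's question can be seen in the classic Condorcet cycle; here is a minimal sketch with three voters (the profile is the textbook example, not taken from the article):

```python
# Condorcet paradox: three voters with transitive individual rankings
# whose pairwise majority aggregate nevertheless cycles A > B > C > A.
profile = [
    ("A", "B", "C"),   # voter 1: A > B > C
    ("B", "C", "A"),   # voter 2: B > C > A
    ("C", "A", "B"),   # voter 3: C > A > B
]

def majority_prefers(x: str, y: str) -> bool:
    """True if a strict majority of voters rank x above y."""
    return sum(r.index(x) < r.index(y) for r in profile) > len(profile) / 2

for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
    assert majority_prefers(x, y)
print("majority rule cycles: A beats B, B beats C, C beats A")
```

With such a cycle, whatever alternative is chosen, some majority prefers another, so the outcome can hinge on the agenda; this is the “chaos” that structure-inducing institutions are thought to tame.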


2016 ◽  
Vol 32 (2) ◽  
pp. 283-321 ◽  
Author(s):  
Matthew D. Adler

Abstract: Preference-aggregation problems arise in various contexts. One such context, little explored by social choice theorists, is metaethical. ‘Ideal-advisor’ accounts, which have played a major role in metaethics, propose that moral facts are constituted by the idealized preferences of a community of advisors. Such accounts give rise to a preference-aggregation problem: namely, aggregating the advisors’ moral preferences. Do we have reason to believe that the advisors, albeit idealized, can still diverge in their rankings of a given set of alternatives? If so, what are the moral facts (in particular, the comparative moral goodness of the alternatives) when the advisors do diverge? These questions are investigated here using the tools of Arrovian social choice theory.


Episteme ◽  
2012 ◽  
Vol 9 (2) ◽  
pp. 115-137 ◽  
Author(s):  
Marcus Pivato

Abstract: We briefly review Condorcet's and Young's epistemic interpretations of preference aggregation rules as maximum likelihood estimators. We then develop a general framework for interpreting epistemic social choice rules as maximum likelihood estimators, maximum a posteriori estimators, or expected utility maximizers. We illustrate this framework with several examples. Finally, we critique this program.
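As a concrete instance of the maximum-likelihood interpretation, here is a sketch of Young's reading of Condorcet: if each voter judges each pair of candidates correctly with some fixed probability p > 1/2, independently, then the likelihood of a hypothesized true ranking increases with its total pairwise agreement with the ballots, so the maximum-likelihood ranking is the Kemeny ranking. The ballot profile below is illustrative, not from the article:

```python
from itertools import permutations

# Score each hypothetical "true" ranking by its total pairwise agreement
# with the ballots; under Condorcet's noise model the likelihood is
# monotone in this score, so the argmax is the maximum-likelihood
# (Kemeny) ranking.

ballots = [
    ("A", "B", "C"),
    ("A", "C", "B"),
    ("B", "A", "C"),
]

def agreement(truth, ballot):
    """Number of candidate pairs on which ballot agrees with truth."""
    pairs = [(x, y) for i, x in enumerate(truth) for y in truth[i + 1:]]
    return sum(ballot.index(x) < ballot.index(y) for x, y in pairs)

scores = {r: sum(agreement(r, b) for b in ballots)
          for r in permutations("ABC")}
mle = max(scores, key=scores.get)
print("maximum-likelihood (Kemeny) ranking:", " > ".join(mle))
```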


Games ◽  
2021 ◽  
Vol 12 (4) ◽  
pp. 94 ◽
Author(s):  
Alexander Mayer ◽  
Stefan Napel

Weighted committees allow shareholders, party leaders, etc. to wield different numbers of votes or voting weights as they decide between multiple candidates by a given social choice method. We consider committees that apply scoring methods such as plurality, Borda, or antiplurality rule. Many different weights induce the same mapping from committee members’ preferences to winning candidates. The numbers of respective weight equivalence classes, and hence of structurally distinct plurality committees, Borda committees, etc., differ widely. There are 6, 51, and 5 plurality, Borda, and antiplurality committees, respectively, if three players choose between three candidates, and up to 163 (229) committees for scoring rules in between plurality and Borda (Borda and antiplurality). A key implication is that plurality, Borda, and antiplurality rule are much less sensitive to weight changes than other scoring rules. We illustrate the geometry of weight equivalence classes, with a map of all Borda classes, and identify minimal integer representations.
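To make the notion of weight equivalence concrete, here is a brute-force sketch for the plurality case (alphabetic tie-breaking is our assumption; the paper's exact counting convention may differ, so the class count printed here need not match its figures):

```python
from itertools import product

# Two weight vectors are "equivalent" for weighted plurality if they
# induce the same winner on every profile of the players' top choices.
# Group a sample of small positive integer weight vectors by the
# mapping they induce.

CANDIDATES = "abc"

def plurality_map(weights):
    """Winner for every profile of top choices, as a hashable signature."""
    winners = []
    for profile in product(CANDIDATES, repeat=len(weights)):
        score = {c: 0 for c in CANDIDATES}
        for w, choice in zip(weights, profile):
            score[choice] += w
        winners.append(max(CANDIDATES, key=score.get))  # alphabetic tie-break
    return tuple(winners)

classes = {}
for w in product(range(1, 5), repeat=3):     # sample of weight vectors
    classes.setdefault(plurality_map(w), []).append(w)

print(len(classes), "distinct plurality committees in this sample")
for members in classes.values():
    print("smallest representative:", min(members))
```

The same grouping idea extends to Borda or antiplurality by replacing the top-choice scoring step with the appropriate score vector.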


Author(s):  
R. Packwood ◽  
M.W. Phaneuf ◽  
V. Weatherall ◽  
I. Bassignana

The development of specialized analytical instruments such as SIMS, XPS, ISS, etc., all with truly incredible abilities in certain areas, has given rise to the notion that electron probe microanalysis (EPMA) is an old-fashioned and rather inadequate technique, and one that is of little or no use in such high-technology fields as the semiconductor industry. Whilst it is true that the microprobe does not possess parts-per-billion (ppb) sensitivity or monolayer depth resolution, it is also true that many times these extremes of performance are not essential, and that a few tens of parts-per-million (ppm) and a few tens of nanometers depth resolution are all that is required. In fact, the microprobe may well be the second-choice method for a wide range of analytical problems and even the method of choice for a few.

The literature is replete with remarks that suggest the writer is confusing an SEM-EDXS combination with an instrument such as the Cameca SX-50. Even where this confusion does not exist, the literature discusses microprobe detection limits that are seldom stated to be as low as 100 ppm, whereas there are numerous element combinations for which 10-20 ppm is routinely attainable.

