An impossibility theorem for matching problems

2010 ◽  
Vol 35 (2) ◽  
pp. 245-266 ◽  
Author(s):  
Shohei Takagi ◽  
Shigehiro Serizawa

2020 ◽  
Vol 19 (12) ◽  
pp. 2358-2371
Author(s):  
S.A. Moskal'onov

Subject. The article addresses the history of development of, and provides a critique of, existing criteria for aggregate social welfare, using the case of a simple exchange economy (the Edgeworth box). Objectives. The purpose is to develop a unified classification of criteria for assessing aggregate social welfare. Methods. The study draws on methods of logical and mathematical analysis. Results. The paper considers strong, strict, and weak versions of the Pareto, Kaldor, Hicks, Scitovsky, and Samuelson criteria, introduces the notion of equivalence, and constructs the corresponding Pareto, Kaldor, Hicks, Scitovsky, and Samuelson orderings. The Pareto and Samuelson criteria are transitive but not complete. The Kaldor, Hicks, and Scitovsky criteria are not transitive in the general case. Conclusions. The lack of an ideal social welfare criterion is a consequence of Arrow's Impossibility Theorem and of the broader group of impossibility theorems in economics. New approaches to the assessment of aggregate welfare need to be developed.
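The weak Pareto criterion discussed in the Results can be sketched in a few lines. This is a minimal illustration under our own simplifying assumption that each allocation is represented as a tuple of agent utilities; the function name is ours, not the article's:

```python
def pareto_improves(x, y):
    """Weak Pareto criterion: allocation x dominates allocation y if every
    agent is at least as well off under x and at least one agent is
    strictly better off. x and y are tuples of agent utilities."""
    return all(a >= b for a, b in zip(x, y)) and any(a > b for a, b in zip(x, y))

# Transitive but not complete: neither of the last two allocations dominates the other.
print(pareto_improves((2, 3), (1, 3)))  # True
print(pareto_improves((2, 1), (1, 3)))  # False (incomparable)
print(pareto_improves((1, 3), (2, 1)))  # False (incomparable)
```

The incomparable pair illustrates why the Pareto ordering, while transitive, is not complete.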


Author(s):  
Adam M. Brandenburger ◽  
H. Jerome Keisler

Entropy ◽  
2020 ◽  
Vol 23 (1) ◽  
pp. 31
Author(s):  
Ivan Markić ◽  
Maja Štula ◽  
Marija Zorić ◽  
Darko Stipaničev

The string-matching paradigm is applied in every branch of computer science and of science in general. The existence of a plethora of string-matching algorithms makes it hard to choose the best one for any particular case. Expressing, measuring, and testing algorithm efficiency is a challenging task with many potential pitfalls. Algorithm efficiency can be measured based on the usage of different resources. In software engineering, algorithmic productivity is a property of an algorithm's execution, identified with the computational resources the algorithm consumes. Resource usage during algorithm execution can be measured, and for maximum efficiency the goal is to minimize it. We are guided by the fact that standard measures of algorithm efficiency, such as execution time, depend directly on the number of executed actions. Without addressing computer power consumption or memory usage, which also depend on the algorithm type and the techniques used in its development, we have developed a methodology that enables researchers to choose an efficient algorithm for a specific domain. The efficiency of string-searching algorithms is usually evaluated independently of the domain texts being searched. This paper presents the idea that algorithm efficiency depends on the properties of the searched pattern and of the texts being searched, accompanied by a theoretical analysis of the proposed approach. In the proposed methodology, algorithm efficiency is expressed through a character-comparison-count metric. The character comparison count is a formal quantitative measure, independent of algorithm implementation subtleties and computer platform differences. The model is developed for a particular problem domain using appropriate domain data (patterns and texts) and, for that domain, provides a ranking of algorithms according to the patterns' entropy.
The proposed approach is limited to online exact string-matching problems and is based on the information entropy of the search pattern. Meticulous empirical testing illustrates the implementation of the methodology and supports its soundness.
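The two quantities the abstract relies on, a character-comparison count and the entropy of the search pattern, can be sketched as follows. This is a minimal illustration using a naive matcher; the function names and instrumentation are ours, not the paper's:

```python
import math
from collections import Counter

def naive_search(text, pattern):
    """Naive exact string matching, instrumented to count character
    comparisons. Returns (match positions, total comparisons)."""
    comparisons = 0
    matches = []
    n, m = len(text), len(pattern)
    for i in range(n - m + 1):
        for j in range(m):
            comparisons += 1
            if text[i + j] != pattern[j]:
                break
        else:
            matches.append(i)  # full pattern matched at position i
    return matches, comparisons

def shannon_entropy(pattern):
    """Shannon entropy (bits per symbol) of a search pattern."""
    counts = Counter(pattern)
    total = len(pattern)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

matches, comps = naive_search("abracadabra", "abra")
print(matches, comps)                      # [0, 7] 16
print(round(shannon_entropy("abra"), 3))   # 1.5
```

The comparison count is implementation- and platform-independent in the sense the abstract describes: two runs of the same algorithm on the same (text, pattern) pair always produce the same number, regardless of hardware.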


Author(s):  
Alec Sandroni ◽  
Alvaro Sandroni

Arrow (1950) famously showed the impossibility of aggregating individual preference orders into a social preference order (together with basic desiderata). This paper shows that it is possible to aggregate individual choice functions that satisfy almost any condition weaker than WARP into a social choice function that satisfies the same condition (and also Arrow's desiderata).
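The WARP condition this result weakens can be sketched for finite, single-valued choice functions. This is a toy checker of our own construction, not the authors' formalism: a choice function is given as a mapping from menus to the chosen alternative, and WARP fails exactly when two menus reveal opposite preferences between a pair of alternatives:

```python
def satisfies_warp(choice):
    """Check WARP for a single-valued choice function given as a dict
    {frozenset(menu): chosen alternative}. Choosing x from a menu
    containing y reveals x preferred to y; WARP forbids a reversal."""
    revealed = set()
    for menu, chosen in choice.items():
        for y in menu:
            if y != chosen:
                revealed.add((chosen, y))
    return not any((y, x) in revealed for (x, y) in revealed)

consistent = {frozenset({"a", "b"}): "a", frozenset({"a", "b", "c"}): "a"}
inconsistent = {frozenset({"a", "b"}): "a", frozenset({"a", "b", "c"}): "b"}
print(satisfies_warp(consistent))    # True
print(satisfies_warp(inconsistent))  # False: a over b, then b over a
```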


2013 ◽  
Vol 103 (2) ◽  
pp. 585-623 ◽  
Author(s):  
Eric Budish ◽  
Yeon-Koo Che ◽  
Fuhito Kojima ◽  
Paul Milgrom

Randomization is commonplace in everyday resource allocation. We generalize the theory of randomized assignment to accommodate multi-unit allocations and various real-world constraints, such as group-specific quotas (“controlled choice”) in school choice and house allocation, and scheduling and curriculum constraints in course allocation. We develop new mechanisms that are ex ante efficient and fair in these environments, and that incorporate certain non-additive substitutable preferences. We also develop a “utility guarantee” technique that limits ex post unfairness in random allocations, supplementing the ex ante fairness promoted by randomization. This can be applied to multi-unit assignment problems and certain two-sided matching problems. (JEL C78, D82)
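As a rough illustration of randomized assignment with capacity constraints, here is a standard random serial dictatorship baseline. This is not one of the paper's mechanisms (which target ex ante efficiency and fairness that serial dictatorship lacks); it only shows the problem shape of multi-unit allocation under quotas:

```python
import random

def random_serial_dictatorship(preferences, capacities, seed=0):
    """Toy baseline: agents in a random order each take their most-preferred
    object that still has remaining capacity.

    preferences: {agent: [objects in preference order]}
    capacities:  {object: number of copies available}
    """
    rng = random.Random(seed)
    order = list(preferences)
    rng.shuffle(order)                 # the random priority order
    remaining = dict(capacities)
    assignment = {}
    for agent in order:
        for obj in preferences[agent]:
            if remaining.get(obj, 0) > 0:
                remaining[obj] -= 1
                assignment[agent] = obj
                break
    return assignment

prefs = {"a1": ["x", "y"], "a2": ["x", "y"], "a3": ["y", "x"]}
caps = {"x": 1, "y": 2}
print(random_serial_dictatorship(prefs, caps))
```

Quotas such as the "controlled choice" constraints mentioned above would enter as additional feasibility checks inside the inner loop.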

