The price of polarization: Estimating task prices under routine-biased technical change

2020 ◽  
Vol 11 (2) ◽  
pp. 761-799 ◽  
Author(s):  
Michael J. Böhm

This paper proposes a new approach to estimating task prices per efficiency unit of skill in the Roy model. I show how the sorting of workers into tasks and their associated wage growth can be used to identify changes in task prices under relatively weak assumptions. The estimation exploits the fact that the returns to observable talents change differentially over time, depending on the price changes of the tasks into which those talents predict workers will sort. In the generalized Roy model, the average non-pecuniary amenities in each task are also identified. I apply this approach to the literature on routine-biased technical change, a key prediction of which is that task prices should polarize. Empirical results for male workers in U.S. data indicate that the relative prices of abstract and manual tasks indeed increased during the 1990s and 2000s.
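As a rough illustration of the setting (the notation here is illustrative, not the paper's own): in a Roy model with abstract, routine, and manual tasks, worker i supplies task-specific efficiency units s_{ij}, task prices pi_{jt} vary over time, and each worker sorts into the task offering the higher potential wage,

    w_{ijt} = \pi_{jt}\, s_{ij}, \qquad j \in \{A, R, M\}, \qquad j^{*}_{it} = \arg\max_{j} \pi_{jt}\, s_{ij}.

A rise in \pi_{At}/\pi_{Rt} both raises the observed wage growth of workers whose talents predict sorting into the abstract task and pulls marginal workers toward it; the identification strategy sketched in the abstract exploits exactly this link between sorting and wage growth.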

2020 ◽  
pp. 1-34
Author(s):  
Michele Battisti ◽  
Gianfranco di Vaio ◽  
Joseph Zeira

Recent versions of the Penn World Tables include new data that make it possible to calculate total factor productivity, in addition to output, for a large set of countries. We use these new data to examine convergence and divergence across countries with a new approach that differentiates between the dynamics of output and of productivity. Our empirical results make two main contributions to the literature. The first concerns the interpretation of “β-convergence” in growth regressions: it means that output per worker in each country converges to that country's productivity, but it does not imply convergence across countries, since productivity tends to diverge from the global frontier. The second contribution is to the literature that attributes income gaps across countries mainly to differential technology adoption: this paper shows that the gaps in technology are not only large but keep growing over time.
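For reference, a growth regression of the kind the abstract refers to typically takes the form (illustrative notation):

    \Delta \ln y_{it} = \alpha + \beta \ln y_{i,t-1} + \gamma' X_{it} + \varepsilon_{it},

with an estimated \beta < 0 read as “β-convergence”. The paper's point is that a negative \beta captures each country's output per worker converging toward its own productivity level; by itself it does not imply that countries converge to one another, and it is consistent with productivity diverging from the global frontier.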


Author(s):  
William W. Franko ◽  
Christopher Witko

The authors conclude the book by recapping their arguments and empirical results, and by discussing the possibilities for the “new economic populism” to promote egalitarian economic outcomes in the face of continuing gridlock and the dominance of Washington, DC’s policymaking institutions by business, the wealthy, and a conservative Republican Party. Many states are already addressing inequality, and these policies are working. Admittedly, many states also continue to embrace policies that have contributed to growing inequality, such as cutting taxes for the wealthy or attempting to weaken labor unions. But as the public grows more concerned about inequality, the authors argue, policies that help to address these income disparities will become more popular, and policies that exacerbate inequality will become less so. Over time, if history is a guide, more egalitarian policies will spread across the states, and ultimately to the federal government.


2008 ◽  
Vol 37 (4) ◽  
pp. 597-620 ◽  
Author(s):  
MARK TOMLINSON ◽  
ROBERT WALKER ◽  
GLENN WILLIAMS

While poverty is widely accepted to be an inherently multi-dimensional concept, it has proved very difficult to develop measures that both capture this multi-dimensionality and facilitate comparison of trends over time. Structural equation modelling appears to offer a solution to this conundrum and is applied here to the British Household Panel Survey to create a multi-dimensional measure of poverty. The analysis reveals that the decline in poverty in Britain between 1991 and 2003 was driven by falls in material deprivation, but more especially by reduced financial stress, particularly during the early 1990s. The limitations and potential of the new approach are critically discussed.
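A minimal sketch of the measurement side of such a model (notation illustrative, not the authors' own): each latent dimension of poverty \eta (for example, material deprivation or financial stress) is measured by several observed indicators x_k,

    x_{ik} = \lambda_k \eta_i + \epsilon_{ik},

with the latent dimensions allowed to covary. Estimating the same measurement model in each wave of the panel yields latent poverty scores on a common scale, which is what makes comparison of trends over time possible.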


2009 ◽  
pp. 89-113
Author(s):  
Fabrizio Traù

The paper discusses the logic behind sixty years of industrial policy in Italy. It is argued that during this time State intervention has been characterised by the issuing of an increasing number of laws (mostly persisting over time) devoted to specific objectives, but at the same time paralleled by a tendency towards the reduction of their selectivity through the widening (i.e. the loosening) of the boundaries of the universe of firms they were intended for. Such a logic seems to have given way in recent years to a relatively new approach, as stated in the programme "Industria 2015", which has put centre stage the need to limit State aid to a selected group of (horizontally identified) industrial activities. The paper also discusses some apparent shortcomings of this approach, emphasising that the risk of a renewed weakening of its selective logic remains.


1996 ◽  
Vol 6 (3) ◽  
pp. 327-358
Author(s):  
LaRue Tone Hosmer

The first issue of Business Ethics Quarterly appeared five years ago. This article classifies the content of the 141 articles that have appeared since that time along 18 dimensions, and 118 categories within those dimensions, to determine trends within the discipline. The major trend appears to be a shift in focus towards increased discussion of a new approach/paradigm for the field, and towards a normative/descriptive interface in the theory. The major problem seems to be a lack of explicit conceptual definition and of early empirical effort to support that new focus, which may thus prove unsustainable over time.


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Peng Su ◽  
Dan Zhu ◽  
Daniel Zeng

Knowledge is considered actionable if users can take direct actions based on it to their advantage. Among the most important and distinctive forms of actionable knowledge are actionable behavioral rules, which directly and explicitly suggest specific actions to take to influence (restrain or encourage) behavior in the users’ best interest. In mining such rules, however, it often happens that different rules suggest the same actions with different expected utilities; we call these conflicting rules. A previously proposed method can resolve such conflicts, but the inconsistency of its rule-evaluation measure may hinder its performance. To overcome this problem, we develop a new method that uses a rule-ranking procedure to select the rule with the highest utility prediction accuracy. More specifically, we propose an integrative measure, combining support and antecedent length, to evaluate the utility prediction accuracy of conflicting rules. We also introduce a tunable weight parameter to allow flexible integration of the two. We conduct several experiments to test the proposed approach and to evaluate the sensitivity of the weight parameter. Empirical results indicate that our approach outperforms those from previous research.
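A minimal sketch of the kind of conflict resolution described, assuming illustrative names for the rule attributes and the weight parameter (none of these are the paper's exact definitions):

    from dataclasses import dataclass

    @dataclass
    class Rule:
        support: float          # e.g., fraction of records the rule covers
        antecedent_length: int  # number of conditions in the rule body
        expected_utility: float # utility predicted if the suggested action is taken

    def integrative_score(rule: Rule, w: float, max_len: int) -> float:
        """Blend support with normalized antecedent length using a tunable weight w in [0, 1]."""
        length_term = rule.antecedent_length / max_len if max_len else 0.0
        return w * rule.support + (1.0 - w) * length_term

    def resolve_conflict(conflicting: list[Rule], w: float = 0.5) -> Rule:
        """Among rules suggesting the same action with different expected utilities,
        keep the one the integrative score ranks as the most reliable predictor."""
        max_len = max(r.antecedent_length for r in conflicting)
        return max(conflicting, key=lambda r: integrative_score(r, w, max_len))

Setting w closer to 1 favors well-supported rules, while setting it closer to 0 favors more specific rules with longer antecedents.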


Author(s):  
Francesco Caselli

This chapter concludes that the book has presented evidence showing that technology and technical change are more flexible than is generally allowed. The efficiency of different factors changes across countries and over time at different rates. Indeed, in some instances the efficiency with which one factor is used can decline while the efficiency of others increases. Since the 1990s, it has been increasingly clear that technical change tends to have a skill bias, but this book's findings reveal that nonneutralities are much more pervasive than that. They also occur across countries, and not just over time. Furthermore, they involve a broader set of inputs: not only skilled and unskilled labor, but also experienced and inexperienced workers, natural and reproducible capital, and a broad labor aggregate and a broad capital aggregate. The book has merely scratched the surface of the likely patterns of nonneutrality that exist across countries and over time.
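In production-function terms, the nonneutrality the chapter describes can be written with factor-specific efficiency terms (an illustrative two-factor case, not the book's full specification):

    Y = F(A_K K,\; A_L L),

where the efficiencies A_K and A_L can differ across countries and grow at different rates over time, and one can fall while the other rises; skill-biased technical change is the special case in which the efficiency of skilled labor rises relative to that of unskilled labor.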


Author(s):  
Tudor Bălănescu ◽  
Radu Nicolescu ◽  
Huiling Wu

In this paper, the authors propose a new approach to fully asynchronous P systems, together with a matching complexity measure, both inspired by the field of distributed algorithms. The authors validate the proposed approach by implementing several well-known distributed depth-first search (DFS) and breadth-first search (BFS) algorithms. Empirical results show that the proposed P algorithms have shorter descriptions and achieve performance comparable to that of the corresponding distributed algorithms.


2021 ◽  
pp. 169-226
Author(s):  
James Woodward

This chapter explores some empirical results bearing on the descriptive and normative adequacy of different accounts of causal learning and representation. It begins by contrasting associative accounts with accounts that attribute additional structure to causal representation, arguing in favor of the latter. Empirical results supporting the claim that adult humans often reason about causal relationships using interventionist counterfactuals are presented. Contrasts between human and nonhuman primate causal cognition are also discussed, as well as some experiments concerning causal cognition in young children. A proposal about what is involved in having adult human causal representations is presented and some issues about how these might develop over time are explored.

