Playing Favorites: How Shared Beliefs Shape the IMF's Lending Decisions

2014 ◽  
Vol 68 (2) ◽  
pp. 297-328 ◽  
Author(s):  
Stephen C. Nelson

Abstract. International organizations (IOs) suffuse world politics, but the International Monetary Fund (IMF) stands out as an unusually important IO. My research suggests that IMF lending is systematically biased. Preferential treatment is largely driven by the degree of similarity between beliefs held by IMF officials and key economic policy-makers in the borrowing country. This article describes the IMF's ideational culture as “neoliberal,” and assumes it to be stable during the observation window (1980–2000). The beliefs of top economic policy-makers in borrowing countries, however, vary in terms of their distance from IMF officials' beliefs. When fellow neoliberals control the top economic policy posts the distance between the means of the policy team's beliefs and the IMF narrows; consequently, IMF loans become less onerous, more generous, and less rigorously enforced. I gathered data on the number of conditions and the relative size of loans for 486 programs in the years between 1980 and 2000. I collected data on waivers, which allow countries that have missed binding conditions to continue to access funds, as an indicator for enforcement. I rely on indirect indicators, gleaned from a new data set that contains biographical details of more than 2,000 policy-makers in ninety developing countries, to construct a measure of the proportion of the top policy officials that are fellow neoliberals. The evidence from a battery of statistical tests reveals that as the proportion of neoliberals in the borrowing government increases, IMF deals get comparatively sweeter.

Author(s):  
Georg Löfflmann

This chapter provides a summary of the book's findings. The chapter argues that the geopolitical vision of a more restrained leadership role and more cautious global engagement that Obama formulated reflected the post-American future rather than the hegemonic past of America's role in world politics. It assesses that the most influential scholars, pundits, and policy-makers, in turn, remained embedded in the Washington consensus of hegemony and mired in a unipolar worldview. The chapter identifies a further fracturing of the grand strategy consensus, between elite opinion and the foreign policy establishment denouncing ‘isolationist’ tendencies, and an American public increasingly in favour of non-interventionism and accepting of a less singular hegemonic role. The chapter briefly reviews how this conflict was also encapsulated in the contest for Obama's succession between Hillary Clinton, a quintessential Washington insider and firm believer in America's role as the world's indispensable nation, and Donald Trump, an anti-establishment populist who had aggressively questioned the elite consensus on US foreign and economic policy.


1982 ◽  
Vol 21 (3) ◽  
pp. 255-257
Author(s):  
Zafar Mahmood

The world in its politico-economic aspects is run by policy-makers who have an academic background in law, public administration, or other related social disciplines, including economics. Only rarely would a majority of the policy-makers be trained in economics. In the making of economic policy, the basic choices before the policy-makers are political, and they transcend the narrow concerns of economists regarding the optimal use of resources. These considerations in no way downgrade the relevance of economic analysis to economic policy-making or of training policy-makers in economics. Policy-makers need economic counsel to understand fully the implications of alternative policy options. In this book, Wolfson attempts to educate policy-makers in the areas of public finance and development strategy. The analysis avoids technicalities and is kept at a simple level to make it understandable to civil servants, law-makers, and members of the executive branch, whom Wolfson refers to as policy-makers. Simplicity of analysis is not the only distinguishing mark of this book. Most other books on public finance address traditional public finance issues relating to both the revenue and expenditure sides of the budget and neglect the overall mix of issues dealing with the interaction of fiscal policy with economic development. Wolfson explicitly deals with these issues in this book.


2014 ◽  
Vol 112 (11) ◽  
pp. 2729-2744 ◽  
Author(s):  
Carlo J. De Luca ◽  
Joshua C. Kline

Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization.
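The z-score criterion that the abstract critiques can be illustrated with a small sketch (hypothetical data, not the authors' SigMax method or their recordings): it flags any cross-correlation histogram bin whose count exceeds the histogram mean by a fixed number of standard deviations, a lenient test that can fire by chance even for two statistically independent firing trains.

```python
import numpy as np

# Illustrative sketch of the z-score approach (NOT SigMax): test whether
# any bin of the cross-correlation histogram of two motor-unit firing
# trains exceeds mean + 1.96 * std.
rng = np.random.default_rng(1)

# Two hypothetical, independent firing trains (spike times in seconds).
train_a = np.sort(rng.uniform(0, 60, 600))
train_b = np.sort(rng.uniform(0, 60, 600))

# Histogram of firing latencies between the trains within +/-100 ms.
lags = (train_b[None, :] - train_a[:, None]).ravel()
lags = lags[np.abs(lags) <= 0.1]
counts, _ = np.histogram(lags, bins=40)

# Flag "synchronization" at any latency bin more than 1.96 standard
# deviations above the mean -- even independent trains can clear this
# bar at some latency, which is the false-detection risk at issue.
z = (counts - counts.mean()) / counts.std()
false_peaks = int(np.sum(z > 1.96))
```

Because roughly 2.5% of bins are expected to exceed the 1.96-sigma threshold by chance alone, testing many latencies without a correction for multiple comparisons tends to produce spurious detections, which is the statistical weakness the paper's method is designed to avoid.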


2009 ◽  
Vol 9 (4) ◽  
pp. 14-40 ◽  
Author(s):  
Frank Biermann ◽  
Philipp Pattberg ◽  
Harro van Asselt ◽  
Fariborz Zelli

Most research on global governance has focused either on theoretical accounts of the overall phenomenon or on empirical studies of distinct institutions that serve to solve particular governance challenges. In this article we analyze instead “governance architectures,” defined as the overarching system of public and private institutions, principles, norms, regulations, decision-making procedures and organizations that are valid or active in a given issue area of world politics. We focus on one aspect that is turning into a major source of concern for scholars and policy-makers alike: the “fragmentation” of governance architectures in important policy domains. The article offers a typology of different degrees of fragmentation, which we describe as synergistic, cooperative, and conflictive fragmentation. We then systematically assess alternative hypotheses over the relative advantages and disadvantages of different degrees of fragmentation. We argue that moderate degrees of fragmentation may entail both significant costs and benefits, while higher degrees of fragmentation are likely to decrease the overall performance of a governance architecture. The article concludes with policy options on how high degrees of fragmentation could be reduced. Fragmentation is prevalent in particular in the current governance of climate change, which we have hence chosen as illustration for our discussion.


2016 ◽  
Vol 16 (24) ◽  
pp. 15545-15559 ◽  
Author(s):  
Ernesto Reyes-Villegas ◽  
David C. Green ◽  
Max Priestman ◽  
Francesco Canonaco ◽  
Hugh Coe ◽  
...  

Abstract. The multilinear engine (ME-2) factorization tool is being widely used following the recent development of the Source Finder (SoFi) interface at the Paul Scherrer Institute. However, the success of this tool, when using the a value approach, largely depends on the inputs (i.e. target profiles) applied as well as the experience of the user. A strategy to explore the solution space is proposed, in which the solution that best describes the organic aerosol (OA) sources is determined according to the systematic application of predefined statistical tests. This includes trilinear regression, which proves to be a useful tool for comparing different ME-2 solutions. Aerosol Chemical Speciation Monitor (ACSM) measurements were carried out at the urban background site of North Kensington, London from March to December 2013, where for the first time the behaviour of OA sources and their possible environmental implications were studied using an ACSM. Five OA sources were identified: biomass burning OA (BBOA), hydrocarbon-like OA (HOA), cooking OA (COA), semivolatile oxygenated OA (SVOOA) and low-volatility oxygenated OA (LVOOA). ME-2 analysis of the seasonal data sets (spring, summer and autumn) showed a higher variability in the OA sources that was not detected in the combined March–December data set; this variability was explored with the triangle plots f44 : f43 and f44 : f60, in which a high variation of SVOOA relative to LVOOA was observed in the f44 : f43 analysis. Hence, it was possible to conclude that, when performing source apportionment on long-term measurements, important information may be lost; such analysis should therefore be done for shorter periods of time, such as seasons. Further analysis of the atmospheric implications of these OA sources identified evidence of a possible contribution of heavy-duty diesel vehicles to air pollution during weekdays, compared with petrol-fuelled vehicles.


2017 ◽  
Vol 49 (3) ◽  
pp. 382-399 ◽  
Author(s):  
JOSHUA BERNING ◽  
ADAM N. RABINOWITZ

Abstract. We examine the relationship of product characteristics of ready-to-eat breakfast cereal and targeted television advertising to specific consumer segments. We compile a unique data set that includes brand-packaging characteristics, including on-box games, nutrition information, and cobranding. We find that the relationship of television advertising and a cereal's brand-packaging characteristics varies by target audience. Our results provide insight into understanding how manufacturers strategically utilize branding, packaging, and television advertising. This can help industry and policy makers develop food product advertising policy. The analysis extends to other product markets where extensive product differentiation and promotion are present.


2008 ◽  
Vol 24 (2) ◽  
pp. 403-432 ◽  
Author(s):  
Nimat Hafez Barazangi

This paper explores the ethical and legal pedagogy of the current debates on “reforming” Muslim societies, whether they claim to reform social and legal systems, reform educational institutions, or liberate Muslim women. Since these debates claim to achieve balance in global or domestic conflicts, I address the foundations of these debates by answering three questions: (1) Are the rationales for American and/or European governments' interventions justified? (2) Can the discipline of civil law help in rethinking Islam for Muslims? (3) Are Muslims themselves ready to critically address the use and misuse of Islam's primary sources (the Qur'an and particularly the Hadith) in their rethinking of Islam? I argue that rather than seeking to “reform others,” in this case Muslims, with an elitist attitude and sometimes violent interventions, we scholars of law and religion, scholars of Islam, policy-makers, and social justice researchers would be better off if: (1) we thought of Islam as a religio-moral rational worldview, rather than a set of laws; (2) we recognized Muslims as subject to historical transformation, like any other religious group, and understood how they developed their present views of Islam; and (3) we considered our own real responsibilities to address the forms of global injustice as powerful shapers of world politics, particularly the politics of difference (the view that the “other” is inferior) and women's role as mostly complementary to men.


Author(s):  
Emery R. Boose ◽  
Barbara S. Lerner

The metadata that describe how scientific data are created and analyzed are typically limited to a general description of data sources, software used, and statistical tests applied and are presented in narrative form in the methods section of a scientific paper or a data set description. Recognizing that such narratives are usually inadequate to support reproduction of the analysis of the original work, a growing number of journals now require that authors also publish their data. However, finer-scale metadata that describe exactly how individual items of data were created and transformed and the processes by which this was done are rarely provided, even though such metadata have great potential to improve data set reliability. This chapter focuses on the detailed process metadata, called “data provenance,” required to ensure reproducibility of analyses and reliable re-use of the data.


2014 ◽  
Vol 14 (13) ◽  
pp. 19747-19789
Author(s):  
F. Tan ◽  
H. S. Lim ◽  
K. Abdullah ◽  
T. L. Yoon ◽  
B. Holben

Abstract. In this study, the optical properties of aerosols in Penang, Malaysia were analyzed for four monsoonal seasons (northeast monsoon, pre-monsoon, southwest monsoon, and post-monsoon) based on data from the AErosol RObotic NETwork (AERONET) from February 2012 to November 2013. The aerosol distribution patterns in Penang for each monsoonal period were quantitatively identified according to scatter plots of the aerosol optical depth (AOD) against the Ångström exponent. A modified algorithm based on the prototype model of Tan et al. (2014a) was proposed to predict the AOD data. Ground-based measurements (i.e., visibility and air pollutant index) were used in the model as predictor data to retrieve the AOD data missing from AERONET because of frequent cloud formation in the equatorial region. The model coefficients were determined through multiple regression analysis using a selected data set of in situ data. The predicted AOD of the model was generated from these coefficients and compared against the measured data through standard statistical tests. The predicted AOD of the proposed model yielded a coefficient of determination R2 of 0.68, and the corresponding percent mean relative error was less than 0.33% compared with the real data. The results revealed that the proposed model efficiently predicted the AOD data. Validation tests performed on the model against selected LIDAR data yielded good correspondence. The predicted AOD can be used to monitor short- and long-term AOD and to provide supplementary information for atmospheric corrections.
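The regression step described above can be sketched as follows, using synthetic visibility and air-pollutant-index values in place of the authors' in situ measurements (the coefficients, sample size, and noise level are illustrative assumptions, not values from the paper):

```python
import numpy as np

# Hypothetical ground-based predictors: visibility (km) and air
# pollutant index (API); AOD is the response to be retrieved.
rng = np.random.default_rng(0)
n = 200
visibility = rng.uniform(5, 20, n)
api = rng.uniform(20, 100, n)
# Synthetic "measured" AOD from an assumed linear relation plus noise.
aod = 0.9 - 0.03 * visibility + 0.004 * api + rng.normal(0, 0.02, n)

# Fit AOD = b0 + b1*visibility + b2*api by ordinary least squares.
X = np.column_stack([np.ones(n), visibility, api])
coeffs, *_ = np.linalg.lstsq(X, aod, rcond=None)
aod_pred = X @ coeffs

# Goodness of fit: coefficient of determination R^2.
ss_res = np.sum((aod - aod_pred) ** 2)
ss_tot = np.sum((aod - aod.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# Percent mean relative error against the "measured" AOD.
mre = 100 * np.mean(np.abs(aod_pred - aod) / aod)
```

Once fitted, the same coefficient vector can be applied to visibility and API readings from cloud-affected days to fill gaps in the AERONET AOD record, which is the role the model plays in the study.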


2015 ◽  
Vol 54 (1) ◽  
pp. 3-34 ◽  
Author(s):  
Michael A. Gottfried

Although educational policy makers uphold that chronic absenteeism (missing 10% or more of the school year) is damaging to students' schooling outcomes, there is little empirical research to match. This study considers the role of spillover effects of chronic absenteeism on classmates' achievement. It does so by utilizing a large-scale administrative urban district data set of elementary schoolchildren, a sample of students where the rates of chronic absenteeism are expected to be higher than the national average. The results show that students suffer academically from having chronically absent classmates, as exhibited across both reading and math testing outcomes. Chronic absenteeism not only has a damaging effect on those individuals missing excessive school days but also has the potential to reduce outcomes for others in the same educational setting.

