Model Uncertainty and Robustness

2016 ◽ Vol 46 (1) ◽ pp. 3-40 ◽ Author(s): Cristobal Young, Katherine Holsteen

Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all combinations of possible controls as well as specified functional form issues, variable definitions, standard error calculations, and estimation commands. This allows analysts to present their core, preferred estimate in the context of a distribution of plausible estimates. Second, we develop a model influence analysis showing how each model ingredient affects the coefficient of interest. This shows which model assumptions, if any, are critical to obtaining an empirical result. We demonstrate the architecture and interpretation of multimodel analysis using data on the union wage premium, gender dynamics in mortgage lending, and tax flight migration among U.S. states. These illustrate how initial results can be strongly robust to alternative model specifications or remarkably dependent on a knife-edge specification.
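The two-step procedure can be sketched in a few lines: estimate the modeling distribution across all combinations of candidate controls, then measure each control's influence on the focal coefficient. This is a minimal illustration on synthetic data with hypothetical variables, not the authors' applied software, and it covers only the control-combination dimension (not functional form, variable definitions, or standard-error choices).

```python
from itertools import combinations
import numpy as np

def multimodel(y, focal, controls):
    """Step 1: estimate the coefficient on `focal` (OLS with intercept)
    across all 2^k combinations of the candidate controls."""
    n, k = controls.shape
    results = []                                       # (subset, estimate) pairs
    for r in range(k + 1):
        for subset in combinations(range(k), r):
            X = np.column_stack([np.ones(n), focal, controls[:, list(subset)]])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            results.append((set(subset), beta[1]))
    return results

def influence(results, k):
    """Step 2: influence of control j = mean estimate over models that
    include j minus mean estimate over models that exclude it."""
    out = []
    for j in range(k):
        inc = [b for s, b in results if j in s]
        exc = [b for s, b in results if j not in s]
        out.append(np.mean(inc) - np.mean(exc))
    return np.array(out)

rng = np.random.default_rng(0)
n = 500
controls = rng.normal(size=(n, 4))                     # 4 candidate controls -> 16 models
focal = rng.normal(size=n) + 0.5 * controls[:, 0]      # focal variable, confounded with c0
y = 2.0 * focal + controls @ np.array([1.0, 0.5, 0.0, 0.0]) + rng.normal(size=n)

results = multimodel(y, focal, controls)
dist = np.array([b for _, b in results])               # modeling distribution of 16 estimates
infl = influence(results, 4)                           # omitting c0 shifts the estimate; c2, c3 barely matter
```

Here the focal coefficient is robust in sign across all 16 models, but the influence analysis flags the confounded control as the critical ingredient.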

1992 ◽ Vol 8 (4) ◽ pp. 452-475 ◽ Author(s): Jeffrey M. Wooldridge

A test for neglected nonlinearities in regression models is proposed. The test is of the Davidson-MacKinnon type against an increasingly rich set of non-nested alternatives and is based on sieve estimation of the alternative model. For the case of a linear parametric model, the test statistic is shown to be asymptotically standard normal under the null, while rejecting with probability approaching one if the linear model is misspecified. A small simulation study suggests that the test has adequate finite-sample properties, but one must guard against overfitting the nonparametric alternative.
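The paper's test is built on sieve estimation of the non-nested alternative; as a much simpler stand-in that illustrates the same idea of testing linearity against an enriched alternative, here is a RESET-style check that augments the linear model with powers of its own fitted values. This is not Wooldridge's statistic, only a sketch of the principle:

```python
import numpy as np

def reset_style_F(y, X, powers=(2, 3)):
    """F statistic for the linear model against an alternative augmented
    with powers of the OLS fitted values (a RESET-style check)."""
    n = len(y)
    X1 = np.column_stack([np.ones(n), X])
    b1, *_ = np.linalg.lstsq(X1, y, rcond=None)
    yhat = X1 @ b1
    ssr_r = np.sum((y - yhat) ** 2)                      # restricted (linear) SSR
    X2 = np.column_stack([X1] + [yhat ** p for p in powers])
    b2, *_ = np.linalg.lstsq(X2, y, rcond=None)
    ssr_u = np.sum((y - X2 @ b2) ** 2)                   # unrestricted SSR
    q = len(powers)
    return ((ssr_r - ssr_u) / q) / (ssr_u / (n - X2.shape[1]))

rng = np.random.default_rng(1)
x = rng.uniform(0, 3, size=300)
f_lin = reset_style_F(x + rng.normal(size=300), x)       # linear truth
f_quad = reset_style_F(x**2 + rng.normal(size=300), x)   # neglected nonlinearity
```

Under a truly linear model the statistic behaves like an F variate; under the quadratic truth it is far out in the tail, so the linear specification is rejected.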


1995 ◽ Vol 24 (2) ◽ pp. 166-173 ◽ Author(s): Jeff E. Brown, Don E. Ethridge

A combination of conceptual analysis and empirical analysis (partial regression and residuals analysis) was used to derive an appropriate functional form for a hedonic price model. These procedures are illustrated in the derivation of a functional-form hedonic model for an automated, econometric daily cotton price reporting system for the Texas-Oklahoma cotton market. Following conceptualization to deduce the general shapes of the relationships, the appropriate specific functional form was found by testing particular attribute transformations identified from partial regression analysis. Minimization of structural errors across attribute levels and estimation accuracy were the criteria used to determine when an appropriate functional form for both implicit and explicit prices had been found.
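Partial regression (added-variable) analysis rests on the Frisch-Waugh-Lovell result: regressing the residuals of price on all other attributes against the residuals of one attribute on those same attributes recovers exactly that attribute's multiple-regression coefficient, which is why the residual scatter reveals the shape of its implicit price. A minimal numerical check, with synthetic data and hypothetical attribute names:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
staple = rng.uniform(25, 36, size=n)          # hypothetical fiber-length attribute
trash = rng.uniform(0, 8, size=n)             # hypothetical trash-content attribute
price = 40 + 1.5 * staple - 0.8 * trash + rng.normal(size=n)

# Partial out everything except `staple` from both price and staple.
X_other = np.column_stack([np.ones(n), trash])
res_y = price - X_other @ np.linalg.lstsq(X_other, price, rcond=None)[0]
res_x = staple - X_other @ np.linalg.lstsq(X_other, staple, rcond=None)[0]
partial_slope = np.linalg.lstsq(res_x[:, None], res_y, rcond=None)[0][0]

# Full multiple regression coefficient on `staple`.
X_full = np.column_stack([np.ones(n), staple, trash])
full_coef = np.linalg.lstsq(X_full, price, rcond=None)[0][1]
# partial_slope and full_coef agree (Frisch-Waugh-Lovell)
```

Plotting `res_y` against `res_x` and looking for curvature is the graphical step the authors use to identify candidate attribute transformations.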


2021 ◽ Vol 13 (11) ◽ pp. 2069 ◽ Author(s): M. V. Alba-Fernández, F. J. Ariza-López, M. D. Jiménez-Gamero

The usefulness of parameters (e.g., slope, aspect) derived from a Digital Elevation Model (DEM) is limited by the DEM's accuracy. In this paper, a thematic-like (class-based) quality control of aspect and slope classes is proposed. A product can be compared against a reference dataset, which embodies the quality requirements to be achieved, by comparing the product's proportions of each class with those of the reference set. If the distance between the product proportions and the reference proportions is smaller than a positive tolerance fixed by the user, the degree of similarity between the product and the reference set is considered acceptable, and hence the product's quality meets the requirements. A formal statistical procedure, based on a hypothesis test that uses the Hellinger distance between the proportions, is developed, and its performance is analyzed using simulated data. The application to slope and aspect is illustrated using data derived from a 2×2 m DEM (reference) and a 5×5 m DEM in Allo (province of Navarra, Spain).
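The Hellinger distance between the product's class proportions and the reference proportions is straightforward to compute; a minimal sketch with hypothetical proportions (the hypothesis-testing machinery and the simulation-based calibration of the tolerance are not reproduced here):

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete distributions (0 <= H <= 1)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

# Hypothetical class proportions over, e.g., slope classes
reference = [0.40, 0.35, 0.20, 0.05]   # from the 2x2 m reference DEM
product   = [0.38, 0.36, 0.21, 0.05]   # from the 5x5 m product DEM
tol = 0.05                             # user-fixed positive tolerance

d = hellinger(product, reference)
meets_requirements = d < tol           # similarity acceptable if below tolerance
```

The distance is 0 for identical proportion vectors and 1 for distributions with disjoint support, which makes the user-fixed tolerance easy to interpret.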


2015 ◽ Vol 70 (1) ◽ pp. 133-173 ◽ Author(s): David B. Carter

Violent nonstate groups are usually weaker than the states they target. Theory suggests that groups carefully condition their choice of tactics on anticipated state response. Yet scholars know very little about whether and how groups strategically plan attacks in anticipation of state response. Scholars do not know if and under what conditions groups employ violent tactics to provoke or avoid a forceful state response, although extant theory is consistent with both possibilities. Relatedly, there is little systematic evidence about why groups choose terrorist or guerrilla tactics and how this choice relates to anticipated state response. I develop a theoretical and empirical model of the interaction between groups and states that generates unique evidence on all three fronts. Using data on attacks in Western Europe from 1950 to 2004, I show that guerrilla attacks are sometimes associated with provoking forceful state response, whereas terrorist attacks are generally associated with avoiding forceful response. Groups effectively choose their tactics to avoid forceful state responses that are too damaging for themselves but provoke forceful responses that disproportionately harm civilians. These findings survive several robustness and model specification tests.


2019 ◽ Vol 2 (1) ◽ pp. 5-23 ◽ Author(s): John Forth, Alex Bryson

Purpose: The literature on the union wage premium is among the most extensive in labour economics, but unions' effects on other aspects of the wage-effort bargain have received much less attention. The purpose of this paper is to contribute to the literature through a study of the union premium in paid holiday entitlements.
Design/methodology/approach: The authors examine the size of the union premium on paid holidays over time, with a particular focus on how the premium was affected by the introduction of a statutory right to paid holidays. The data come from nationally representative surveys of employees and workplaces.
Findings: The authors find that the union premium on paid holidays is substantially larger than the union premium on wages. However, the premium fell with the introduction of a statutory minimum entitlement to paid leave.
Originality/value: This is the first study to examine explicitly the interaction between union representation and the law in this setting. The findings indicate the difficulties that unions have faced in protecting the most vulnerable employees in the UK labour market. The authors argue that the supplanting of voluntary joint regulation with statutory regulation is symptomatic of a wider decrease in the regulatory role of unions in the UK.


ILR Review ◽ 2019 ◽ Vol 72 (4) ◽ pp. 1009-1035 ◽ Author(s): Rafael Gomez, Danielle Lamb

The authors examine the association between unionization and non-standard work in terms of both coverage and wages. They use data from the master files of Canada's Labour Force Survey (LFS) between 1997–98 and 2013–14 to define and measure non-standard work and to construct a continuum of vulnerability across work arrangements. The estimated probability of being employed in some form of non-permanent job increased 2.9 percentage points from 1997 to 2014; over the same period, the estimated probability of being in a non-full-time, non-permanent job, another way of capturing non-standard work, increased 2.5 percentage points. Although estimated union wage premiums declined rather precipitously for all groups, the union wage advantage remained highest among non-standard workers. Further, the authors find that the union wage premium is largest for the most vulnerable of non-standard workers. In estimates across the earnings distribution, the union wage premium among non-standard workers is larger for workers higher up the earnings profile.


2020 ◽ Vol 13 (6) ◽ pp. 1187-1217 ◽ Author(s): Negin Berjis, Hadi Shirouyehzad, Javid Jouzdani

Purpose: The main purpose of this paper is to propose a new approach for determining project activity weight factors using data envelopment analysis (DEA). The model is then applied to Mobarakeh Steel Company as a case study. Project schedules and plans can subsequently be written on the basis of the resulting weight factors.
Design/methodology/approach: This study proposes a four-phase approach to determining activity weights using DEA. In the first phase, project activities are extracted from the work breakdown structure. In the second phase, the parameters affecting the importance of activities are determined through a review of the related literature and the opinions of experts. In the third phase, a suitable DEA model is chosen and its inputs and outputs are specified; activity weights are then determined from the efficiency scores. Finally, the model is solved for the case of Isfahan Mobarakeh Steel Company.
Findings: The proposed method calculates project activity weight factors. The parameters influencing activity importance were identified: activity duration; activity cost; activity importance, comprising successors and predecessors; and activity difficulty, comprising skill requirements (education and experience), safety, communication rate, intellectual effort, physical effort, unfavorable working conditions, and work-related hazards. Project data were then obtained from organizational experts' opinions and recorded documents, and, applying DEA, the activity weight factors were calculated from the efficiency scores. The results show that the model is applicable and has promising benefits in real-world problems.
Originality/value: Planning is one of the most fundamental steps of project management. An ever-growing business environment demands more complex projects with larger numbers of activities, and hence more efficient project management. Organizational resources are limited, so activity planning is critical from the perspectives of both managers and researchers. Knowing the importance of activities helps managers handle them more efficiently and allocate time, budget, cost, and other resources more accurately. Different elements, such as cost, time, complexity, and difficulty, can affect an activity's weight factor. In this study, the proposed approach determines activity weights using DEA.
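The DEA step, choosing a model and deriving activity weights from efficiency scores, can be sketched with the standard input-oriented CCR multiplier model. This is a generic illustration with made-up activity data, not the specific model or inputs used for Mobarakeh Steel Company:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiencies(X, Y, eps=1e-6):
    """Input-oriented CCR model (multiplier form), one LP per DMU.
    X: (n, m) inputs, Y: (n, s) outputs; returns efficiency in (0, 1]."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        c = np.concatenate([-Y[o], np.zeros(m)])          # maximize u . y_o
        A_ub = np.hstack([Y, -X])                         # u . y_j - v . x_j <= 0
        b_ub = np.zeros(n)
        A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v . x_o = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(eps, None)] * (s + m))
        scores.append(-res.fun)
    return np.array(scores)

# Hypothetical activities: inputs = (duration, cost), output = importance score
X = np.array([[2.0, 10.0], [4.0, 12.0], [3.0, 9.0]])
Y = np.array([[4.0], [4.0], [6.0]])
eff = ccr_efficiencies(X, Y)
weights = eff / eff.sum()     # normalized activity weight factors
```

Normalizing the efficiency scores, as in the last line, is one plausible way to turn them into weight factors for scheduling; the paper's actual mapping from efficiency numbers to weights may differ.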


2020 ◽ Vol 52 (3) ◽ pp. 385-397 ◽ Author(s): Ming Su Lavik, Gudbrand Lien, Audun Korsaeth, J. Brian Hardaker

To support decision-makers considering adopting integrated pest management (IPM) cropping in Norway, we used stochastic efficiency analysis to compare the risk efficiency of IPM cropping and conventional cropping, using data from a long-term field experiment in southeastern Norway along with data on recent prices, costs, and subsidies. Initial results were not definitive, so we applied stochastic efficiency with respect to a function, limiting the assumed risk aversion of farmers to a plausible range. We found that, for farmers who are risk-indifferent to moderately (hardly) risk averse, the conventional system was less (equally) preferred compared with IPM.
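Stochastic efficiency with respect to a function (SERF) ranks alternatives by their certainty equivalents over a bounded range of risk aversion rather than at a single assumed coefficient. A minimal sketch under a negative-exponential (CARA) utility, with hypothetical net-revenue draws standing in for the experimental data:

```python
import numpy as np

def certainty_equivalent(wealth, r):
    """Certainty equivalent under CARA utility U(w) = -exp(-r w);
    as r -> 0 this approaches the mean of wealth."""
    wealth = np.asarray(wealth, float)
    if abs(r) < 1e-12:
        return wealth.mean()
    return -np.log(np.mean(np.exp(-r * wealth))) / r

rng = np.random.default_rng(3)
ipm = rng.normal(100, 15, size=20000)            # hypothetical net revenue, IPM
conventional = rng.normal(102, 30, size=20000)   # higher mean, but riskier

# SERF: sweep a plausible range of absolute risk aversion and see
# which system has the higher certainty equivalent at each level.
for r in (0.0, 0.01, 0.05):
    ce_ipm = certainty_equivalent(ipm, r)
    ce_conv = certainty_equivalent(conventional, r)
    print(f"r={r}: IPM preferred: {ce_ipm > ce_conv}")
```

In this toy setup the riskier high-mean system is preferred by a risk-indifferent farmer but loses out as risk aversion rises, which is the kind of crossover SERF is designed to expose.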

