Let the Data Speak? On the Importance of Theory-Based Instrumental Variable Estimations

2019 ◽ Vol 20 (4) ◽ pp. e831-e851
Author(s): Volker Grossmann, Aderonke Osikominu

Abstract In the absence of randomized controlled experiments, identification is often sought via instrumental variable (IV) strategies, typically two-stage least squares estimations. According to Bayes’ rule, however, under a low ex ante probability that a hypothesis is true (e.g. that an excluded instrument is partially correlated with an endogenous regressor), the interpretation of the estimation results may be fundamentally flawed. This paper argues that rigorous theoretical reasoning is key to designing credible identification strategies, first and foremost to finding candidates for valid instruments. We discuss prominent IV analyses from the macro-development literature to illustrate the potential benefit of structurally derived IV approaches.
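To make the Bayes’ rule argument concrete, here is a minimal sketch (an editorial illustration; the prior, power, and significance level are assumed, not taken from the paper) of the posterior probability that a hypothesis, such as instrument relevance, is true after a statistically significant test result:

```python
# Illustrative only: the prior, power, and significance level below are
# assumptions for this sketch, not values from the paper.
def posterior_prob_true(prior, power, alpha):
    """Posterior probability that a hypothesis is true given a 'significant' result.

    prior : ex ante probability that the hypothesis is true
    power : P(significant | hypothesis true)
    alpha : P(significant | hypothesis false), i.e. the test size
    """
    true_positive = power * prior
    false_positive = alpha * (1.0 - prior)
    return true_positive / (true_positive + false_positive)

# With a low ex ante probability (e.g. 5%) that a candidate instrument is truly
# relevant, even a well-powered significant test leaves substantial doubt:
# P(true | significant) is roughly 0.46 here.
print(posterior_prob_true(prior=0.05, power=0.80, alpha=0.05))
```

The point of the sketch is only that a "significant" result is weak evidence when the hypothesis was implausible to begin with, which is why theory-based arguments for instrument validity matter.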

2019 ◽ Vol 19 (1)
Author(s): E. R. John, K. R. Abrams, C. E. Brightling, N. A. Sheehan

Abstract
Background: Recently, there has been heightened interest in developing and evaluating different methods for analysing observational data. This has been driven by the increased availability of large data resources such as Electronic Health Record (EHR) data, alongside the known limitations and changing characteristics of randomised controlled trials (RCTs). A wide range of methods is available for analysing observational data. However, various, sometimes strict, and often unverifiable assumptions must be made in order for the resulting effect estimates to have a causal interpretation. In this paper we compare some common approaches to estimating treatment effects from observational data in order to highlight the importance of considering, and justifying, the relevant assumptions before conducting an observational analysis.
Methods: A simulation study was conducted based upon a small cohort of patients with chronic obstructive pulmonary disease. Two-stage least squares instrumental variable, propensity score, and linear regression models were compared under a range of scenarios, including different instrument strengths and degrees of unmeasured confounding. The effects of violating the assumptions of the instrumental variable analysis were also assessed. Sample sizes of up to 200,000 patients were considered.
Results: Two-stage least squares instrumental variable methods can yield unbiased treatment effect estimates in the presence of unmeasured confounding, provided the sample size is sufficiently large. Adjusting for measured covariates in the analysis reduces the variability of the two-stage least squares estimates. In the simulation study, propensity score methods produced very similar results to linear regression for all scenarios. A weak instrument or strong unmeasured confounding led to increased uncertainty in the two-stage least squares effect estimates. A violation of the instrumental variable assumptions led to bias in the two-stage least squares effect estimates; indeed, these were sometimes even more biased than those from a naïve linear regression model.
Conclusions: Instrumental variable methods can perform better than naïve regression and propensity scores. However, the assumptions need to be carefully considered and justified before conducting an analysis, or performance may be worse than if the problem of unmeasured confounding had been ignored altogether.
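As a hedged illustration of the mechanism described above, and not the authors' COPD-based simulation design (whose data-generating process and parameter values are not reproduced here), the following numpy-only sketch compares naive OLS with a manually computed two-stage least squares estimate under unmeasured confounding:

```python
import numpy as np

# A minimal sketch of 2SLS versus naive OLS under unmeasured confounding.
# The data-generating process and all coefficients are assumed for illustration;
# this is not the paper's simulation design.
rng = np.random.default_rng(0)
n = 200_000                      # large sample, echoing the paper's upper bound
beta = 1.0                       # true treatment effect

u = rng.normal(size=n)           # unmeasured confounder
z = rng.normal(size=n)           # instrument: relevant for x, independent of u
x = 0.5 * z + u + rng.normal(size=n)        # endogenous treatment
y = beta * x + u + rng.normal(size=n)       # outcome

def fit(design, response):
    """Least-squares coefficients for a design matrix with an intercept."""
    X = np.column_stack([np.ones(len(response))] + design)
    coef, *_ = np.linalg.lstsq(X, response, rcond=None)
    return X, coef

# Naive OLS: biased because u affects both x and y.
_, ols_coef = fit([x], y)

# 2SLS by hand: the first stage predicts x from z, the second stage regresses
# y on the fitted values, purging the confounded variation in x.
X1, first = fit([z], x)
x_hat = X1 @ first
_, tsls_coef = fit([x_hat], y)

print(f"true effect: {beta:.3f}")
print(f"naive OLS:   {ols_coef[1]:.3f}")   # around 1.4 with these coefficients
print(f"2SLS:        {tsls_coef[1]:.3f}")  # close to 1.0 in large samples
```

With the assumed coefficients the naive OLS slope settles near 1.4 while the 2SLS slope is close to the true value of 1.0; weakening the instrument (e.g. shrinking the 0.5 first-stage coefficient) and repeating over many seeds would show the increased variability of the 2SLS estimates that the abstract reports.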


2013 ◽ Vol 18 (2) ◽ pp. 51-63
Author(s): Sajjad Haider Bhatti, Jean Bourdon, Muhammad Aslam

This article estimates the economic returns to schooling and analyzes other explanatory factors for the French labor market. It addresses the issue of endogeneity bias and proposes two new instruments for use in the instrumental variable two-stage least squares technique. Our results show that the proposed instruments are relevant and adequate, based on evidence from the available literature. After using the proposed instruments, we find that the OLS coefficients for schooling are biased downwards. Finally, we choose between the two proposed instruments.
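As a generic sketch of the kind of specification the abstract refers to (the notation, control vector, and instrument below are placeholders; the article's actual variables and its two proposed instruments are not reproduced here), a Mincer-type wage equation estimated by two-stage least squares can be written as:

```latex
% Generic sketch; s_i is years of schooling, x_i a vector of controls,
% and z_i a placeholder instrument, not the article's specification.
\begin{align*}
  \ln w_i &= \alpha + \rho\, s_i + x_i'\gamma + u_i
    && \text{(wage equation, with } \rho \text{ the return to schooling)} \\
  s_i &= \pi_0 + \pi_1 z_i + x_i'\pi_2 + v_i
    && \text{(first stage: schooling regressed on the instrument)}
\end{align*}
```

OLS on the wage equation is biased when schooling is correlated with the error term; two-stage least squares replaces schooling with the first-stage fitted values, which is consistent provided the instrument is excluded from the wage equation and has a non-zero first-stage coefficient.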


Author(s): Richard Adelstein

This chapter elaborates the operation of criminal liability by closely considering efficient crimes and the law’s stance toward them, shows how its commitment to proportional punishment prevents the probability scaling that systemically efficient allocation requires, and discusses the procedures that determine the actual liability prices imposed on offenders. Efficient crimes are effectively encouraged by proportional punishment, and their nature and implications are examined. But proportional punishment precludes probability scaling, and induces far more than the systemically efficient number of crimes. Liability prices that match the specific costs imposed by the offender at bar are sought through a two-stage procedure of legislative determination of punishment ranges ex ante and judicial determination of exact prices ex post, which creates a dilemma: whether to price crimes accurately in the past or deter them accurately in the future. An illustrative Supreme Court case bringing all these themes together is discussed in conclusion.


2015 ◽ Vol 34 (14) ◽ pp. 2235-2265
Author(s): Fei Wan, Dylan Small, Justin E. Bekelman, Nandita Mitra
