Entropy Balancing for Continuous Treatments

2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Stefan Tübbicke

Abstract Interest in evaluating the effects of continuous treatments has been on the rise recently. To facilitate the estimation of causal effects in this setting, the present paper introduces entropy balancing for continuous treatments (EBCT) – an intuitive and user-friendly automated covariate balancing scheme – by extending the original entropy balancing methodology of Hainmueller, J. 2012. “Entropy Balancing for Causal Effects: A Multivariate Reweighting Method to Produce Balanced Samples in Observational Studies.” Political Analysis 20 (1): 25–46. To estimate balancing weights, the proposed approach solves a globally convex constrained optimization problem, allowing for a computationally efficient software implementation. EBCT weights reliably eradicate Pearson correlations between covariates (and their transformations) and the continuous treatment variable. As uncorrelatedness may not be sufficient to guarantee consistent estimates of dose–response functions, EBCT also allows rendering higher moments of the treatment variable uncorrelated with covariates to mitigate this issue. Empirical Monte Carlo simulations suggest that treatment effect estimates using EBCT display favorable properties in terms of bias and root mean squared error, especially when balance on higher moments of the treatment variable is sought. These properties make EBCT an attractive method for the evaluation of continuous treatments. Software implementations are available for Stata and R.
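
The core of the scheme described above is a globally convex program: choose maximum-entropy unit weights subject to moment constraints that keep covariate and treatment means fixed while forcing the weighted covariance between each covariate and the treatment to zero. The sketch below illustrates that idea via the log-sum-exp dual; it is a minimal illustration of the balancing logic, not the published Stata/R implementation, and the function name `ebct_weights` is ours.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def ebct_weights(X, t):
    """Maximum-entropy weights that de-correlate covariates X (n x p)
    from a continuous treatment t (n,), keeping all means fixed.
    A sketch of the EBCT idea, not the published implementation."""
    Xc = X - X.mean(axis=0)                  # centered covariates
    tc = t - t.mean()                        # centered treatment
    # Constraint functions: zero weighted mean for each centered covariate
    # and the centered treatment, and zero weighted covariance Cov_w(X_j, T).
    G = np.column_stack([Xc, tc, Xc * tc[:, None]])

    def dual(lam):                           # globally convex dual objective
        return logsumexp(G @ lam)

    def grad(lam):                           # gradient = weighted constraint means
        w = np.exp(G @ lam - logsumexp(G @ lam))
        return G.T @ w

    lam = minimize(dual, np.zeros(G.shape[1]), jac=grad, method="BFGS").x
    v = G @ lam
    return np.exp(v - logsumexp(v))          # positive weights summing to one

# Toy check: weighted covariances between each covariate and t are ~0.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
t = X @ np.array([0.5, -0.3, 0.2]) + rng.normal(size=500)
w = ebct_weights(X, t)
print([float(w @ ((xj - w @ xj) * (t - w @ t))) for xj in X.T])
```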

2006 ◽  
Vol 226 (1) ◽  
Author(s):  
Anton L. Flossmann ◽  
Winfried Pohlmeier

Summary This paper surveys the empirical evidence on the causal effects of education on earnings for Germany and compares alternative studies in light of their underlying identifying assumptions. We work out the different assumptions taken by the various studies, which lead to rather different interpretations of the estimated causal effect. In particular, we are interested in the question of to what extent causal return estimates are informative for educational policy advice. Despite the substantial methodological differences, we have to conclude that the empirical findings for Germany are quite robust and do not deviate substantially from each other. This also holds for the few studies which rely on ignorability conditions, regardless of whether they use educational attainment as a continuous treatment variable or as a discrete treatment indicator. Our own estimates based on the matching approach indicate that the selection into upper secondary schooling is suboptimal.


2012 ◽  
Vol 20 (1) ◽  
pp. 25-46 ◽  
Author(s):  
Jens Hainmueller

This paper proposes entropy balancing, a data preprocessing method to achieve covariate balance in observational studies with binary treatments. Entropy balancing relies on a maximum entropy reweighting scheme that calibrates unit weights so that the reweighted treatment and control group satisfy a potentially large set of prespecified balance conditions that incorporate information about known sample moments. Entropy balancing thereby exactly adjusts inequalities in representation with respect to the first, second, and possibly higher moments of the covariate distributions. These balance improvements can reduce model dependence for the subsequent estimation of treatment effects. The method assures that balance improves on all covariate moments included in the reweighting. It also obviates the need for continual balance checking and iterative searching over propensity score models that may stochastically balance the covariate moments. We demonstrate the use of entropy balancing with Monte Carlo simulations and empirical applications.
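
In the binary-treatment case the scheme reweights the control group so that its covariate moments exactly match those of the treated group. A minimal sketch of that reweighting via the convex dual, assuming first- and second-moment constraints (the function name and setup are ours, not the paper's replication code):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def entropy_balance(X_control, X_treated):
    """Reweight control units so their first and second covariate
    moments match the treated group (sketch of entropy balancing)."""
    C = np.column_stack([X_control, X_control**2])       # moment functions
    m = np.column_stack([X_treated, X_treated**2]).mean(axis=0)

    def dual(lam):                                       # convex dual objective
        return logsumexp(C @ lam) - lam @ m

    def grad(lam):                                       # moment imbalance
        w = np.exp(C @ lam - logsumexp(C @ lam))
        return C.T @ w - m

    lam = minimize(dual, np.zeros(C.shape[1]), jac=grad, method="BFGS").x
    w = np.exp(C @ lam - logsumexp(C @ lam))
    return w     # positive, sums to one; C.T @ w equals m at the optimum
```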


2015 ◽  
Vol 3 (1) ◽  
pp. 25-40 ◽  
Author(s):  
Yeying Zhu ◽  
Donna L. Coffman ◽  
Debashis Ghosh

Abstract In this article, we study the causal inference problem with a continuous treatment variable using propensity score-based methods. For a continuous treatment, the generalized propensity score is defined as the conditional density of the treatment level given covariates (confounders). The dose–response function is then estimated by inverse probability weighting, where the weights are calculated from the estimated propensity scores. When the dimension of the covariates is large, traditional nonparametric density estimation suffers from the curse of dimensionality. Some researchers have suggested a two-step estimation procedure by first modeling the mean function. In this study, we suggest a boosting algorithm to estimate the mean function of the treatment given covariates. In boosting, an important tuning parameter is the number of trees to be generated, which essentially determines the trade-off between the bias and variance of the causal estimator. We propose a criterion called the average absolute correlation coefficient (AACC) to determine the optimal number of trees. Simulation results show that the proposed approach performs better than a simple linear approximation or L2 boosting. The proposed methodology is also illustrated through the Early Dieting in Girls study, which examines the influence of mothers’ overall weight concern on daughters’ dieting behavior.
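
To make the criterion concrete, here is one way the pieces could fit together: stabilized inverse-probability weights from a normal conditional density around the boosted mean function, and the AACC computed as the mean absolute weighted correlation between the treatment and each covariate. This is our reading of the procedure, not the authors' code; the names `gps_weights` and `aacc` are ours.

```python
import numpy as np
from scipy.stats import norm

def gps_weights(t, t_hat, sigma):
    """Stabilized IPW for a continuous treatment: marginal normal
    density over the conditional density around the fitted mean t_hat
    (t_hat would come from the boosting step, sigma from its residuals)."""
    num = norm.pdf(t, loc=t.mean(), scale=t.std())
    den = norm.pdf(t, loc=t_hat, scale=sigma)
    return num / den

def aacc(t, X, w):
    """Average absolute weighted Pearson correlation between the
    treatment and each covariate; smaller means better balance."""
    w = w / w.sum()
    def wcorr(a, b):
        ca, cb = a - w @ a, b - w @ b
        return (w @ (ca * cb)) / np.sqrt((w @ ca**2) * (w @ cb**2))
    return np.mean([abs(wcorr(t, X[:, j])) for j in range(X.shape[1])])
```

The number of trees would then be chosen by refitting the mean function at each candidate size and keeping the one whose implied weights minimize `aacc`.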


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Carlos A. Ruiz-Perez ◽  
Roth E. Conrad ◽  
Konstantinos T. Konstantinidis

Abstract Background: High-throughput sequencing has increased the number of available microbial genomes recovered from isolates, single cells, and metagenomes. Accordingly, fast and comprehensive functional gene annotation pipelines are needed to analyze and compare these genomes. Although several approaches exist for genome annotation, these are typically not designed for easy incorporation into analysis pipelines, do not combine results from different annotation databases or offer easy-to-use summaries of metabolic reconstructions, and typically require large amounts of computing power for high-throughput analysis that are not available to the average user. Results: Here, we introduce MicrobeAnnotator, a fully automated, easy-to-use pipeline for the comprehensive functional annotation of microbial genomes that combines results from several reference protein databases and returns the matching annotations together with key metadata such as the interlinked identifiers of matching reference proteins from multiple databases [KEGG Orthology (KO), Enzyme Commission (E.C.), Gene Ontology (GO), Pfam, and InterPro]. Further, the functional annotations are summarized into Kyoto Encyclopedia of Genes and Genomes (KEGG) modules as part of a graphical output (heatmap) that allows the user to quickly detect differences among (multiple) query genomes and cluster the genomes based on their metabolic similarity. MicrobeAnnotator is implemented in Python 3 and is freely available under an open-source Artistic License 2.0 from https://github.com/cruizperez/MicrobeAnnotator. Conclusions: We demonstrated the capabilities of MicrobeAnnotator by annotating 100 Escherichia coli and 78 environmental Candidate Phyla Radiation (CPR) bacterial genomes and comparing the results to those of other popular tools. We showed that the use of multiple annotation databases allows MicrobeAnnotator to recover more annotations per genome than faster tools that use reduced databases, while remaining computationally efficient enough for use on personal computers. The output of MicrobeAnnotator can be easily incorporated into other analysis pipelines, while the results of other annotation tools can be seamlessly incorporated into MicrobeAnnotator to generate summary plots.


Author(s):  
Mastan Sharif Shaik ◽  
K. Satya Prasad ◽  
Rafi Ahamed Shaik ◽  
D. Venkata Rao

Several sign-based LMS adaptive filters, which are multiplier-free in their weight-update loops and hence computationally inexpensive, are proposed for acoustic echo cancellation. The adaptive filters essentially minimize the mean-squared error between a primary input, which contains the echo, and a reference input, which is correlated in some way with the echo in the primary input. The results show that the performance of the signed-regressor LMS algorithm is superior to that of the conventional LMS algorithm, while the performance of the signed LMS and sign-sign LMS realizations is comparable to that of LMS-based filtering in terms of average attenuation and computational complexity.
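
For reference, the sign-sign variant replaces both the error and the regressor in the LMS update with their signs, so the weight-update loop needs no multiplications (with a power-of-two step size it reduces to shifts and additions). A minimal sketch of that update, with names of our choosing:

```python
import numpy as np

def sign_sign_lms(x, d, taps=32, mu=2**-10):
    """Sign-sign LMS echo canceller sketch: x is the reference signal,
    d the primary (echo-containing) input; returns the final weights
    and the error signal (the echo-cancelled output)."""
    w = np.zeros(taps)
    e = np.zeros(len(d))
    for n in range(taps, len(d)):
        u = x[n - taps:n][::-1]                  # regressor, newest sample first
        e[n] = d[n] - w @ u                      # error = primary - echo estimate
        w += mu * np.sign(e[n]) * np.sign(u)     # multiplier-free weight update
    return w, e
```

The signed-regressor variant keeps the raw error (`w += mu * e[n] * np.sign(u)`), and the signed (sign-error) variant keeps the raw regressor (`w += mu * np.sign(e[n]) * u`).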



1994 ◽  
Vol 28 (5) ◽  
pp. 415-458 ◽  
Author(s):  
David B. Koconis ◽  
László P. Kollár ◽  
George S. Springer

The changes in shapes of fiber-reinforced composite beams, plates and shells affected by embedded piezoelectric actuators were investigated. An analytical method was developed which can be used to calculate the changes in shapes for specified applied voltages to the actuators. The method is formulated on the basis of mathematical models using two-dimensional, linear, shallow shell theory including transverse shear effects which are important in the case of sandwich construction. Solutions to the governing equations were obtained via the Ritz method. A computationally efficient computer code with a user-friendly interface was written which is suitable for performing the numerical calculations. The code, designated as SHAPE1, provides the change in shape for specified applied voltages. To validate the method and the computer code, results generated by the code were compared to existing analytical and experimental results and to test data obtained during the course of the present investigation. The predictions provided by the SHAPE1 code were in excellent agreement with the results of the other analyses and data.


2020 ◽  
Author(s):  
Ali Ghazizadeh ◽  
Frederic Ambroggi

Abstract Peri-event time histograms (PETH) are widely used to study correlations between experimental events and neuronal firing. The accuracy of the firing rate estimate obtained from a PETH depends on the choice of binsize. We show that the optimal binsize for a PETH depends on factors such as the number of trials and the temporal dynamics of the firing rate. These factors argue against the use of a one-size-fits-all binsize when making PETHs for an inhomogeneous population of neurons. Here we propose a binsize selection method by adapting the Akaike Information Criterion (AIC). Simulations show that optimal binsizes estimated by AIC closely match the optimal binsizes obtained using the mean squared error (MSE). Furthermore, using real data, we find that optimal binning improves the detection of responses and their dynamics. Together, our analysis strongly supports optimal binning of PETHs and proposes a computationally efficient method for this optimization based on the AIC approach to model selection.
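
Under a piecewise-constant Poisson rate model, the AIC for a candidate binsize is the Poisson log-likelihood of the binned counts penalized by the number of bins (one rate parameter per bin). A minimal sketch of that selection rule, assuming event-aligned spike times pooled across trials; the function name and setup are ours, not the authors' code:

```python
import numpy as np

def aic_binsize(spike_times, n_trials, window, candidates):
    """Choose a PETH binsize by AIC under a piecewise-constant Poisson
    rate model. spike_times: pooled event-aligned times in [0, window]."""
    best_dt, best_aic = None, np.inf
    for dt in candidates:
        edges = np.arange(0.0, window + dt / 2, dt)
        counts, _ = np.histogram(spike_times, bins=edges)
        rate = counts / (n_trials * dt)              # rate estimate per bin
        nz = counts > 0                              # treat 0 * log(0) as 0
        loglik = counts[nz] @ np.log(rate[nz]) - counts.sum()
        aic = -2.0 * loglik + 2.0 * len(counts)      # penalty: number of bins
        if aic < best_aic:
            best_dt, best_aic = dt, aic
    return best_dt
```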


2016 ◽  
Vol 24 (2) ◽  
pp. 157-171 ◽  
Author(s):  
John Marshall

Political scientists increasingly use instrumental variable (IV) methods, and must often choose between operationalizing their endogenous treatment variable as discrete or continuous. For theoretical and data availability reasons, researchers frequently coarsen treatments with multiple intensities (e.g., treating a continuous treatment as binary). I show how such coarsening can substantially upwardly bias IV estimates by subtly violating the exclusion restriction assumption, and demonstrate that the extent of this bias depends upon the first stage and underlying causal response function. However, standard IV methods using a treatment where multiple intensities are affected by the instrument (even when fine-grained measurement at every intensity is not possible) recover a consistent causal estimate without requiring a stronger exclusion restriction assumption. These analytical insights are illustrated in the context of identifying the long-run effect of high school education on voting Conservative in Great Britain. I demonstrate that coarsening years of schooling into an indicator for completing high school upwardly biases the IV estimate by a factor of three.
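
The mechanics of the bias are easy to reproduce in a toy Monte Carlo: when a binary instrument shifts schooling at several intensities but the analyst codes only a completion dummy, the Wald estimate divides the full reduced-form effect by the smaller first stage on the dummy. A minimal illustration, with all parameters made up for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
z = rng.integers(0, 2, n)                        # binary instrument
c = rng.normal(size=n)                           # unobserved confounder
s = 10 + 2 * z + c + rng.normal(size=n)          # years of schooling
y = 0.1 * s + 0.5 * c + rng.normal(size=n)       # true effect: 0.1 per year

def wald(y, t, z):
    """IV (Wald) estimate: reduced form divided by first stage."""
    return (y[z == 1].mean() - y[z == 0].mean()) / \
           (t[z == 1].mean() - t[z == 0].mean())

d = (s >= 12).astype(float)                      # coarsened completion dummy
print("continuous treatment:", wald(y, s, z))    # ~0.10 per year
print("coarsened treatment: ", wald(y, d, z))    # markedly larger per unit
```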


1994 ◽  
Vol 28 (5) ◽  
pp. 459-482 ◽  
Author(s):  
David B. Koconis ◽  
László P. Kollár ◽  
George S. Springer

The changes in shapes of fiber-reinforced composite beams, plates and shells affected by embedded piezoelectric actuators were investigated. An analytical method was developed to determine the voltages needed to achieve a specified desired shape. The method is formulated on the basis of mathematical models using two-dimensional, linear, shallow shell theory including transverse shear effects which are important in the case of sandwich construction. The solution technique is a minimization of an error function which is a measure of the difference between the deformed shape caused by the application of voltages and the desired shape. A computationally efficient, user-friendly computer code was written which is suitable for performing the numerical calculations. The code, designated as SHAPE2, gives the voltages needed to achieve specified changes in shape. To validate the method and the computer code, results generated by the code were compared to existing analytical and experimental results. The predictions provided by the SHAPE2 code were in excellent agreement with the results of the other analyses and data.
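
If the shape change is (approximately) linear in the actuator voltages, the error minimization that SHAPE2 performs reduces to a linear least-squares problem over the voltages. A schematic sketch under that linearity assumption; the actual code works from shallow-shell theory and the Ritz method, and the setup below is ours:

```python
import numpy as np

def voltages_for_shape(A, target):
    """A[:, k]: shape change at sampled surface points per unit voltage
    on actuator k; target: desired shape change at the same points.
    Returns the voltages minimizing the squared shape error."""
    v, *_ = np.linalg.lstsq(A, target, rcond=None)
    return v
```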

