Generalized Inflated Discrete Models: A Strategy to Work with Multimodal Discrete Distributions

2017 ◽  
Author(s):  
Tianji Cai ◽  
Yiwei Xia ◽  
Yisu Zhou

Analysts of discrete data often face the challenge of inflation, an excess of observations at certain values. When treated improperly, such a phenomenon can lead to biased estimates and incorrect inferences. This study extends the existing literature on single-value inflated models and develops a general framework to handle variables with more than one inflated value. To assess the performance of the proposed maximum likelihood estimator, we conducted Monte Carlo experiments under several scenarios with different levels of inflated probabilities for multinomial, ordinal, Poisson, and zero-truncated Poisson outcomes with covariates. We found that ignoring the inflation leads to substantial bias and poor inference, not only for the intercept(s) of the inflated categories but for other coefficients as well; higher inflated probabilities are associated with larger biases. By contrast, the generalized inflated discrete models (GIDM) perform well, yielding unbiased estimates and satisfactory coverage even when the number of parameters to be estimated is quite large. We show that model fit criteria such as AIC can be used to select an appropriate specification of the inflated model. Lastly, GIDM was applied to large-scale health survey data and compared with conventional approaches such as various Poisson and ordered logit models; GIDM fits the data better in general. The current work provides a practical approach to analyzing the multimodal data found in many fields, such as heaping in self-reported behavioral outcomes, inflated "indifferent" and "neutral" categories in attitude surveys, and the large number of zeros and low occurrence of delinquent behaviors.
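
The mixture structure behind such models can be sketched in a few lines. The following Python sketch is a simplified, intercept-only version of the idea (hypothetical data and parameter values, not the paper's simulation design): a Poisson distribution with two inflated values, 0 and 5, fit by maximum likelihood.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

# Hypothetical data: Poisson(3) counts with extra probability mass
# at two inflated values, 0 (20%) and 5 (10%)
rng = np.random.default_rng(42)
n = 5000
mix = rng.choice([0, 1, 2], size=n, p=[0.7, 0.2, 0.1])
y = np.where(mix == 0, rng.poisson(3.0, size=n),
             np.where(mix == 1, 0, 5))

inflated = [0, 5]

def neg_loglik(theta):
    """theta = (log lambda, unnormalized log-weights of the inflated values)."""
    lam = np.exp(theta[0])
    w = np.exp(theta[1:])
    pis = w / (1.0 + w.sum())            # inflation probabilities
    pmf = (1.0 - pis.sum()) * poisson.pmf(y, lam)
    for pi_k, v_k in zip(pis, inflated):
        pmf = pmf + pi_k * (y == v_k)    # add a point mass at each inflated value
    return -np.log(pmf).sum()

res = minimize(neg_loglik, x0=np.zeros(3), method="Nelder-Mead",
               options={"maxiter": 2000})
lam_hat = np.exp(res.x[0])
w_hat = np.exp(res.x[1:])
pi_hat = w_hat / (1.0 + w_hat.sum())     # should be near (0.2, 0.1)
```

The full models in the paper additionally let the inflation probabilities and the baseline distribution depend on covariates, but the likelihood has the same mixture form.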

2014 ◽  
Author(s):  
Eric Vreeman ◽  
Robert Brett Nelson ◽  
Donna Schnorr

2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Soren Wainio-Theberge ◽  
Annemarie Wolff ◽  
Georg Northoff

Spontaneous neural activity fluctuations have been shown to influence trial-by-trial variation in perceptual, cognitive, and behavioral outcomes. However, the complex electrophysiological mechanisms by which these fluctuations shape stimulus-evoked neural activity remain largely to be explored. Employing a large-scale magnetoencephalographic dataset and an electroencephalographic replication dataset, we investigate the relationship between spontaneous and evoked neural activity across a range of electrophysiological variables. We observe that for high-frequency activity, high pre-stimulus amplitudes lead to greater evoked desynchronization, while for low frequencies, high pre-stimulus amplitudes induce larger degrees of event-related synchronization. We further decompose electrophysiological power into oscillatory and scale-free components, demonstrating different patterns of spontaneous-evoked correlation for each component. Finally, we find correlations between spontaneous and evoked time-domain electrophysiological signals. Overall, we demonstrate that the dynamics of multiple electrophysiological variables exhibit distinct relationships between their spontaneous and evoked activity, a result which carries implications for experimental design and analysis in non-invasive electrophysiology.


2017 ◽  
Vol 50 (4) ◽  
pp. 197-214 ◽  
Author(s):  
Paul L. Morgan ◽  
Michelle L. Frisco ◽  
George Farkas ◽  
Jacob Hibel

Editor’s Note
Since the landmark enactment of the Education of the Handicapped Act in 1975, special education supports and services have been provided to children with disabilities. Although costly, these specialized services are intended to advance the educational and societal opportunities of children with disabilities as they progress to adulthood. For our republished article in this issue of JSE’s 50th anniversary volume, we have selected an article by Paul Morgan, Michelle Frisco, George Farkas, and Jacob Hibel. In this research, Morgan and his colleagues quantified the effectiveness of special education services on children’s learning and behavioral outcomes using large-scale longitudinal data. Their results challenge all education professionals to explore ways to increase the effectiveness of special education and to document research efforts that provide clear evidence that the services and supports provided to individuals with disabilities are improving the extent to which they experience the benefits of education and participate fully in society.


2016 ◽  
Vol 5 (1) ◽  
Author(s):  
Dean Eckles ◽  
Brian Karrer ◽  
Johan Ugander

Estimating the effects of interventions in networks is complicated due to interference, such that the outcomes for one experimental unit may depend on the treatment assignments of other units. Familiar statistical formalism, experimental designs, and analysis methods assume the absence of this interference, and result in biased estimates of causal effects when it exists. While some assumptions can lead to unbiased estimates, these assumptions are generally unrealistic in the context of a network and often amount to assuming away the interference. In this work, we evaluate methods for designing and analyzing randomized experiments under minimal, realistic assumptions compatible with broad interference, where the aim is to reduce bias and possibly overall error in estimates of average effects of a global treatment. In design, we consider the ability to perform random assignment to treatments that is correlated in the network, such as through graph cluster randomization. In analysis, we consider incorporating information about the treatment assignment of network neighbors. We prove sufficient conditions for bias reduction through both design and analysis in the presence of potentially global interference; these conditions also give lower bounds on treatment effects. Through simulations of the entire process of experimentation in networks, we measure the performance of these methods under varied network structure and varied social behaviors, finding substantial bias reductions and, despite a bias–variance tradeoff, error reductions. These improvements are largest for networks with more clustering and data generating processes with both stronger direct effects of the treatment and stronger interactions between units.
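
The design idea, graph cluster randomization, can be illustrated with a toy network. The clique-ring structure and the exposure measure below are illustrative assumptions, not the authors' simulation setup:

```python
import random
from collections import defaultdict

random.seed(7)

# Toy network: 10 cliques of 8 nodes each, joined in a sparse ring
clusters = {v: v // 8 for v in range(80)}
edges = [(a, b) for c in range(10)
         for a in range(c * 8, c * 8 + 8)
         for b in range(a + 1, c * 8 + 8)]
edges += [(c * 8, ((c + 1) % 10) * 8) for c in range(10)]

neighbors = defaultdict(set)
for a, b in edges:
    neighbors[a].add(b)
    neighbors[b].add(a)

# Graph cluster randomization: flip one coin per cluster, not per node
z_cluster = {c: random.random() < 0.5 for c in range(10)}
z = {v: z_cluster[clusters[v]] for v in range(80)}

# Exposure: fraction of a node's neighbors sharing its assignment. Under
# cluster randomization most nodes are fully surrounded by same-assignment
# neighbors, which independent node-level assignment would make rare.
exposure = {v: sum(z[u] == z[v] for u in neighbors[v]) / len(neighbors[v])
            for v in range(80)}
frac_full = sum(e == 1.0 for e in exposure.values()) / 80
```

In this toy graph only the ring-link nodes can have neighbors in another cluster, so at least 70 of the 80 nodes are always fully exposed to their own assignment; an analysis step could then condition on `exposure`, as the abstract's neighbor-information approach suggests.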


Author(s):  
Yiwei Xia ◽  
Yisu Zhou ◽  
Tianji Cai

In this article, we describe the gidm command for fitting generalized inflated discrete models that deal with multiple inflated values in a distribution. Based on the work of Cai, Xia, and Zhou (Forthcoming, Sociological Methods & Research: Generalized inflated discrete models: A strategy to work with multimodal discrete distributions), generalized inflated discrete models are fit via maximum likelihood estimation. Specifically, the gidm command fits Poisson, negative binomial, multinomial, and ordered outcomes with more than one inflated value. We illustrate this command through examples for count and categorical outcomes.


2014 ◽  
Vol 597 ◽  
pp. 283-290 ◽  
Author(s):  
Angelo Masi ◽  
Andrea Digrisolo ◽  
Giuseppe Santarsiero

Knowledge of the materials’ mechanical properties is a preliminary and important step in the seismic vulnerability assessment of existing buildings. In RC structures, the compressive strength of concrete can play a crucial role in seismic performance and is usually difficult to estimate. Major seismic codes prescribe that concrete strength be determined essentially from in-situ and laboratory tests. In some cases such estimation can be complemented by default values in accordance with the standards in force at the time of construction; analysing the actual concrete properties typically found in existing RC buildings built in different periods can therefore provide useful data. To this end, this paper focuses on public buildings, namely schools and hospitals. A large database of about 1500 test results on concrete cores extracted from about 300 RC public buildings located in the Basilicata region (Italy) has been compiled and analysed. The relationships between the actual strength values (mean and dispersion) and the construction period of the buildings have been studied, and theoretical distributions approximating the discrete distributions of strength values in different construction periods have been determined, providing relevant data for the structural assessment of individual buildings and, especially, for large-scale vulnerability evaluations.
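
Fitting a theoretical distribution to core-test results of this kind can be sketched as follows. The lognormal choice and the simulated sample are illustrative assumptions, not the paper's data or its selected distributions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical core-test sample (MPa) for one construction period; a
# lognormal is one common choice for approximating such strength data
strengths = rng.lognormal(mean=np.log(20.0), sigma=0.3, size=300)

# Fit a lognormal with the location fixed at zero, then check the fit
shape, loc, scale = stats.lognorm.fit(strengths, floc=0)
ks = stats.kstest(strengths, "lognorm", args=(shape, loc, scale))
median_hat = scale   # exp(mu) is the median of a zero-location lognormal
```

Repeating such a fit per construction period would yield the period-specific strength distributions the paper uses for vulnerability evaluation.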


2018 ◽  
Vol 49 (3) ◽  
pp. 809-834
Author(s):  
Sergio Martínez ◽  
Maria Rueda ◽  
Antonio Arcos ◽  
Helena Martínez

This article discusses the estimation of a population proportion using available auxiliary information, which is incorporated into the estimation procedure through a fitted probit model. Three probit regression estimators are considered, using model-based and model-assisted approaches. The theoretical properties of the proposed estimators are derived and discussed. Monte Carlo experiments were carried out on simulated data and on real data taken from a database of confirmed dengue cases in Mexico; the probit estimators compare favourably with alternative estimators. Finally, the proposed methodology is applied to data obtained from an immigration survey.
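
A model-assisted probit estimator of a proportion can be sketched as follows. The simulated finite population, simple random sampling, and the difference-estimator form are illustrative assumptions; the paper's three estimators differ in detail:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical finite population: auxiliary x is known for all N units,
# the binary outcome y follows a probit model, and y is observed only
# for a simple random sample of n units
N, n = 20000, 500
x = rng.normal(size=N)
y = (rng.normal(size=N) < -0.5 + 1.2 * x).astype(float)
s = rng.choice(N, size=n, replace=False)

def neg_loglik(beta):
    p = norm.cdf(beta[0] + beta[1] * x[s])
    p = np.clip(p, 1e-10, 1 - 1e-10)
    return -(y[s] * np.log(p) + (1 - y[s]) * np.log(1 - p)).sum()

beta_hat = minimize(neg_loglik, x0=np.zeros(2)).x
fitted = norm.cdf(beta_hat[0] + beta_hat[1] * x)   # predictions for every unit

# Model-assisted (difference) estimator: population mean of the fitted
# probabilities, corrected by the sample mean of the residuals
p_ma = fitted.mean() + (y[s] - fitted[s]).mean()
```

The residual correction keeps the estimator approximately design-unbiased even when the probit working model is misspecified, which is the usual motivation for model-assisted over purely model-based estimation.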


2019 ◽  
Vol 27 (4) ◽  
pp. 556-571 ◽  
Author(s):  
Laurence Brandenberger

Relational event models are becoming increasingly popular for modeling the temporal dynamics of social networks. Because they combine survival analysis with network model terms, standard methods of assessing model fit are not suitable for determining whether a model is specified well enough to prevent biased estimates. This paper tackles that problem by presenting a simple procedure for model-based simulation of relational events. Predictions are made from survival probabilities and can be used to simulate new event sequences. Comparing these simulated event sequences with the original sequence allows for in-depth model comparisons (covering parameter values as well as model specifications) and for testing whether the model replicates network characteristics well enough to yield unbiased estimates.
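
The simulation idea, sampling new event sequences from a fitted rate model, can be sketched with a toy relational event model. The inertia statistic, its cap, and the parameter values below are assumptions for illustration, not the paper's specification:

```python
import math
import random

random.seed(3)

# Toy relational event model: each directed dyad (i, j) has event rate
# lambda_ij = exp(b0 + b1 * inertia_ij), where inertia counts past
# i -> j events (capped at 5 to keep rates bounded)
actors = range(4)
b0, b1 = -2.0, 0.3
counts = {(i, j): 0 for i in actors for j in actors if i != j}

events, t = [], 0.0
for _ in range(50):
    rates = {d: math.exp(b0 + b1 * min(c, 5)) for d, c in counts.items()}
    total = sum(rates.values())
    t += random.expovariate(total)      # waiting time until the next event
    r, acc = random.random() * total, 0.0
    for d, lam in rates.items():        # pick a dyad proportional to its rate
        acc += lam
        if acc >= r:
            counts[d] += 1
            events.append((t, d))
            break
```

Repeating this from a fitted model many times and comparing statistics of the simulated sequences (e.g., dyad activity concentration) with the observed sequence gives the kind of model-based fit assessment the paper proposes.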


2019 ◽  
pp. 1-13 ◽  
Author(s):  
John Metzcar ◽  
Yafei Wang ◽  
Randy Heiland ◽  
Paul Macklin

Cancer biology involves complex, dynamic interactions between cancer cells and their tissue microenvironments. Single-cell effects are critical drivers of clinical progression. Chemical and mechanical communication between tumor and stromal cells can co-opt normal physiologic processes to promote growth and invasion. Cancer cell heterogeneity increases cancer’s ability to explore strategies for adapting to microenvironmental stresses, while hypoxia and treatment can select for cancer stem cells and drive invasion and resistance. Cell-based computational models (also known as discrete models, agent-based models, or individual-based models) simulate individual cells as they interact in virtual tissues, which allows us to explore how single-cell behaviors lead to the dynamics we observe and work to control in cancer systems. In this review, we introduce the broad range of techniques available for cell-based computational modeling, from highly detailed models of just a few cells and their morphologies to millions of simpler cells in three-dimensional tissues. Modeling individual cells allows us to translate biologic observations directly into simulation rules; in many cases, individual cell agents include molecular-scale models. Most models also simulate the transport of oxygen, drugs, and growth factors, which allows us to link cancer development to microenvironmental conditions. We illustrate these methods with examples drawn from cancer hypoxia, angiogenesis, invasion, stem cells, and immunosurveillance. An ecosystem of interoperable cell-based simulation tools is emerging at a time when cloud computing resources make software easier to access and supercomputing resources make large-scale simulation studies possible. As the field develops, we anticipate that high-throughput simulation studies will allow us to rapidly explore the space of biologic possibilities, prescreen new therapeutic strategies, and even re-engineer tumor and stromal cells to bring cancer systems under control.
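
A minimal cell-based simulation in the spirit described here can fit in a few lines. The division rules and the static oxygen field below are toy assumptions for illustration, not any specific published tool:

```python
import random

random.seed(1)

# Toy agent-based rules: cells on a 2D grid divide into a free von
# Neumann neighbor when a static "oxygen" field, highest at the center
# of the domain, is above a hypoxia threshold
SIZE = 21
oxygen = [[1.0 - max(abs(r - 10), abs(c - 10)) / 10.0 for c in range(SIZE)]
          for r in range(SIZE)]
cells = {(10, 10)}                      # start from a single cell

for step in range(20):
    for (r, c) in list(cells):
        if oxygen[r][c] < 0.3 or random.random() > 0.5:
            continue                    # hypoxic or quiescent this step
        free = [(r + dr, c + dc)
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= r + dr < SIZE and 0 <= c + dc < SIZE
                and (r + dr, c + dc) not in cells]
        if free:
            cells.add(random.choice(free))   # place a daughter cell
```

Even this toy version reproduces the qualitative behavior the review emphasizes: growth is emergent from single-cell rules and is confined by the microenvironment, since cells in the low-oxygen rim stop dividing. Real tools additionally solve transport equations so the oxygen field responds to the cells.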

