Identification and Estimation of Graphical Models with Nonignorable Nonresponse

2021, Vol 2021, pp. 1-8. Author(s): Lingju Chen, Shaoxin Hong, Bo Tang

We study the identification and estimation of graphical models with nonignorable nonresponse. An observable variable correlated with nonresponse is added to identify the response mean in the otherwise unidentifiable model. An approach to estimating the marginal mean of the response is proposed, based on simulation imputation methods introduced for a variety of models, including linear, generalized linear, and monotone nonlinear models. The proposed mean estimators are √N-consistent, where N is the sample size. Finite-sample simulations confirm the effectiveness of the proposed method. A sensitivity analysis for the untestable assumption on our augmented model is also conducted. A real data example illustrates the use of the proposed methodology.
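A minimal sketch of the simulation-imputation idea for a response mean, assuming a simple linear outcome model; the variable names, the model, and the (ignorable) missingness mechanism are illustrative only and do not reproduce the authors' nonignorable-nonresponse identification via the auxiliary variable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: X observed for everyone, Y missing for part of the sample.
N = 500
X = rng.normal(size=N)
Y = 2.0 + 1.5 * X + rng.normal(scale=0.5, size=N)
observed = rng.random(N) < 0.7          # response indicator (ignorable here, for illustration only)
Y_obs = np.where(observed, Y, np.nan)

# Fit a working linear outcome model on the respondents (illustrative assumption).
beta1, beta0 = np.polyfit(X[observed], Y_obs[observed], 1)
resid_sd = np.std(Y_obs[observed] - (beta0 + beta1 * X[observed]), ddof=2)

# Simulation imputation: draw synthetic responses for the nonrespondents from the
# fitted model, complete the sample, and average; repeat to smooth out the draws.
M = 200
draws = []
for _ in range(M):
    Y_imp = Y_obs.copy()
    n_mis = int((~observed).sum())
    Y_imp[~observed] = beta0 + beta1 * X[~observed] + rng.normal(scale=resid_sd, size=n_mis)
    draws.append(Y_imp.mean())

print("imputation-based estimate of E[Y]:", np.mean(draws))
```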

2002, Vol 18 (5), pp. 1019-1039. Author(s): Tucker McElroy, Dimitris N. Politis

The problem of statistical inference for the mean of a time series with possibly heavy tails is considered. We first show that the self-normalized sample mean has a well-defined asymptotic distribution. Subsampling theory is then used to develop asymptotically correct confidence intervals for the mean without knowledge (or explicit estimation) of either the dependence characteristics or the tail index. Using a symmetrization technique, we also construct a distribution estimator that combines robustness and accuracy: it is higher-order accurate in the regular case, while remaining consistent in the heavy-tailed case. Finite-sample simulations confirm the practicality of the proposed methods.
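The following is a generic subsampling sketch, assuming overlapping blocks and a studentized (self-normalized) mean; the block length `b`, the centring at the full-sample mean, and the heavy-tailed example series are illustrative choices, not the paper's exact construction.

```python
import numpy as np

def self_norm_stat(x, mu):
    """Self-normalized (studentized) mean: sqrt(n) * (mean - mu) / sd."""
    n = len(x)
    return np.sqrt(n) * (x.mean() - mu) / x.std(ddof=1)

def subsampling_ci(x, b, alpha=0.05):
    """Subsampling confidence interval for the mean of a possibly dependent,
    possibly heavy-tailed series, using overlapping blocks of length b."""
    n = len(x)
    xbar = x.mean()
    # Recompute the statistic on every overlapping block, centred at the full-sample mean.
    stats = np.array([self_norm_stat(x[i:i + b], xbar) for i in range(n - b + 1)])
    lo_q, hi_q = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    scale = x.std(ddof=1) / np.sqrt(n)
    # Invert the estimated sampling distribution of the studentized mean.
    return xbar - hi_q * scale, xbar - lo_q * scale

rng = np.random.default_rng(1)
series = rng.standard_t(df=2.5, size=2000)   # heavy-tailed example data
print(subsampling_ci(series, b=50))
```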


2017, Vol 13 (1). Author(s): Rajeshwari Sundaram, Ling Ma, Subhashis Ghoshal

Abstract Recurrent events are often encountered in medical follow-up studies. In addition, such recurrences have other quantities associated with them that are of considerable interest, for instance the medical costs of repeated hospitalizations and tumor size in cancer recurrences. These processes can be viewed as point processes, i.e., processes with an arbitrary positive jump at each recurrence. An analysis of the mean function for such point processes has been proposed in the literature. However, such point processes are often skewed, making the median a more appropriate measure than the mean. Furthermore, the analysis of recurrent event data is often complicated by the presence of death. We propose a semiparametric model for assessing the effect of covariates on the quantiles of the point processes. We investigate both the finite-sample and the large-sample properties of the proposed estimators. We conclude with a real data analysis of the medical cost associated with the treatment of ovarian cancer.


2020, Vol 2020, pp. 1-12. Author(s): Hanji He, Guangming Deng

We extend mean empirical likelihood inference for the response mean to data missing at random. The empirical likelihood ratio confidence regions perform poorly when the response is missing at random, especially when the covariate is high-dimensional and the sample size is small. Hence, we develop three bias-corrected mean empirical likelihood approaches to obtain efficient inference for the response mean. From the three bias-corrected estimating equations, we obtain a new set of equations by constructing a pairwise-mean dataset. The method increases the effective sample size for estimation and reduces the impact of the curse of dimensionality. Consistency and asymptotic normality of the maximum mean empirical likelihood estimators are established. The finite-sample performance of the proposed estimators is assessed through simulation, and an application to the Boston Housing dataset is presented.
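A minimal sketch of what a pairwise-mean dataset could look like: every pair of observed responses is averaged and appended to the sample. Whether the original points are retained and how the augmented set enters the three estimating equations are assumptions made purely for illustration.

```python
import numpy as np
from itertools import combinations

def pairwise_mean_dataset(y):
    """Augment a sample with all pairwise means (y_i + y_j) / 2.
    The augmented set is much larger than the original sample, which can
    stabilise estimating equations when the original sample is small."""
    y = np.asarray(y, dtype=float)
    pairs = np.array([(y[i] + y[j]) / 2.0 for i, j in combinations(range(len(y)), 2)])
    return np.concatenate([y, pairs])

y_obs = np.array([2.1, 3.4, 1.9, 2.8, 3.0])
augmented = pairwise_mean_dataset(y_obs)
print(len(y_obs), "->", len(augmented), "points; mean", augmented.mean())
```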


Econometrics, 2021, Vol 9 (1), pp. 10. Author(s): Šárka Hudecová, Marie Hušková, Simos G. Meintanis

This article considers goodness-of-fit tests for bivariate INAR and bivariate Poisson autoregression models. The test statistics are based on an L2-type distance between two estimators of the probability generating function of the observations: one entirely nonparametric, and the other semiparametric, computed under the corresponding null hypothesis. The asymptotic distribution of the proposed test statistics is derived both under the null hypotheses and under alternatives, and consistency is proved. The case of testing bivariate generalized Poisson autoregression and the extension of the methods to dimensions higher than two are also discussed. The finite-sample performance of a parametric bootstrap version of the tests is illustrated via a series of Monte Carlo experiments. The article concludes with applications to real data sets and a discussion.
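A rough sketch of an L2-type PGF distance for bivariate counts: the empirical joint probability generating function is compared with a candidate model PGF on a grid over [0,1]^2. The grid approximation, the unweighted integral, and the independent-Poisson null used in the example are simplifying assumptions, not the article's exact statistic.

```python
import numpy as np

def empirical_pgf(counts, u, v):
    """Nonparametric joint PGF estimate of a bivariate count sample at (u, v)."""
    x, y = counts[:, 0], counts[:, 1]
    return np.mean(u ** x * v ** y)

def l2_pgf_statistic(counts, model_pgf, grid_size=20):
    """L2-type distance between the empirical PGF and a model PGF on [0,1]^2,
    approximated on a regular grid and scaled by the sample size."""
    n = counts.shape[0]
    grid = np.linspace(0.0, 1.0, grid_size)
    diff2 = 0.0
    for u in grid:
        for v in grid:
            diff2 += (empirical_pgf(counts, u, v) - model_pgf(u, v)) ** 2
    return n * diff2 / grid_size ** 2      # Riemann approximation of the integral

# Example: dependent counts tested against an (incorrect) null of independent margins.
rng = np.random.default_rng(2)
z = rng.poisson(1.0, size=500)                      # common shock induces dependence
sample = np.column_stack([rng.poisson(1.0, 500) + z, rng.poisson(1.0, 500) + z])
null_pgf = lambda u, v: np.exp(2.0 * (u - 1.0)) * np.exp(2.0 * (v - 1.0))  # independent Poisson(2) margins
print("L2 statistic:", l2_pgf_statistic(sample, null_pgf))
```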


Biometrika, 2020. Author(s): Zhenhua Lin, Jane-Ling Wang, Qixian Zhong

Summary Estimation of mean and covariance functions is fundamental for functional data analysis. While this topic has been studied extensively in the literature, a key assumption is that there are enough data in the domain of interest to estimate both the mean and the covariance functions. In this paper, we investigate mean and covariance estimation for functional snippets, in which observations from a subject are available only in an interval whose length is strictly (and often much) shorter than that of the whole interval of interest. For such a sampling plan, no data are available for directly estimating the off-diagonal region of the covariance function. We tackle this challenge via a basis representation of the covariance function. The proposed estimator enjoys a convergence rate that is adaptive to the smoothness of the underlying covariance function and has superior finite-sample performance in simulation studies.
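A basis-representation sketch under strong simplifications: the covariance surface is expanded in a tensor product of Legendre polynomials and fitted by least squares to raw covariances observed only near the diagonal, then evaluated off the diagonal. The basis choice, the plain least-squares fit, and the toy snippet design are assumptions for illustration, not the estimator of the paper.

```python
import numpy as np
from numpy.polynomial.legendre import legvander

def fit_covariance_basis(pairs, K=4):
    """Fit C(s, t) = sum_{k,l} B[k,l] * P_k(s) * P_l(t) by least squares.

    `pairs` holds rows (s, t, raw_cov), with raw covariances available only for
    (s, t) lying inside the same snippet (near the diagonal). The fitted basis
    expansion can then be evaluated off the diagonal as well."""
    s, t, c = pairs[:, 0], pairs[:, 1], pairs[:, 2]
    Ps, Pt = legvander(2 * s - 1, K - 1), legvander(2 * t - 1, K - 1)  # Legendre basis on [0, 1]
    design = np.einsum('ik,il->ikl', Ps, Pt).reshape(len(s), K * K)
    coef, *_ = np.linalg.lstsq(design, c, rcond=None)
    B = coef.reshape(K, K)
    B = (B + B.T) / 2                                                  # enforce symmetry of C
    def C(s0, t0):
        ps = legvander(np.atleast_1d(2 * s0 - 1), K - 1)
        pt = legvander(np.atleast_1d(2 * t0 - 1), K - 1)
        return ps @ B @ pt.T
    return C

# Toy snippets: true covariance C(s, t) = (1 + s*t)^2, observed only for |s - t| <= 0.2.
rng = np.random.default_rng(3)
s = rng.uniform(0, 1, 4000)
t = np.clip(s + rng.uniform(-0.2, 0.2, 4000), 0, 1)
raw = (1 + s * t) ** 2 + rng.normal(scale=0.05, size=4000)
C_hat = fit_covariance_basis(np.column_stack([s, t, raw]))
print("extrapolated C(0.1, 0.9):", C_hat(0.1, 0.9)[0, 0], "  true value:", (1 + 0.1 * 0.9) ** 2)
```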


METRON, 2021. Author(s): Giovanni Saraceno, Claudio Agostinelli, Luca Greco

Abstract A weighted likelihood technique for robust estimation of multivariate wrapped distributions of data points scattered on a p-dimensional torus is proposed. The occurrence of outliers in the sample at hand can badly compromise inference based on standard techniques such as the maximum likelihood method. There is therefore a need to handle such model inadequacies in the fitting process through a robust technique that effectively downweights observations not following the assumed model. Furthermore, the use of a robust method can help in the presence of hidden and unexpected substructures in the data. Here, it is suggested to build a set of data-dependent weights based on the Pearson residuals and to solve the corresponding weighted likelihood estimating equations. In particular, robust estimation is carried out using a Classification EM algorithm whose M-step is enhanced by the computation of weights based on the current parameter values. The finite-sample behavior of the proposed method is investigated through a Monte Carlo numerical study and real data examples.
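A one-dimensional Gaussian illustration (an assumption; the paper works with multivariate wrapped distributions on the torus) of weights built from Pearson residuals: a kernel density estimate is compared with the current model density, and observations poorly explained by the model receive small weights. The Hellinger-type residual adjustment function is one common choice, not necessarily the one used by the authors.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

def pearson_residual_weights(x, mu, sigma):
    """Data-dependent weights from Pearson residuals for a Gaussian working model.

    delta(x) = f_hat(x) / m_theta(x) - 1, with f_hat a kernel density estimate
    and m_theta the current model density; observations with large residuals
    (poorly explained by the model) receive weights well below 1."""
    f_hat = gaussian_kde(x)(x)
    m_theta = norm.pdf(x, loc=mu, scale=sigma)
    delta = f_hat / m_theta - 1.0
    A = 2.0 * (np.sqrt(delta + 1.0) - 1.0)        # a Hellinger-type residual adjustment function
    w = (A + 1.0) / (delta + 1.0)
    return np.clip(w, 0.0, 1.0)

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(0, 1, 190), rng.normal(8, 0.2, 10)])   # 5% outliers at 8
w = pearson_residual_weights(x, mu=x.mean(), sigma=x.std(ddof=1))
print("mean weight of the bulk:", w[:190].mean(), " of the outliers:", w[190:].mean())
```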


2019, Vol 70 (1), pp. 26-29. Author(s): Tinevimbo Shiri, Angela Loyse, Lawrence Mwenge, Tao Chen, Shabir Lakhi, ...

Abstract
Background: Mortality from cryptococcal meningitis remains very high in Africa. In the Advancing Cryptococcal Meningitis Treatment for Africa (ACTA) trial, 2 weeks of fluconazole (FLU) plus flucytosine (5FC) was as effective as, and less costly than, 2 weeks of amphotericin-based regimens. However, many African settings treat with FLU monotherapy, and the cost-effectiveness of adding 5FC to FLU is uncertain.
Methods: The effectiveness and costs of FLU+5FC were taken from ACTA, which included a costing analysis at the Zambian site. The effectiveness of FLU was derived from cohorts of consecutively enrolled patients managed, in respects other than drug therapy, in the same way as participants in ACTA. FLU costs were derived from the costs of FLU+5FC in ACTA by subtracting 5FC drug and monitoring costs. The cost-effectiveness of FLU+5FC vs FLU alone was measured as the incremental cost-effectiveness ratio (ICER). A probabilistic sensitivity analysis assessed uncertainties, and a bivariate deterministic sensitivity analysis examined the impact of varying mortality and 5FC drug costs on the ICER.
Results: The mean costs per patient were US $847 (95% confidence interval [CI] $776–927) for FLU+5FC and US $628 (95% CI $557–709) for FLU. The 10-week mortality rate was 35.1% (95% CI 28.9–41.7%) with FLU+5FC and 53.8% (95% CI 43.1–64.1%) with FLU. At the current 5FC price of US $1.30 per 500 mg tablet, the ICER of 5FC+FLU versus FLU alone was US $65 (95% CI $28–208) per life-year saved. Reducing the 5FC cost to between US $0.80 and US $0.40 per 500 mg resulted in an ICER between US $44 and US $28 per life-year saved.
Conclusions: The addition of 5FC to FLU is cost-effective for cryptococcal meningitis treatment in Africa and, if made available widely, could substantially reduce mortality rates among human immunodeficiency virus–infected persons in Africa.
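The reported ICER follows from simple incremental arithmetic; the short computation below back-calculates the implied incremental life-years from the published point estimates, purely for illustration (the implied figure is not reported in the trial).

```python
# ICER = (cost_A - cost_B) / (effect_A - effect_B)
cost_flu_5fc, cost_flu = 847.0, 628.0      # mean cost per patient, US$
icer_reported = 65.0                        # US$ per life-year saved (reported point estimate)

delta_cost = cost_flu_5fc - cost_flu        # = $219 extra per patient
delta_ly = delta_cost / icer_reported       # implied incremental life-years per patient
print(f"incremental cost: ${delta_cost:.0f}; implied incremental life-years: {delta_ly:.2f}")
# Lowering the 5FC tablet price shrinks delta_cost, which is why the reported ICER
# falls to about $44 and $28 per life-year saved at $0.80 and $0.40 per 500 mg tablet.
```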


Biometrika, 2020. Author(s): S Na, M Kolar, O Koyejo

Abstract Differential graphical models are designed to represent the difference between the conditional dependence structures of two groups and are thus of particular interest for scientific investigation. Motivated by modern applications, this manuscript considers an extended setting in which each group is generated by a latent variable Gaussian graphical model. Due to the existence of latent factors, the differential network is decomposed into sparse and low-rank components, both of which are symmetric indefinite matrices. We estimate these two components simultaneously using a two-stage procedure: (i) an initialization stage, which computes a simple, consistent estimator, and (ii) a convergence stage, implemented using a projected alternating gradient descent algorithm applied to a nonconvex objective and initialized with the output of the first stage. We prove that, given this initialization, the estimator converges linearly up to a nontrivial statistical error that is minimax optimal. Experiments on synthetic and real data illustrate that the proposed nonconvex procedure outperforms existing methods.
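A generic sketch, in a much simplified setting, of a sparse-plus-low-rank decomposition via projected alternating gradient descent: a pilot differential matrix `D` is split into a hard-thresholded sparse part and an SVD-truncated low-rank part. The objective, step size, projections, and toy data are all illustrative assumptions rather than the paper's algorithm, and a heuristic of this style need not separate the two parts exactly.

```python
import numpy as np

def sparse_lowrank_decompose(D, rank=2, sparsity=0.05, step=0.5, iters=300):
    """Alternating projected gradient descent for D ~ S + L, with S sparse
    (k largest entries kept) and L low-rank (truncated SVD)."""
    p = D.shape[0]
    S, L = np.zeros((p, p)), np.zeros((p, p))
    k = int(sparsity * p * p)                      # number of nonzeros kept in S
    for _ in range(iters):
        R = S + L - D                              # gradient of 0.5 * ||S + L - D||_F^2
        # Gradient step, then projection onto the sparse set.
        S_new = S - step * R
        thresh = np.partition(np.abs(S_new).ravel(), -k)[-k]
        S = np.where(np.abs(S_new) >= thresh, S_new, 0.0)
        # Gradient step, then projection onto the rank-constrained set.
        U, sig, Vt = np.linalg.svd(L - step * R, full_matrices=False)
        L = (U[:, :rank] * sig[:rank]) @ Vt[:rank]
    return S, L

# Toy differential matrix: two planted sparse entries plus a rank-1 latent component.
rng = np.random.default_rng(5)
p = 30
S_true = np.zeros((p, p)); S_true[2, 7] = S_true[7, 2] = 1.5
u = rng.normal(size=(p, 1)); L_true = 0.5 * u @ u.T
D = S_true + L_true
S_hat, L_hat = sparse_lowrank_decompose(D, rank=1)
print("fit residual:", np.linalg.norm(S_hat + L_hat - D))
print("S_hat[2, 7] (planted value 1.5):", round(S_hat[2, 7], 2))
```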


Symmetry, 2021, Vol 13 (6), pp. 936. Author(s): Dan Wang

In this paper, a ratio test based on a bootstrap approximation is proposed to detect persistence change in heavy-tailed observations. The paper focuses on the two symmetric testing problems of I(1)-to-I(0) and I(0)-to-I(1) changes. The test statistic is constructed in ratio form from residual CUSUMs. I derive the null distribution of the test statistic, and its consistency under the alternative hypothesis is also discussed. However, the null distribution contains an unknown tail index. To address this challenge, I present a bootstrap approximation method for determining the rejection region of the test. Simulation studies on artificial data assess the finite-sample performance and show that the proposed method outperforms the kernel method in all listed cases. An analysis of real data also demonstrates the good performance of the method.
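A generic residual-CUSUM ratio statistic in the spirit of such tests, maximized over candidate change points; the normalization, trimming, and the absence of bootstrap calibration are simplifications made for illustration.

```python
import numpy as np

def cusum_sq_sum(x):
    """Sum of squared cumulative sums of the demeaned segment, normalised by length."""
    e = x - x.mean()                       # residuals from the segment mean
    c = np.cumsum(e)
    return np.sum(c ** 2) / len(x) ** 2

def ratio_statistic(y, trim=0.2):
    """Ratio of backward to forward residual-CUSUM measures, maximised over
    candidate change points; large values point to an I(0)-to-I(1) change."""
    n = len(y)
    taus = range(int(trim * n), int((1 - trim) * n))
    ratios = [cusum_sq_sum(y[t:]) / cusum_sq_sum(y[:t]) for t in taus]
    return max(ratios)

rng = np.random.default_rng(6)
y0 = rng.standard_t(df=3, size=150)                    # I(0), heavy-tailed
y1 = np.cumsum(rng.standard_t(df=3, size=150))         # I(1) after the change
print("ratio statistic on an I(0)-to-I(1) series:", ratio_statistic(np.concatenate([y0, y1])))
```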


Geophysics, 2019, Vol 84 (2), pp. R165-R174. Author(s): Marcelo Jorge Luz Mesquita, João Carlos Ribeiro Cruz, German Garabito Callapino

Estimation of an accurate velocity macromodel is an important step in seismic imaging. We have developed an approach based on coherence measurements and finite-offset (FO) beam stacking. The algorithm is an FO common-reflection-surface tomography, which aims to determine the best layered depth-velocity model by finding the model that maximizes a semblance objective function calculated from the amplitudes in common-midpoint (CMP) gathers stacked over a predetermined aperture. We describe the subsurface velocity model as a stack of layers separated by smooth interfaces. The algorithm is applied layer by layer from the top downward in four steps per layer. First, by automatic or manual picking, we estimate the reflection times of the events that describe the interfaces in a time-migrated section. Second, we convert these times to depth in the current velocity model by applying Dix’s formula and image rays to the events. Third, using ray tracing, we calculate kinematic parameters along the central ray and build a paraxial FO traveltime approximation for the FO common-reflection-surface method. Finally, starting from the CMP gathers, we calculate the semblance of the selected events using this paraxial traveltime approximation. After repeating these steps for all selected CMP gathers, we use the mean semblance value as the objective function for the target layer. When this coherence measure is maximized, the model is accepted and the process is complete; otherwise, the process restarts from the second step with the updated velocity model. Because the inverse problem we are solving is nonlinear, we use very fast simulated annealing to search for the velocity parameters in the target layers. We test the method on synthetic and real data sets to study its use and advantages.
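A minimal semblance computation for a single CMP gather along a predicted traveltime curve; the window length, the hyperbolic toy moveout, and all parameter values are illustrative assumptions, and the objective described above further averages such semblance values over many gathers and picked events.

```python
import numpy as np

def semblance(gather, times, dt, window=5):
    """Semblance coherence of a CMP gather along predicted traveltimes.

    gather : 2-D array (n_samples, n_traces) of amplitudes
    times  : predicted traveltime (seconds) of the event on each trace
    dt     : sample interval (seconds)"""
    n_samples, n_traces = gather.shape
    idx = np.clip((times / dt).astype(int), window, n_samples - window - 1)
    # Collect a small time window around the predicted arrival on every trace.
    win = np.stack([gather[i - window:i + window + 1, j] for j, i in enumerate(idx)], axis=1)
    num = np.sum(np.sum(win, axis=1) ** 2)          # energy of the stacked trace
    den = n_traces * np.sum(win ** 2)               # total energy of the individual traces
    return num / den if den > 0 else 0.0

# Toy gather: a coherent event along a hyperbolic moveout plus random noise.
rng = np.random.default_rng(7)
dt, n_traces = 0.004, 24
offsets = np.linspace(100, 1150, n_traces)
t_event = np.sqrt(0.5 ** 2 + (offsets / 2000.0) ** 2)          # t0 = 0.5 s, v = 2000 m/s
gather = rng.normal(scale=0.1, size=(500, n_traces))
for j, t in enumerate(t_event):
    gather[int(t / dt), j] += 1.0
print("semblance along the correct moveout:", semblance(gather, t_event, dt))
```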

