Testing the Proportionality Assumption for Specified Covariates in the Cox Model

Author(s):  
HAFDI Mohamed Ali

In this paper, I propose a test of the proportional hazards assumption for specified covariates. The test is based on a general alternative, in the sense that under different covariate values the ratio of hazard rates is not necessarily constant, as in the Cox model, but may cross, diverge, or be monotonic in time. The limiting distribution of the test statistic is derived. Finite-sample properties of the test's power are analyzed by simulation. Applications of the proposed test to real data examples are considered.
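The kind of alternative the abstract describes is easy to exhibit: two Weibull groups with different shape parameters have a hazard ratio that varies and crosses 1, so no constant-ratio (Cox) model fits. A minimal numerical sketch, not the paper's test:

```python
import numpy as np

# Weibull hazards with different shapes: the hazard ratio between the two
# groups is not constant in time and crosses 1, violating proportionality.
def weibull_hazard(t, shape, scale=1.0):
    """h(t) = (shape/scale) * (t/scale)**(shape - 1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

t = np.linspace(0.1, 3.0, 50)
hr = weibull_hazard(t, shape=2.0) / weibull_hazard(t, shape=0.5)  # equals 4 * t**1.5

print(hr[0] < 1.0, hr[-1] > 1.0)  # True True: the ratio crosses 1 over time
```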

Author(s):  
Constantin Ruhe

In many applications of the Cox model, the proportional-hazards assumption is implausible. In these cases, the solution to nonproportional hazards usually consists of modeling the effect of the variable of interest and its interaction effect with some function of time. Although Stata provides a command to implement this interaction in stcox, it does not allow the typical visualizations using stcurve if stcox was estimated with the tvc() option. In this article, I provide a short workaround that estimates the survival function after stcox with time-dependent coefficients. I introduce and describe the scurve_tvc command, which automates this procedure and allows users to easily visualize survival functions for models with time-varying effects.
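For readers outside Stata, the survival curve implied by a time-varying coefficient can be reconstructed directly from its definition, S(t|x) = exp(−∫₀ᵗ h₀(u) exp{b(u)·x} du). The sketch below uses a hypothetical constant baseline hazard and b(t) = b0 + b1·ln(t), the form produced by stcox with tvc() and texp(ln(_t)); it is an illustration of the computation, not the scurve_tvc command:

```python
import numpy as np

# Hypothetical values (assumptions, not estimates from any model):
h0, b0, b1 = 0.1, 0.8, -0.2   # baseline hazard; b(t) = b0 + b1*ln(t)

t = np.linspace(1e-6, 10.0, 2001)

def survival(x):
    """S(t|x) = exp(-cumulative hazard), via trapezoidal integration."""
    hazard = h0 * np.exp((b0 + b1 * np.log(t)) * x)
    steps = 0.5 * (hazard[1:] + hazard[:-1]) * np.diff(t)
    cumhaz = np.concatenate([[0.0], np.cumsum(steps)])
    return np.exp(-cumhaz)

S0, S1 = survival(0), survival(1)   # curves to plot against t
```

Plotting S0 and S1 against t reproduces the kind of visualization stcurve would give in the proportional-hazards case.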


2008 ◽  
Vol 56 (7) ◽  
pp. 954-957 ◽  
Author(s):  
Jeanette M. Tetrault ◽  
Maor Sauler ◽  
Carolyn K. Wells ◽  
John Concato

Background: Multivariable models are frequently used in the medical literature, but many clinicians have limited training in these analytic methods. Our objective was to assess the prevalence of multivariable methods in the medical literature, quantify reporting of methodological criteria applicable to most methods, and determine whether assumptions specific to logistic regression or proportional hazards analysis were evaluated. Methods: We examined all original articles in Annals of Internal Medicine, British Medical Journal, Journal of the American Medical Association, Lancet, and New England Journal of Medicine from January through June 2006. Articles reporting multivariable methods underwent a comprehensive review; reporting of methodological criteria was based on each article's primary analysis. Results: Among 452 articles, 272 (60%) used multivariable analysis; logistic regression (89 [33%] of 272) and proportional hazards (76 [28%] of 272) were most prominent. Reporting of methodological criteria, when applicable, ranged from 5% (12/265) for assessing influential observations to 84% (222/265) for description of variable coding. Discussion of interpreting odds ratios occurred in 13% (12/89) of articles reporting logistic regression as the primary method, and discussion of the proportional hazards assumption occurred in 21% (16/76) of articles using Cox proportional hazards as the primary method. Conclusions: More complete reporting of multivariable analysis in the medical literature can improve understanding, interpretation, and perhaps application of these methods.


Biometrika ◽  
2021 ◽  
Author(s):  
Juhyun Park ◽  
Jeongyoun Ahn ◽  
Yongho Jeon

Abstract Functional linear discriminant analysis offers a simple yet efficient method for classification, with the possibility of achieving a perfect classification. Several methods are proposed in the literature that mostly address the dimensionality of the problem. On the other hand, there is a growing interest in interpretability of the analysis, which favors a simple and sparse solution. In this work, we propose a new approach that incorporates a type of sparsity that identifies nonzero sub-domains in the functional setting, offering a solution that is easier to interpret without compromising performance. With the need to embed additional constraints in the solution, we reformulate the functional linear discriminant analysis as a regularization problem with an appropriate penalty. Inspired by the success of ℓ1-type regularization at inducing zero coefficients for scalar variables, we develop a new regularization method for functional linear discriminant analysis that incorporates an ℓ1-type penalty, ∫|f|, to induce zero regions. We demonstrate that our formulation has a well-defined solution that contains zero regions, achieving a functional sparsity in the sense of domain selection. In addition, the misclassification probability of the regularized solution is shown to converge to the Bayes error if the data are Gaussian. Our method does not presume that the underlying function has zero regions in the domain, but produces a sparse estimator that consistently estimates the true function whether or not the latter is sparse. Numerical comparisons with existing methods demonstrate this property in finite samples with both simulated and real data examples.


Symmetry ◽  
2021 ◽  
Vol 13 (6) ◽  
pp. 936
Author(s):  
Dan Wang

In this paper, a ratio test based on a bootstrap approximation is proposed to detect persistence change in heavy-tailed observations. The paper treats the symmetric testing problems of a change from I(1) to I(0) and from I(0) to I(1). On the basis of the residual CUSUM, the test statistic is constructed in ratio form. I derive the null distribution of the test statistic and discuss its consistency under the alternative hypothesis. However, the null distribution contains an unknown tail index. To address this challenge, I present a bootstrap approximation for determining the rejection region of the test. Simulation studies on artificial data assess the finite-sample performance and show that the method outperforms the kernel method in all listed cases. Analysis of real data also demonstrates the strong performance of the method.
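The intuition behind ratio-of-CUSUM statistics is that partial sums of a demeaned I(1) segment grow much faster than those of an I(0) segment, so the ratio explodes under a persistence change. A generic sketch of that diagnostic (the paper's exact statistic, tail-index handling, and bootstrap are not reproduced here):

```python
import numpy as np

def ratio_statistic(y, frac=0.5):
    """Ratio of mean-squared CUSUMs of the demeaned subsamples after vs. before
    a candidate change point (a generic diagnostic, not the paper's statistic)."""
    k = int(len(y) * frac)
    pre, post = y[:k], y[k:]
    c_pre = np.cumsum(pre - pre.mean())
    c_post = np.cumsum(post - post.mean())
    return (c_post ** 2).mean() / (c_pre ** 2).mean()

rng = np.random.default_rng(0)
n = 400
stationary = rng.normal(size=n)                          # I(0) throughout
change = np.concatenate([rng.normal(size=n // 2),        # I(0), then
                         np.cumsum(rng.normal(size=n // 2))])  # I(1)

print(ratio_statistic(change) > ratio_statistic(stationary))  # change inflates the ratio
```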


Author(s):  
Lingtao Kong

The exponential distribution has been widely used in the engineering, social, and biological sciences. In this paper, we propose a new goodness-of-fit test for fuzzy exponentiality using the α-pessimistic value. The test statistic is constructed from Kullback–Leibler information. Using the Monte Carlo method, we obtain empirical critical points of the test statistic at four significance levels. To evaluate the performance of the proposed test, we compare it with four commonly used tests through simulations. The experiments show that the proposed test has higher power than the other tests in most cases; in particular, for the uniform and linear-failure-rate alternatives, our method performs best. A real data example illustrates the application of the test.
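The Monte Carlo step for obtaining empirical critical points is generic and easy to reproduce. The sketch below uses a Kolmogorov–Smirnov-type distance to a mean-fitted exponential as a stand-in statistic; the paper's statistic is Kullback–Leibler based, and its fuzzy α-pessimistic machinery is omitted:

```python
import numpy as np

def ks_exp_stat(x):
    """KS distance between a sample and the exponential fitted by its mean
    (a stand-in for illustration; the paper's statistic is KL-based)."""
    x = np.sort(x)
    n = len(x)
    cdf = 1.0 - np.exp(-x / x.mean())
    grid = np.arange(1, n + 1) / n
    return np.max(np.maximum(grid - cdf, cdf - (grid - 1.0 / n)))

rng = np.random.default_rng(1)
n, reps = 50, 2000
null_stats = np.array([ks_exp_stat(rng.exponential(size=n)) for _ in range(reps)])
for alpha in (0.10, 0.05, 0.02, 0.01):   # empirical critical points at four levels
    print(alpha, round(np.quantile(null_stats, 1.0 - alpha), 3))
```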


2018 ◽  
Vol 5 (suppl_1) ◽  
pp. S426-S426
Author(s):  
Christopher M Rubino ◽  
Lukas Stulik ◽  
Harald Rouha ◽  
Zehra Visram ◽  
Adriana Badarau ◽  
...  

Abstract Background ASN100 is a combination of two co-administered fully human monoclonal antibodies (mAbs), ASN-1 and ASN-2, that together neutralize the six cytotoxins critical to S. aureus pneumonia pathogenesis. ASN100 is in development for prevention of S. aureus pneumonia in mechanically ventilated patients. A pharmacometric approach to dose discrimination in humans was taken in order to bridge from dose-ranging, survival studies in rabbits to anticipated human exposures using an mPBPK model derived from data from rabbits (infected and noninfected) and noninfected humans [IDWeek 2017, Poster 1849]. Survival in rabbits was assumed to be indicative of a protective effect through ASN100 neutralization of S. aureus toxins. Methods Data from studies in rabbits (placebo through 20 mg/kg single doses of ASN100, four strains representing MRSA and MSSA isolates with different toxin profiles) were pooled with data from a PK and efficacy study in infected rabbits (placebo and 40 mg/kg ASN100) [IDWeek 2017, Poster 1844]. A Cox proportional hazards model was used to relate survival to both strain and mAb exposure. Monte Carlo simulation was then applied to generate ASN100 exposures for simulated patients given a range of ASN100 doses and infection with each strain (n = 500 per scenario) using an mPBPK model. Using the Cox model, the probability of full protection from toxins (i.e., predicted survival) was estimated for each simulated patient. Results Cox models showed that survival in rabbits is dependent on both strain and ASN100 exposure in lung epithelial lining fluid (ELF). At human doses simulated (360–10,000 mg of ASN100), full or substantial protection is expected for all four strains tested. 
For the most virulent strain tested in the rabbit pneumonia study (a PVL-negative MSSA, Figure 1), the clinical dose of 3,600 mg of ASN100 provides substantially higher predicted effect relative to lower doses, while doses above 3,600 mg are not predicted to provide significant additional protection. Conclusion A pharmacometric approach allowed for the translation of rabbit survival data to infected patients as well as discrimination of potential clinical doses. These results support the ASN100 dose of 3,600 mg currently being evaluated in a Phase 2 S. aureus pneumonia prevention trial. Disclosures C. M. Rubino, Arsanis, Inc.: Research Contractor, Research support. L. Stulik, Arsanis Biosciences GmbH: Employee, Salary. H. Rouha, Arsanis Biosciences GmbH: Employee, Salary. Z. Visram, Arsanis Biosciences GmbH: Employee, Salary. A. Badarau, Arsanis Biosciences GmbH: Employee, Salary. S. A. Van Wart, Arsanis, Inc.: Research Contractor, Research support. P. G. Ambrose, Arsanis, Inc.: Research Contractor, Research support. M. M. Goodwin, Arsanis, Inc.: Employee, Salary. E. Nagy, Arsanis Biosciences GmbH: Employee, Salary.
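The exposure–response logic behind the dose-discrimination argument can be illustrated with a Cox-type prediction, S(t|x) = S0(t)^exp(βx): with a protective (negative) coefficient on log exposure, predicted survival rises with dose but with diminishing returns. All numbers below are hypothetical, not estimates from the study:

```python
import numpy as np

# Illustrative only: hypothetical baseline survival and a hypothetical
# hazard ratio per unit log-exposure; not the study's fitted model.
def predicted_survival(S0, exposure, beta=-0.9, ref=1.0):
    """Cox-type prediction S(t|x) = S0(t) ** exp(beta * log(exposure/ref))."""
    return S0 ** np.exp(beta * np.log(exposure / ref))

S0 = 0.40                                   # baseline survival at end of study (assumed)
for dose_scale in (0.5, 1.0, 2.0, 4.0):     # exposure as multiples of a reference dose
    print(dose_scale, round(predicted_survival(S0, dose_scale), 3))
```

The printed curve rises with dose but flattens at the top, the same qualitative shape as "doses above 3,600 mg are not predicted to provide significant additional protection."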


2016 ◽  
Vol 37 (1) ◽  
Author(s):  
Hannelore Liero

A goodness-of-fit test for the acceleration function in a nonparametric lifetime model is proposed. To this end, the limiting distribution of an L2-type test statistic is derived. Furthermore, a bootstrap method is considered and the power of the test is studied.
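The bootstrap calibration used by such L2-type tests follows a generic pattern: approximate the null distribution of the statistic by resampling data recentred at the hypothesized model, then reject when the observed statistic exceeds the bootstrap quantile. A toy sketch of that pattern (the paper's accelerated-lifetime setting is replaced by a simple mean-curve null, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

def l2_statistic(y, mu0):
    """L2-type distance between the sample mean curve and a hypothesized curve."""
    return np.mean((y.mean(axis=0) - mu0) ** 2)

def bootstrap_critical_value(y, mu0, alpha=0.05, B=500):
    """Resample curves recentred at mu0 to mimic the statistic's null distribution."""
    centred = y - y.mean(axis=0) + mu0
    stats = [l2_statistic(centred[rng.integers(0, len(y), len(y))], mu0)
             for _ in range(B)]
    return np.quantile(stats, 1.0 - alpha)

t = np.linspace(0.0, 1.0, 50)
mu0 = np.zeros_like(t)
y = rng.normal(size=(40, 50))              # 40 noisy curves consistent with mu0
crit = bootstrap_critical_value(y, mu0)
print(l2_statistic(y + 0.5 * t, mu0) > crit)   # shifted curves exceed the cutoff
```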

