Partitioned and Non-Partitioned Regularized Additive Hazard Models with and Without Spatial Dependence

2021 ◽  
Vol 3 (2) ◽  
pp. 1-19
Author(s):  
Peter Enesi Omaku ◽  
Benjamin Agboola Oyejola

Spatial effects are often investigated simultaneously with non-linear effects of continuous covariates and the usual linear effects. In this work, the performance of models with and without spatial dependence in partitioned (PM) and non-partitioned (NPM) form was investigated for four censoring percentages, three levels of Weibull baseline variance (WBV), and sample sizes of 100, 500 and 1000. Hazard models were adapted to generalized additive predictors, and analyses were carried out via MCMC simulation. The performance of the models was further assessed by fitting them to a diabetic data set. Results suggest that partitioned models outperformed non-partitioned ones; models with spatial dependence perform better than models without it for denser event times and low WBVs; and the partitioned models benefit more from spatial dependence than the non-partitioned models. For the diabetic data set, the covariates Age and Blood Sugar Level (BSL) violated the proportional hazards assumption upon testing. Further assessment from the graph of coefficients against time suggested that Age be split at cut-points, while BSL, whose graph showed only a slight deviation from proportionality, was estimated in models with and without penalized splines for the sake of comparison. Hazard rates for the time-varying Age effect indicate that, as the study progresses, the hazard of death from the disease increases steadily between intervals but remains constant within each interval. A unit change in BSL decreases the hazard rate under the PM, both with and without penalized splines; the model without penalized splines was, however, preferred on the basis of its smaller DIC (Deviance Information Criterion) value. Marriage was found to be significant in the management of the disease in comparison with single patients.
In addition, patients are advised to visit their physicians regularly for routine checks to keep their BSL within a good range. The study provides a means of moving beyond non-linear ruts in survival data analysis. Partitioning into intervals increases the effective sample size (pseudo-observations), which in turn improves the modified partitioned model, with or without spatial dependence.
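The interval mechanism described above, where partitioning follow-up time turns each subject into several pseudo-observations, can be sketched in Python (an illustrative sketch, not the authors' implementation; the cut-points and values are hypothetical):

```python
def partition_survival(time, event, cuts):
    """Expand one subject's (time, event) record into per-interval
    pseudo-observations: (start, stop, status) triples. The subject
    survives every interval before the one containing its exit time."""
    rows = []
    start = 0.0
    for cut in cuts:
        if time <= cut:
            rows.append((start, time, event))  # subject exits in this interval
            return rows
        rows.append((start, cut, 0))           # survives the whole interval
        start = cut
    rows.append((start, time, event))          # exit beyond the last cut-point
    return rows

# A subject who dies at t = 7 with cut-points at 3 and 6 contributes
# three pseudo-observations:
print(partition_survival(7.0, 1, [3.0, 6.0]))
# [(0.0, 3.0, 0), (3.0, 6.0, 0), (6.0, 7.0, 1)]
```

Each interval-level row can then carry its own baseline hazard term, which is what lets the partitioned model relax a single global baseline.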

2019 ◽  
Vol 11 (2) ◽  
pp. 544 ◽  
Author(s):  
Ling Zhang ◽  
He Wang ◽  
Yan Song ◽  
Haizhen Wen

This study investigates the spatial dependence of house prices in the Yangtze Delta Urban Agglomeration since the year 2000. According to Moran's I index and the LISA scatter plot derived from a cross-sectional data set, the spatial dependence of house prices can be traced across the 25 cities in the agglomeration and became more evident after 2005. The study develops a spatial panel model with geographical-distance and economic-distance weight matrices. Spatial effects significantly influenced house prices in both cases, but the intensity of the former was weaker than that of the latter. Income, the proportion of the tertiary industry, and amenities exhibited significant indirect effects on house prices in other cities in the inner region of the agglomeration, while competition for population between economically proximate cities exerted negative indirect effects. Furthermore, urban industrial structure, innovation capability, and urbanization degree revealed differences in spatial dependence among various city groups.
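Moran's I, the global spatial-autocorrelation statistic used above, can be computed directly from a spatial weight matrix. A minimal sketch (the toy values and the binary contiguity matrix are assumptions for illustration):

```python
def morans_i(x, w):
    """Global Moran's I for values x under spatial weight matrix w
    (list of lists). Values above the expected value -1/(n-1)
    indicate positive spatial clustering."""
    n = len(x)
    mean = sum(x) / n
    z = [v - mean for v in x]                       # deviations from the mean
    s0 = sum(sum(row) for row in w)                 # sum of all weights
    num = sum(w[i][j] * z[i] * z[j]
              for i in range(n) for j in range(n))  # cross-products of neighbours
    den = sum(v * v for v in z)
    return (n / s0) * num / den

# Four cities on a line with rook contiguity and smoothly rising values
# show positive spatial autocorrelation:
x = [1, 2, 3, 4]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(morans_i(x, w))  # ≈ 0.333
```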


Author(s):  
ALEXANDRE C. MENDES ◽  
NASSER FARD

This study proposes a modification of binary logistic regression to treat time-dependent covariates in reliability studies. The proportional hazards model (PHM) is well suited to modeling survival data with categorical predictors, as it compares hazards against a reference category. However, time-dependent covariates present a challenge for the analysis: stratification does not produce hazards for the stratified covariate, while creating dummy time-dependent covariates raises the difficulty of selecting the time interval for the interaction, and the resulting coefficients may be hard to interpret. The findings show that logistic regression can provide equal or better results than the PHM in reliability analysis when time-dependent covariates are evaluated. The PHM is potentially preferred for data sets without time-dependent variables, as it requires no data manipulation. Logistic regression ignores the information on the timing of events; this is corrected by breaking each subject's survival history into a set of discrete time intervals that are treated as distinct observations and evaluated as a binary distribution. Recurrent events can be addressed by both methods with proper correction for heterogeneity. The application of the modified logistic regression model to reliability studies is innovative and has ready potential application to step-stress time-dependent accelerated life testing.
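The discrete-time correction described above, breaking each subject's survival history into intervals treated as binary observations, is commonly called person-period expansion. A minimal sketch (the covariate name is hypothetical):

```python
def person_periods(duration, event, covariates):
    """Expand one subject into discrete-time binary records:
    one row per period at risk, with y = 1 only in the period
    where the event occurs (all rows y = 0 for a censored subject).
    The stacked rows are then fit with ordinary logistic regression."""
    rows = []
    for t in range(1, duration + 1):
        y = 1 if (t == duration and event) else 0
        rows.append({"period": t, "y": y, **covariates})
    return rows

# A unit failing in period 3 under a (hypothetical) stress covariate
# contributes three binary observations:
print(person_periods(3, 1, {"stress": 2.5}))
```

Including `period` (or dummies for it) as a predictor recovers the discrete baseline hazard, which is how the timing information lost by plain logistic regression is restored.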


2020 ◽  
pp. 133-158
Author(s):  
K. A. Kholodilin ◽  
Y. I. Yanzhimaeva

A relative uniformity of population distribution on the territory of the country is of importance from socio-economic and strategic perspectives. It is especially important in the case of Russia, with its densely populated West and underpopulated East. This paper considers changes in population density in Russian regions between 1897 and 2017. It explores whether there was convergence in population density and what factors influenced it. For this purpose, it uses data at both the county and regional levels, brought to common borders for comparability. Further, models of unconditional and conditional β-convergence are estimated, taking spatial dependence into account. The paper concludes that population density equalization took place in 1897–2017 at the county level and in 1926–1970 at the regional level. In addition, the increase in population density is shown to be influenced not only by spatial effects but also by political and geographical factors such as climate, the number of GULAG camps, and the distance from the capital city.
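Unconditional β-convergence of the kind estimated above (without the spatial and conditioning terms) reduces to regressing density growth on the initial log density; a negative slope means initially less dense regions grow faster, i.e. densities equalize. A minimal sketch with synthetic numbers:

```python
import math

def beta_convergence(d0, d1, years):
    """Unconditional beta-convergence: OLS of average annual growth of
    population density on its initial log level. Returns (alpha, beta);
    beta < 0 indicates convergence."""
    x = [math.log(v) for v in d0]
    g = [(math.log(b) - math.log(a)) / years for a, b in zip(d0, d1)]
    n = len(x)
    mx, mg = sum(x) / n, sum(g) / n
    beta = (sum((xi - mx) * (gi - mg) for xi, gi in zip(x, g))
            / sum((xi - mx) ** 2 for xi in x))
    alpha = mg - beta * mx
    return alpha, beta

# Synthetic converging data: the sparsest region quadruples while the
# densest grows only 10% over a decade, so beta comes out negative.
print(beta_convergence([1.0, 10.0, 100.0], [4.0, 20.0, 110.0], 10))
```

The paper's conditional and spatial variants add covariates and a spatial lag to this same regression, which plain OLS no longer handles.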


2020 ◽  
Vol 16 (8) ◽  
pp. 1088-1105
Author(s):  
Nafiseh Vahedi ◽  
Majid Mohammadhosseini ◽  
Mehdi Nekoei

Background: The poly(ADP-ribose) polymerases (PARPs) are a nuclear enzyme superfamily present in eukaryotes. Methods: In the present report, several efficient linear and non-linear methods, including multiple linear regression (MLR), support vector machines (SVM) and artificial neural networks (ANN), were successfully used to develop and establish quantitative structure-activity relationship (QSAR) models capable of predicting the pEC50 values of tetrahydropyridopyridazinone derivatives as effective PARP inhibitors. Principal component analysis (PCA) was used for a rational division of the whole data set and selection of the training and test sets. A genetic algorithm (GA) variable selection method was employed to select, from the large pool of calculated descriptors, the optimal subset of descriptors with the most significant contributions to the overall inhibitory activity. Results: The accuracy and predictability of the proposed models were further confirmed using cross-validation, validation through an external test set, and Y-randomization (chance correlation) approaches. Moreover, an exhaustive statistical comparison was performed on the outputs of the proposed models. The results revealed that the non-linear modeling approaches, SVM and ANN, provide much greater predictive capability. Conclusion: Among the constructed models, and in terms of the root mean square error of prediction (RMSEP), cross-validation coefficients (Q2LOO and Q2LGO), and the R2 and F-statistic values for the training set, the predictive power of the GA-SVM approach was better. However, compared with MLR and SVM, the statistical parameters for the test set were better for the GA-ANN model.
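The Y-randomization (chance correlation) check mentioned above can be sketched for a one-descriptor model: scramble the activities, refit, and confirm that the scrambled fits collapse relative to the real one. Illustrative only; the paper's actual models used GA-selected descriptor subsets with SVM/ANN learners:

```python
import random

def r_squared(x, y):
    """R^2 of a simple one-descriptor least-squares fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    alpha = my - beta * mx
    ss_res = sum((b - (alpha + beta * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1 - ss_res / ss_tot

def y_randomization(x, y, trials=100, seed=0):
    """Refit after repeatedly shuffling the response. A model that is
    not a chance correlation should have a real R^2 far above the
    best scrambled R^2."""
    rng = random.Random(seed)
    scrambled = []
    for _ in range(trials):
        ys = y[:]
        rng.shuffle(ys)
        scrambled.append(r_squared(x, ys))
    return r_squared(x, y), max(scrambled)
```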


2003 ◽  
Vol 42 (05) ◽  
pp. 564-571 ◽  
Author(s):  
M. Schumacher ◽  
E. Graf ◽  
T. Gerds

Summary Objectives: A lack of generally applicable tools for assessing predictions for survival data has to be recognized. Prediction error curves based on the Brier score, which have been suggested as a sensible approach, are illustrated by means of a case study. Methods: The concept of predictions made in terms of conditional survival probabilities given the patient's covariates is introduced. Such predictions are derived from various statistical models for survival data, including artificial neural networks. How the prediction error of a prognostic classification scheme can be followed over time is illustrated with data from two studies on the prognosis of node-positive breast cancer patients, one of them serving as an independent test data set. Results and Conclusions: The Brier score as a function of time is shown to be a valuable tool for assessing the predictive performance of prognostic classification schemes for survival data incorporating censored observations. Comparison with the prediction based on the pooled Kaplan–Meier estimator yields a benchmark value for any classification scheme incorporating patients' covariate measurements. The problem of overoptimistic assessment of prediction error caused by data-driven modelling, as is done, for example, with artificial neural nets, can be circumvented by assessment in an independent test data set.
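The time-dependent Brier score at a fixed horizon can be sketched as below. This simplified version ignores censoring, whereas the estimator described above reweights censored observations (inverse probability of censoring weighting); the toy numbers are hypothetical:

```python
def brier_score(t, surv_pred, time):
    """Time-dependent Brier score at horizon t for uncensored data:
    mean squared difference between the observed at-risk status at t
    and the predicted survival probability P(T_i > t)."""
    alive = [1.0 if ti > t else 0.0 for ti in time]
    return sum((a - p) ** 2 for a, p in zip(alive, surv_pred)) / len(time)

# A pooled, covariate-free prediction of 0.5 for everyone, with half the
# patients still alive at t = 5, scores the benchmark value 0.25:
print(brier_score(5.0, [0.5, 0.5, 0.5, 0.5], [2.0, 3.0, 8.0, 9.0]))  # 0.25
```

Evaluating this at a grid of horizons traces out the prediction error curve; any useful covariate-based scheme should fall below the pooled Kaplan–Meier benchmark.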


Fluids ◽  
2018 ◽  
Vol 3 (3) ◽  
pp. 63 ◽  
Author(s):  
Thomas Meunier ◽  
Claire Ménesguen ◽  
Xavier Carton ◽  
Sylvie Le Gentil ◽  
Richard Schopp

The stability properties of a vortex lens are studied in the quasi-geostrophic (QG) framework using generalized stability theory. Optimal perturbations are obtained using a tangent linear QG model and its adjoint. Their fine-scale spatial structures are studied in detail. Growth rates of optimal perturbations are shown to be extremely sensitive to the optimization time interval: the most unstable perturbations are found for time intervals of about 3 days, while the growth rates continuously decrease towards that of the most unstable normal mode, which is reached after about 170 days. The horizontal structure of the optimal perturbations consists of intense counter-shear spiralling. It is also extremely sensitive to the time interval: for short time intervals, the optimal perturbations are made of a broad spectrum of high azimuthal wave numbers, while as the time interval increases, only low azimuthal wave numbers are found. The vertical structures of optimal perturbations exhibit strong layering associated with high vertical wave numbers whatever the time interval. However, this parameter plays an important role in the width of the vertical spectrum of the perturbation: short-time-interval perturbations have a narrow vertical spectrum, while long-time-interval perturbations show a broad range of vertical scales. Optimal perturbations were set as initial perturbations of the vortex lens in a fully non-linear QG model. It appears that for short time intervals the perturbations decay after an initial transient growth, while for longer time intervals the optimal perturbation keeps growing, quickly leading to a non-linear regime or exciting lower azimuthal modes, consistent with normal-mode instability. Very long time intervals simply behave like the most unstable normal mode. The possible impact of optimal perturbations on layering is also discussed.


2006 ◽  
Vol 39 (2) ◽  
pp. 262-266 ◽  
Author(s):  
R. J. Davies

Synchrotron sources offer high-brilliance X-ray beams that are ideal for spatially and time-resolved studies. Large amounts of wide- and small-angle X-ray scattering data can now be generated rapidly, for example during routine scanning experiments. Consequently, the analysis of the large data sets produced has become a complex and pressing issue. Even relatively simple analyses become difficult when a single data set can contain many thousands of individual diffraction patterns. This article reports on a new software application for the automated analysis of scattering intensity profiles. It is capable of batch-processing thousands of individual data files without user intervention. Diffraction data can be fitted using a combination of background functions and non-linear peak functions. To complement the batch-wise operation mode, the software includes several specialist algorithms to ensure that the results obtained are reliable. These include peak tracking, artefact removal, function elimination and spread-estimate fitting. Furthermore, as well as non-linear fitting, the software can calculate integrated intensities and selected orientation parameters.


2011 ◽  
Vol 34 (7) ◽  
pp. 841-849 ◽  
Author(s):  
Shuping He ◽  
Fei Liu

In this paper we study finite-time robust control problems for uncertain non-linear Markov jump systems. By means of Takagi–Sugeno fuzzy models, the overall closed-loop fuzzy dynamics are constructed through selected membership functions. Using the stochastic Lyapunov–Krasovskii functional approach, a sufficient condition for stochastic robust finite-time stabilization is first established. Then, in terms of linear matrix inequality (LMI) techniques, sufficient conditions for the existence of the stochastic finite-time controller are presented and proved. Finally, the design problem is formulated as an optimization problem. Simulation results illustrate the effectiveness of the proposed approaches.


Circulation ◽  
2015 ◽  
Vol 132 (suppl_3) ◽  
Author(s):  
Luca Marengo ◽  
Wolfgang Ummenhofer ◽  
Gerster Pascal ◽  
Falko Harm ◽  
Marc Lüthy ◽  
...  

Introduction: Agonal respiration has been shown to be commonly associated with witnessed events, ventricular fibrillation, and increased survival during out-of-hospital cardiac arrest. There is little information on the incidence of gasping in in-hospital cardiac arrest (IHCA). Our "Rapid Response Team" (RRT) missions were monitored between December 2010 and March 2015, and the prevalence of gasping and survival data for IHCA were investigated. Methods: A standardized extended in-hospital Utstein data set of all RRT interventions occurring at the University Hospital Basel, Switzerland, from December 13, 2010 until March 31, 2015 was consecutively collected and recorded in Microsoft Excel (Microsoft Corp., USA). Data were analyzed using IBM SPSS Statistics 22.0 (IBM Corp., USA) and are presented as descriptive statistics. Results: The RRT was activated for 636 patients, 459 of whom had a life-threatening status (72%; 33 missing). 270 patients (59%) suffered IHCA. Ventricular fibrillation or pulseless ventricular tachycardia occurred in 42 patients (16% of CAs) and was associated with improved return of spontaneous circulation (ROSC) (36 (97%) vs. 143 (67%); p<0.001), hospital discharge (25 (68%) vs. 48 (23%); p<0.001), and discharge with good neurological outcome, i.e. Cerebral Performance Category (CPC) of 1 or 2 (21 (55%) vs. 41 (19%); p<0.001). Gasping was seen in 128 patients (57% of CAs; 46 missing) and was associated with overall improved ROSC (99 (78%) vs. 55 (59%); p=0.003). In CAs occurring on the ward (154, 57% of all CAs), gasping was associated with a higher proportion of shockable rhythms (11 (16%) vs. 2 (3%); p=0.019), improved ROSC (62 (90%) vs. 34 (55%); p<0.001), and hospital discharge (21 (32%) vs. 7 (11%); p=0.006). Gasping was not associated with neurological outcome. Conclusions: Gasping was frequently observed accompanying IHCA.
The faster in-hospital patient access is probably the reason for the higher prevalence compared with the prehospital setting. For CAs on wards without continuous monitoring, gasping correlates with increased shockable rhythms, ROSC, and hospital discharge.


Economies ◽  
2020 ◽  
Vol 8 (2) ◽  
pp. 49 ◽  
Author(s):  
Waqar Badshah ◽  
Mehmet Bulut

Only unstructured single-path model selection techniques, i.e., information criteria, are used by the Bounds test of cointegration for model selection. The aim of this paper was twofold: first, to evaluate the performance of five routinely used information criteria (the Akaike Information Criterion (AIC), corrected Akaike Information Criterion (AICC), Schwarz/Bayesian Information Criterion (SIC/BIC), corrected Schwarz/Bayesian Information Criterion (SICC/BICC), and Hannan-Quinn Information Criterion (HQC)) and three structured approaches (Forward Selection, Backward Elimination, and Stepwise) by assessing their size and power properties at different sample sizes based on Monte Carlo simulations; and second, to assess the same on real economic data. The second aim was achieved by evaluating the long-run relationship between three pairs of macroeconomic variables, i.e., Energy Consumption and GDP, Oil Price and GDP, and Broad Money and GDP, for the BRICS (Brazil, Russia, India, China and South Africa) countries using the Bounds cointegration test. It was found that the information criteria and the structured procedures have the same power for sample sizes of 50 or greater; however, BICC and Stepwise are better at small sample sizes. In the light of the simulation and real-data results, a modified Bounds test with the Stepwise model selection procedure may be used, as it is strongly supported theoretically and avoids noise in the model selection process.
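The information criteria compared above differ mainly in their complexity penalty; for a Gaussian least-squares model they can be computed directly from the residual sum of squares. A minimal sketch of AIC and SIC/BIC only (the numbers are hypothetical):

```python
import math

def info_criteria(rss, n, k):
    """AIC and SIC/BIC for a Gaussian least-squares model with k free
    parameters fitted to n observations. BIC's log(n) penalty exceeds
    AIC's constant 2 once n > e^2 (about 7.4), so BIC favours sparser
    lag structures in all realistic sample sizes."""
    ll = -0.5 * n * (math.log(2 * math.pi * rss / n) + 1)  # max. Gaussian log-likelihood
    aic = 2 * k - 2 * ll
    bic = k * math.log(n) - 2 * ll
    return aic, bic

# Adding a lag that leaves the fit unchanged costs 2 under AIC but
# log(50) ≈ 3.9 under BIC at n = 50:
print(info_criteria(10.0, 50, 3))
print(info_criteria(10.0, 50, 2))
```

The model (lag combination) with the smallest criterion value is selected; the structured Forward/Backward/Stepwise procedures instead search the lag space sequentially using significance tests.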

