penalized spline
Recently Published Documents

TOTAL DOCUMENTS: 137 (FIVE YEARS: 42)
H-INDEX: 19 (FIVE YEARS: 2)

2022 ◽  
Vol 8 ◽  
Author(s):  
Younan Yao ◽  
Jin Liu ◽  
Bo Wang ◽  
Ziyou Zhou ◽  
Xiaozhao Lu ◽  
...  

Background: The prognostic value of elevated lipoprotein(a) [Lp(a)] in coronary artery disease (CAD) patients has been inconsistent across previous studies, and whether that value changes at different low-density-lipoprotein cholesterol (LDL-C) levels is unclear.
Methods and Findings: CAD patients treated with statin therapy from January 2007 to December 2018 in the Guangdong Provincial People's Hospital (NCT04407936) were consecutively enrolled. Individuals were categorized according to baseline LDL-C at cut-offs of 70 and 100 mg/dL. The primary outcome was 5-year all-cause death. Multivariable Cox proportional hazards models and penalized spline analyses were used to evaluate the association between Lp(a) and all-cause mortality. Among 30,908 patients, the mean age was 63.1 ± 10.7 years, and 76.7% were men. A total of 2,383 (7.7%) patients died during 5-year follow-up. Compared with Lp(a) <50 mg/dL, Lp(a) ≥50 mg/dL predicted higher all-cause mortality (multivariable-adjusted HR = 1.19, 95% CI 1.07–1.31) in the total cohort. However, when analyzed within each LDL-C category, there was no significant association between Lp(a) ≥50 mg/dL and higher all-cause mortality unless baseline LDL-C was ≥100 mg/dL (HR = 1.19, 95% CI 1.04–1.36). The results from penalized spline analyses were robust.
Conclusions: In statin-treated CAD patients, elevated Lp(a) was associated with an increased risk of all-cause death, and the association was modified by baseline LDL-C level. Patients with Lp(a) ≥50 mg/dL had a higher long-term risk of all-cause death than those with Lp(a) <50 mg/dL only when their baseline LDL-C was ≥100 mg/dL.
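The penalized spline analysis named in the methods can be illustrated in miniature. The sketch below is a numpy-only illustration, not the study's code: the truncated power basis, knot count, penalty weight, and synthetic data are all assumptions. It fits a smooth curve by placing a ridge penalty on the knot coefficients of a regression spline:

```python
import numpy as np

def truncated_power_basis(x, knots, degree=3):
    """Columns: 1, x, ..., x^degree, then (x - k)_+^degree for each knot."""
    cols = [x ** d for d in range(degree + 1)]
    cols += [np.clip(x - k, 0, None) ** degree for k in knots]
    return np.column_stack(cols)

def penalized_spline_fit(x, y, n_knots=15, degree=3, lam=1.0):
    """Minimize ||y - B b||^2 + lam * ||b_knots||^2 (ridge on knot terms only)."""
    knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
    B = truncated_power_basis(x, knots, degree)
    pen = np.zeros(B.shape[1])
    pen[degree + 1:] = 1.0                    # penalize only the knot coefficients
    b = np.linalg.solve(B.T @ B + lam * np.diag(pen), B.T @ y)
    return B @ b

# Illustrative use on synthetic data (not the study's Lp(a) data)
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 300))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)
fit = penalized_spline_fit(x, y, lam=0.5)
```

In the study's setting the same machinery would smooth the Lp(a)–mortality association; here it simply recovers a noisy sine curve.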


Author(s):  
Nathanael R. Fillmore ◽  
Jennifer La ◽  
Chunlei Zheng ◽  
Shira Doron ◽  
Nhan Do ◽  
...  

Abstract Background: COVID-19 hospitalization definitions do not include a disease severity assessment. We therefore sought a simple, objective mechanism for identifying severe hospitalized cases and measured the impact of vaccination on trends.
Methods: All admissions with SARS-CoV-2 to a Veterans Affairs (VA) hospital, where routine screening is recommended, between 3/1/2020 and 11/22/2021 were included. Moderate-to-severe COVID-19 was defined as any oxygen supplementation or any SpO2 <94% from one day before to two weeks after the positive SARS-CoV-2 test. Admissions with moderate-to-severe disease were divided by the total number of admissions, and the proportion of admissions with moderate-to-severe COVID-19 was modelled using a penalized spline in a Poisson regression, stratified by vaccination status. Dexamethasone receipt and its correlation with moderate-to-severe cases were also assessed.
Results: Among 67,025 admissions with SARS-CoV-2, the proportion with hypoxemia or supplemental oxygen fell from 64% before vaccine availability to 56% by November 2021, driven in part by lower rates in vaccinated patients (52% vaccinated versus 58% unvaccinated). The proportion of moderate-to-severe cases identified by SpO2 levels and oxygen supplementation was highly correlated with dexamethasone receipt (correlation coefficient, 0.95) and increased after 7/1/2021, concurrent with Delta variant predominance.
Conclusions: A simple, objective definition of COVID-19 hospitalizations based on SpO2 levels and oxygen supplementation can be used to track pandemic severity. This metric could be used to identify risk factors for severe breakthrough infections, to guide clinical treatment algorithms, and to detect trends in vaccine effectiveness over time and against new variants.
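The modelling step, a penalized spline inside a Poisson regression, can be sketched with penalized iteratively reweighted least squares (P-IRLS). Everything below (basis choice, knot count, penalty weight, and the synthetic counts) is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

def spline_basis(t, knots, degree=3):
    """Truncated power basis: 1, t, ..., t^degree, then (t - k)_+^degree."""
    cols = [t ** d for d in range(degree + 1)]
    cols += [np.clip(t - k, 0, None) ** degree for k in knots]
    return np.column_stack(cols)

def penalized_poisson_spline(t, y, n_knots=10, degree=3, lam=1.0, n_iter=50):
    """Fit log(mu) = B(t) @ beta by penalized IRLS (ridge on knot coefficients)."""
    knots = np.quantile(t, np.linspace(0, 1, n_knots + 2)[1:-1])
    B = spline_basis(t, knots, degree)
    pen = np.zeros(B.shape[1])
    pen[degree + 1:] = lam
    beta = np.zeros(B.shape[1])
    beta[0] = np.log(y.mean() + 1e-9)          # start at the overall mean rate
    for _ in range(n_iter):
        eta = B @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu                 # working response
        W = mu                                  # Poisson working weights
        beta = np.linalg.solve(B.T @ (W[:, None] * B) + np.diag(pen),
                               B.T @ (W * z))
    return np.exp(B @ beta)

# Illustrative: daily counts whose underlying rate declines smoothly over time
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 90)
true_rate = 60 * np.exp(-1.5 * t) + 20
y = rng.poisson(true_rate)
smoothed = penalized_poisson_spline(t, y)
```

The fitted curve plays the role of the smoothed severity trend in the abstract; stratifying by vaccination status would amount to running the same fit per stratum.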




Stats ◽  
2021 ◽  
Vol 4 (2) ◽  
pp. 529-549
Author(s):  
Tingting Zhou ◽  
Michael R. Elliott ◽  
Roderick J. A. Little

Without randomization of treatments, valid inference of treatment effects from observational studies requires controlling for all confounders, because treated subjects generally differ systematically from control subjects. Confounding control is commonly achieved using the propensity score, defined as the conditional probability of assignment to a treatment given the observed covariates. The propensity score collapses all the observed covariates into a single measure and serves as a balancing score, such that treated and control subjects with similar propensity scores can be directly compared. Common propensity score-based methods include regression adjustment and inverse probability of treatment weighting using the propensity score. We recently proposed a robust multiple imputation-based method, penalized spline of propensity for treatment comparisons (PENCOMP), that includes a penalized spline of the assignment propensity as a predictor. Under the Rubin causal model assumptions (no interference across units, a non-zero probability for each unit of being assigned to either treatment group, and no unmeasured confounders), PENCOMP has a double robustness property for estimating treatment effects. In this study, we examine the impact on PENCOMP of using variable selection techniques that restrict the predictors in the propensity score model to true confounders of the treatment-outcome relationship. We also propose a variant of PENCOMP and compare alternative approaches to standard error estimation for PENCOMP. Compared to the weighted estimators, PENCOMP is less affected by the inclusion of non-confounding variables in the propensity score model. We illustrate the use of PENCOMP and competing methods in estimating the impact of antiretroviral treatments on CD4 counts in HIV+ patients.
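A deliberately simplified sketch of the idea behind PENCOMP (not the published implementation, which uses multiple imputation; the data, spline basis, knots, and penalty here are invented for illustration): estimate the propensity, fit each arm's outcome as a penalized spline of the estimated propensity, predict both potential outcomes for every subject, and average the difference:

```python
import numpy as np

def logistic_irls(X, t, n_iter=25):
    """Propensity model: logistic regression fit by IRLS."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = np.maximum(p * (1.0 - p), 1e-9)
        z = X @ beta + (t - p) / W
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

def propensity_spline(p, knots):
    """Truncated linear spline basis in the estimated propensity."""
    return np.column_stack([np.ones_like(p), p] +
                           [np.clip(p - k, 0, None) for k in knots])

def fit_outcome_arm(p, y, knots, lam=0.1):
    """Ridge-penalized spline-of-propensity outcome model for one treatment arm."""
    B = propensity_spline(p, knots)
    pen = np.zeros(B.shape[1])
    pen[2:] = lam                              # penalize only the knot coefficients
    return np.linalg.solve(B.T @ B + np.diag(pen), B.T @ y)

# Illustrative data: one confounder, true treatment effect = 2
rng = np.random.default_rng(6)
n = 4000
x1 = rng.normal(size=n)
p_true = 1.0 / (1.0 + np.exp(-0.8 * x1))
t = (rng.uniform(size=n) < p_true).astype(float)
y = 2.0 * t + x1 + rng.normal(0, 0.5, n)

X = np.column_stack([np.ones(n), x1])
p_hat = 1.0 / (1.0 + np.exp(-X @ logistic_irls(X, t)))

knots = np.quantile(p_hat, np.linspace(0.1, 0.9, 8))
b1 = fit_outcome_arm(p_hat[t == 1], y[t == 1], knots)
b0 = fit_outcome_arm(p_hat[t == 0], y[t == 0], knots)
B_all = propensity_spline(p_hat, knots)
ate_hat = float(np.mean(B_all @ b1 - B_all @ b0))
```

Averaging the spline-of-propensity predictions over the whole sample recovers the marginal treatment effect by iterated expectations, which is the mechanism the abstract appeals to.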


Author(s):  
Martin Siebenborn ◽  
Julian Wagner

Abstract Penalized spline smoothing is a well-established nonparametric regression method that is efficient for one or two covariates. Its extension to more than two covariates is straightforward but suffers from exponentially increasing memory demands and computational complexity, which brings the method to its numerical limit. Penalized spline smoothing with multiple covariates requires solving a large-scale, regularized least-squares problem whose matrices do not fit into the memory of common computer systems. To overcome this restriction, we introduce a matrix-free implementation of the conjugate gradient method. We further present matrix-free implementations of a simple diagonal preconditioner and of a more advanced geometric multigrid preconditioner to significantly speed up convergence of the conjugate gradient method. All algorithms require a negligible amount of memory and therefore allow for penalized spline smoothing with multiple covariates. Moreover, for arbitrary but fixed covariate dimension, we show grid-independent convergence of the multigrid preconditioner, which is fundamental for achieving algorithmic scalability.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Muhammad Abu Shadeque Mullah ◽  
James A. Hanley ◽  
Andrea Benedetti

Abstract Background: Generalized linear mixed models (GLMMs), typically used for analyzing correlated data, can also be used for smoothing by treating the knot coefficients of a regression spline as random effects. The resulting models are called semiparametric mixed models (SPMMs). Allowing the random knot coefficients to follow a normal distribution with mean zero and constant variance is equivalent to using a penalized spline with a ridge regression type penalty. We introduce the least absolute shrinkage and selection operator (LASSO) type penalty in the SPMM setting by letting the coefficients at the knots follow a Laplace (double exponential) distribution with mean zero.
Methods: We adopt a Bayesian approach and use a Markov chain Monte Carlo (MCMC) algorithm for model fitting. Through simulations, we compare the performance of curve fitting in an SPMM using a LASSO type penalty to that using a ridge penalty for binary data. We apply the proposed method to obtain smooth curves from data on the relationship between pack-years of smoking and the risk of developing chronic obstructive pulmonary disease (COPD).
Results: The LASSO penalty performs as well as the ridge penalty for simple shapes of association and outperforms the ridge penalty when the shape of association is complex or linear.
Conclusion: We demonstrated that the LASSO penalty captured complex dose-response associations better than the ridge penalty in an SPMM.
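The ridge-versus-LASSO contrast can be made concrete with a frequentist stand-in for the Bayesian MCMC fit: the ridge solution is closed-form, while the LASSO solution can be obtained by coordinate descent that soft-thresholds only the knot coefficients. All choices below (truncated linear basis, penalty values, a linear truth) are illustrative assumptions, not the paper's binary-outcome setup:

```python
import numpy as np

def spline_basis(x, knots):
    """Truncated linear basis: 1, x, then (x - k)_+ for each knot."""
    return np.column_stack([np.ones_like(x), x] +
                           [np.clip(x - k, 0, None) for k in knots])

def ridge_spline(B, y, lam, n_poly=2):
    """Closed-form ridge fit, penalizing only the knot coefficients."""
    pen = np.zeros(B.shape[1])
    pen[n_poly:] = lam
    return np.linalg.solve(B.T @ B + np.diag(pen), B.T @ y)

def lasso_spline(B, y, lam, n_poly=2, n_sweeps=200):
    """Coordinate descent for 0.5||y - Bb||^2 + lam * sum |b_knots|."""
    b = np.zeros(B.shape[1])
    col_sq = (B ** 2).sum(axis=0)
    r = y - B @ b
    for _ in range(n_sweeps):
        for j in range(B.shape[1]):
            r = r + B[:, j] * b[j]              # residual without column j
            rho = B[:, j] @ r
            if j < n_poly:                       # polynomial part stays unpenalized
                b[j] = rho / col_sq[j]
            else:                                # soft-threshold knot coefficients
                b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r = r - B[:, j] * b[j]
    return b

# Illustrative: a truly linear association, where the LASSO should zero out knots
rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 1, 200))
y = 1.0 + 2.0 * x + rng.normal(0, 0.1, x.size)
knots = np.quantile(x, np.linspace(0, 1, 12)[1:-1])
B = spline_basis(x, knots)
b_lasso = lasso_spline(B, y, lam=2.0)
b_ridge = ridge_spline(B, y, lam=2.0)
```

On a truly linear association the LASSO drives most knot coefficients exactly to zero, while the ridge only shrinks them, which mirrors the abstract's finding that the LASSO penalty does better in the linear case.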


2021 ◽  
Vol 67 (1) ◽  
pp. 1-13
Author(s):  
Mauricio Zapata-Cuartas ◽  
Bronson P Bullock ◽  
Cristian R Montes

Abstract Stem profile must be modeled with an accurate taper equation to produce reliable tree volume assessments. We propose a semiparametric method in which few a priori functional-form assumptions or parametric specifications are required. We compared the diameter and volume predictions of a penalized spline regression (P-spline), a P-spline extended with an additive dbh-class variable, and six alternative parametric taper equations, including single, segmented, and variable-exponent forms. We used taper data from 147 loblolly pine (Pinus taeda L.) trees to fit the models and make comparisons. We show that the extended P-spline outperforms the parametric taper equations when predicting outside-bark diameter in the lower portion of the stem, up to 40% of tree height, where the more valuable wood products (62% of total outside-bark volume) are located. For volume, both P-spline models perform as well as or better than the best parametric model with taper calibration, which could reduce inventory costs by not requiring an additional measurement. Our findings suggest that assuming an a priori fixed form in taper models imposes restrictions that fail to describe tree form as adequately as the proposed P-spline.
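The P-spline taper idea, and the volume assessment built on it, can be sketched as fitting diameter against relative height and integrating the implied cross-sectional area. The basis, penalty, and synthetic stem below are illustrative assumptions, not the fitted loblolly pine models:

```python
import numpy as np

def taper_basis(h, knots, degree=3):
    """Truncated power basis in relative height h (0 = ground, 1 = tip)."""
    cols = [h ** d for d in range(degree + 1)]
    cols += [np.clip(h - k, 0, None) ** degree for k in knots]
    return np.column_stack(cols)

def fit_taper(h, diam, n_knots=8, degree=3, lam=0.1):
    """P-spline taper fit: ridge penalty on the knot coefficients only."""
    knots = np.linspace(0, 1, n_knots + 2)[1:-1]
    B = taper_basis(h, knots, degree)
    pen = np.zeros(B.shape[1])
    pen[degree + 1:] = lam
    beta = np.linalg.solve(B.T @ B + np.diag(pen), B.T @ diam)
    return knots, beta

def stem_volume(knots, beta, height_m, degree=3, n_grid=500):
    """Trapezoidal integration of cross-sectional area along the fitted profile."""
    h = np.linspace(0.0, 1.0, n_grid)
    d_cm = taper_basis(h, knots, degree) @ beta
    area = np.pi * (d_cm / 200.0) ** 2            # cross-section in m^2 (d in cm)
    return float(np.sum(0.5 * (area[:-1] + area[1:]) * np.diff(h) * height_m))

# Illustrative synthetic stem: 30 m tall, about 40 cm basal diameter
rng = np.random.default_rng(7)
h_obs = np.linspace(0.01, 0.99, 40)
d_obs = 40.0 * (1.0 - h_obs) ** 0.9 + rng.normal(0, 0.5, h_obs.size)
knots, beta = fit_taper(h_obs, d_obs)
volume_m3 = stem_volume(knots, beta, height_m=30.0)
```

Because the spline imposes no fixed taper form, the same fit-and-integrate pipeline applies whatever shape the stem takes, which is the flexibility the abstract credits the P-spline with.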

