Does Sample Attrition Affect the Assessment of Frailty Trajectories Among Older Adults? A Joint Model Approach

Gerontology ◽  
2018 ◽  
Vol 64 (5) ◽  
pp. 430-439 ◽  
Author(s):  
Erwin Stolz ◽  
Hannes Mayerl ◽  
Éva Rásky ◽  
Wolfgang Freidl

Background: Frailty constitutes an important risk factor for adverse outcomes among older adults. In longitudinal studies on frailty, selective sample attrition may threaten the validity of results. Objective: To assess the impact of sample attrition on frailty index trajectories, and on the socio-economic (education-related) gaps therein, among older adults in Europe. Methods: A total of 64,143 observations from 21,044 respondents (50+) in the Survey of Health, Ageing and Retirement in Europe across 12 years of follow-up (2004–2015), subject to substantial sample attrition (59%), were analysed. We compared results of a standard linear mixed model assuming missing at random (MAR) sample attrition with a joint model assuming missing not at random (MNAR) sample attrition. Results: Estimated frailty trajectories of the mixed and joint models were identical up to an age of 80 years, above which modest underestimation occurred when a standard linear mixed model was used rather than a joint model. The latter effect was larger for men than for women. Substantial education-based inequality in frailty continued throughout old age in both the mixed and joint models. Conclusion: Linear mixed models assuming MAR sample attrition provided good estimates of frailty trajectories up to very old age. Thus, the validity of existing studies estimating frailty trajectories with standard linear mixed models does not appear to be threatened by substantial sample attrition.
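For orientation, below is a minimal Python sketch (statsmodels) of the MAR-based comparator described above: a linear mixed model for the frailty index with person-specific random intercepts and age slopes. The joint-model alternative would additionally link these random effects to a dropout hazard; that step is not shown. File and column names are assumptions, not the authors' data or code.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("share_frailty_long.csv")  # hypothetical long-format SHARE extract

# Fixed effects: centred age, education, and their interaction;
# random effects: person-specific intercepts and age slopes.
model = smf.mixedlm(
    "frailty_index ~ age_c + education + age_c:education",
    df,
    groups=df["id"],
    re_formula="~age_c",
)
fit = model.fit(reml=True)
print(fit.summary())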

2015 ◽  
Vol 26 (3) ◽  
pp. 1373-1388 ◽  
Author(s):  
Wei Liu ◽  
Norberto Pantoja-Galicia ◽  
Bo Zhang ◽  
Richard M Kotz ◽  
Gene Pennello ◽  
...  

Diagnostic tests are often compared in multi-reader multi-case (MRMC) studies in which a number of cases (subjects with or without the disease in question) are examined by several readers using all tests to be compared. One of the commonly used methods for analyzing MRMC data is the Obuchowski–Rockette (OR) method, which assumes that the true area under the receiver operating characteristic curve (AUC) for each combination of reader and test follows a linear mixed model with fixed effects for test and random effects for reader and the reader–test interaction. This article proposes generalized linear mixed models which generalize the OR model by incorporating a range-appropriate link function that constrains the true AUCs to the unit interval. The proposed models can be estimated by maximizing a pseudo-likelihood based on the approximate normality of AUC estimates. A Monte Carlo expectation-maximization algorithm can be used to maximize the pseudo-likelihood, and a non-parametric bootstrap procedure can be used for inference. The proposed method is evaluated in a simulation study and applied to an MRMC study of breast cancer detection.
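As a rough illustration of the range-appropriate link idea (not the authors' pseudo-likelihood/MCEM estimator, which models the true AUCs and their estimation-error covariance), the Python sketch below fits an OR-style mixed model to logit-transformed AUC estimates, which keeps fitted values in the unit interval; file and column names are assumptions.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mrmc_auc_estimates.csv")  # hypothetical: one AUC estimate per reader-test pair
df["logit_auc"] = np.log(df["auc"] / (1.0 - df["auc"]))

# Fixed effect for test, random intercept per reader; with one AUC per
# reader-test cell, the reader-test interaction is absorbed into the residual.
model = smf.mixedlm("logit_auc ~ C(test)", df, groups=df["reader"])
fit = model.fit()
print(fit.summary())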


2020 ◽  
Vol 58 (5) ◽  
pp. 915-922
Author(s):  
Ting-Tse Lin ◽  
Ming-Hsien Lin ◽  
Cho-Kai Wu ◽  
Lian-Yu Lin ◽  
Jou-Wei Lin ◽  
...  

Abstract OBJECTIVES Serial lactate (clearance) data are commonly used for risk stratification in patients receiving veno-arterial extracorporeal life support (ECLS). METHODS We retrospectively analysed 855 patients who had undergone ECLS due to cardiac (n = 578) and non-cardiac (n = 277) aetiologies between 2002 and 2013 at National Taiwan University Hospital. Serial lactate (clearance) data were collected before ECLS and at 8, 16, 24, 48 and 72 h after ECLS. To investigate the impact of lactate (clearance) levels on 180-day survival, we performed linear mixed model and joint model analyses using the Bayesian approach. RESULTS Among the 855 patients, 564 (65.9%) patients died within 180 days after ECLS cannulation. The joint model showed that the effect of lactate on survival was null in both the reduced model and the fully adjusted model. However, an effect of lactate clearance on survival was observed in the reduced model [estimate 0.004; 95% confidence interval (CI) 0.002–0.006] and the fully adjusted model (estimate 0.003; 95% CI 0.001–0.005). In a further secondary analysis, lactate clearance (hazard ratio 0.861; 95% CI 0.813–0.931) at 16 h after ECLS cannulation was determined to be a risk factor for mortality. According to a receiver operating characteristic curve analysis, the SAVE score combined with lactate clearance (area under curve = 0.881) showed good outcome discrimination. CONCLUSIONS Incorporating lactate clearance at 16 h after ECLS cannulation into the SAVE system improved the predictive value for mortality in patients receiving ECLS.
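A hedged sketch of the final discrimination step (not the authors' Bayesian joint model): combining the SAVE score with 16-h lactate clearance in a logistic model and comparing ROC AUCs with scikit-learn. The file and column names, and the sign convention for the SAVE score, are assumptions.

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("ecls_outcomes.csv")  # hypothetical: save_score, lactate_clearance_16h, died_180d
y = df["died_180d"]

# Higher SAVE scores indicate better expected survival, so flip the sign
# when scoring the outcome "death within 180 days".
auc_save = roc_auc_score(y, -df["save_score"])

X = df[["save_score", "lactate_clearance_16h"]]
combined = LogisticRegression().fit(X, y)
auc_combined = roc_auc_score(y, combined.predict_proba(X)[:, 1])

print(f"AUC, SAVE alone: {auc_save:.3f}; SAVE + 16-h clearance: {auc_combined:.3f}")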


Coatings ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. 187
Author(s):  
Adriana Santos ◽  
Elisabete F. Freitas ◽  
Susana Faria ◽  
Joel R. M. Oliveira ◽  
Ana Maria A. C. Rocha

The aim of this study is to develop a linear mixed model describing the degradation of friction on flexible road pavements, for inclusion in pavement management systems. It also aims to show that, at the network level, factors such as temperature, rainfall, hypsometry, type of layer, and geometric alignment features may influence the degradation of friction over time. A dataset with 7204 sections from six districts of Portugal was made available by the Ascendi Concession highway network. Linear mixed models with random intercepts were developed for two-level and three-level datasets involving time, section, and district. While the three-level models are region-specific, the two-level models can be adapted to other areas. For both levels, two approaches were taken: one integrating only the variables related to traffic and climate conditions into the model, and the other also including factors intrinsic to the highway characteristics. The prediction accuracy of the model improved when hypsometry, geometric features, and type of layer were included. Accurate predictions of friction evolution over time are therefore available to assist the network manager in optimizing the overall level of road safety.
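A minimal sketch of the three-level random-intercept structure described above (time within section within district), using Python's statsmodels rather than the authors' software; all variable names are hypothetical placeholders.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("friction_sections.csv")  # hypothetical network-level dataset

# Fixed effects restricted to traffic/climate variables (first approach in the text);
# random intercepts at the district level plus section-level intercepts nested within district.
model = smf.mixedlm(
    "friction ~ pavement_age + traffic + rainfall + temperature",
    df,
    groups=df["district"],
    re_formula="1",
    vc_formula={"section": "0 + C(section)"},
)
fit = model.fit()
print(fit.summary())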


2017 ◽  
Author(s):  
Carl Kadie ◽  
David Heckerman

Abstract We have developed Ludicrous Speed Linear Mixed Models, a version of FaST-LMM optimized for the cloud. The approach can perform a genome-wide association analysis on a dataset of one million SNPs across one million individuals at a cost of about 868 CPU days, with an elapsed time on the order of two weeks. A Python implementation is available at https://fastlmm.github.io/. Significance Identifying SNP-phenotype correlations using GWAS is difficult because effect sizes are so small for common, complex diseases. To address this issue, institutions are creating extremely large cohorts with sample sizes on the order of one million. Unfortunately, such cohorts are likely to contain confounding factors such as population structure and family/cryptic relatedness. The linear mixed model (LMM) can often correct for such confounding factors, but is too slow to use even with algebraic speedups known as FaST-LMM. We present a cloud implementation of FaST-LMM, called Ludicrous Speed LMM, that can process one million samples and one million test SNPs in a reasonable amount of time and at a reasonable cost.
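A minimal usage sketch of the published FaST-LMM Python package (https://fastlmm.github.io/), assuming PLINK-format inputs with placeholder file names; the cloud-distributed "Ludicrous Speed" runner itself is configured separately and is not shown.

from pysnptools.snpreader import Bed
from fastlmm.association import single_snp

test_snps = Bed("genotypes.bed", count_A1=False)     # PLINK .bed/.bim/.fam genotypes (placeholder path)
results_df = single_snp(test_snps, "phenotype.txt")  # LMM-based GWAS; returns a pandas DataFrame
print(results_df.head())                             # association results, one row per tested SNP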


Author(s):  
Osval Antonio Montesinos López ◽  
Abelardo Montesinos López ◽  
Jose Crossa

Abstract The linear mixed model framework is explained in detail in this chapter. We explore three methods of parameter estimation (maximum likelihood, the EM algorithm, and REML) and illustrate how genomic-enabled predictions are performed under this framework. We illustrate the use of linear mixed models by including in the predictor several components such as environments, genotypes, and the genotype × environment interaction. The linear mixed model is also illustrated under a multi-trait framework, which is important for prediction performance when the degree of correlation between traits is moderate or large. We illustrate the use of single-trait and multi-trait linear mixed models and provide the R codes for performing the analyses.
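As a language-neutral companion to the chapter's R code, the Python sketch below contrasts REML and ML fits of a simple single-trait mixed model with a genotype random effect; the dataset and column names are assumptions.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("gxe_trials.csv")  # hypothetical: grain_yield, env, geno

model = smf.mixedlm("grain_yield ~ C(env)", df, groups=df["geno"], re_formula="1")
fit_reml = model.fit(reml=True)   # REML: preferred for variance-component estimation
fit_ml = model.fit(reml=False)    # ML: needed when comparing models with different fixed effects
print(fit_reml.summary())
print("ML log-likelihood:", fit_ml.llf)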


2021 ◽  
Author(s):  
Mohammed Sultan ◽  
Ritbano Ahmed

Abstract The linear mixed model is one of the common models used to analyse longitudinal data; it may take the form of a separate (univariate), joint bivariate, or joint multivariate linear mixed model, depending on the number of response variables incorporated in the analysis. Adjusting for the correlation and covariance structure between and within subjects is one reason why modern longitudinal data analysis techniques are deemed more appropriate than some earlier methods of analysis. Some studies assume that the correlation between observations is zero; however, it is unlikely that repeated measurements on the same individual are actually independent. To that end, comparing different linear mixed models and identifying the appropriate one is necessary to describe the evolution of patients with congestive heart failure. In this study, separate, bivariate, and multivariate linear mixed models with different covariance and correlation structures were compared. Finally, a multivariate linear mixed model with an autoregressive order-one correlation structure and an unstructured covariance structure for the random effects, accounting for within- and between-patient variation, was selected as the best model to describe the evolution of patients with congestive heart failure.
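One way to sketch the stacked multivariate specification in Python (not the authors' code) is shown below: two responses in long format sharing correlated patient-level random intercepts with an unstructured covariance. The AR(1) within-patient residual correlation selected in the paper is not reproduced by this simplified fit, and all column names are assumptions.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("chf_long.csv")  # hypothetical: patient, time, resp (response label), value

# Stacked ("multivariate") specification: separate fixed intercepts and time slopes
# per response, and correlated patient-level random intercepts across responses.
model = smf.mixedlm(
    "value ~ 0 + C(resp) + C(resp):time",
    df,
    groups=df["patient"],
    re_formula="0 + C(resp)",
)
fit = model.fit()
print(fit.cov_re)  # estimated between-response random-effect covariance (unstructured)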


2020 ◽  
Author(s):  
Chongliang Luo ◽  
Md. Nazmul Islam ◽  
Natalie E. Sheils ◽  
Jenna M Reps ◽  
John Buresh ◽  
...  

Linear mixed models (LMMs) are commonly used in many areas, including epidemiology, for analyzing multi-site data with heterogeneous site-specific random effects. However, due to regulations protecting patients' privacy, sensitive individual patient data (IPD) usually cannot be shared across sites. In this paper we propose a novel algorithm for distributed linear mixed models (DLMMs). Our proposed DLMM algorithm achieves exactly the same results as if the IPD from all sites had been pooled, hence the lossless property. The DLMM algorithm requires each site to contribute some aggregated data (AD) in only one iteration. We apply the proposed DLMM algorithm to analyze the association of length of stay of COVID-19 hospitalization with demographic and clinical characteristics using the administrative claims database from the UnitedHealth Group Clinical Research Database.
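A toy sketch of the lossless-from-aggregated-data principle, reduced to the fixed-effects-only (ordinary least squares) case: each site shares only X'X and X'y once, and the coordinating site recovers exactly the pooled estimate. The full DLMM additionally handles site-specific random effects; this is not that algorithm.

import numpy as np

rng = np.random.default_rng(0)
sites = [(rng.normal(size=(n, 3)), rng.normal(size=n)) for n in (120, 80, 200)]  # simulated sites

# Each site computes aggregated data (AD) once -- no patient-level rows leave the site.
ad = [(X.T @ X, X.T @ y) for X, y in sites]

# The coordinating site sums the AD and solves the pooled normal equations.
XtX = sum(a[0] for a in ad)
Xty = sum(a[1] for a in ad)
beta_dist = np.linalg.solve(XtX, Xty)

# Check the lossless property against naive pooling of individual patient data.
X_pool = np.vstack([X for X, _ in sites])
y_pool = np.concatenate([y for _, y in sites])
beta_pool = np.linalg.lstsq(X_pool, y_pool, rcond=None)[0]
assert np.allclose(beta_dist, beta_pool)
print(beta_dist)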


2020 ◽  
Vol 13 (2) ◽  
pp. 143-149
Author(s):  
Nicholas C Chesnaye ◽  
Giovanni Tripepi ◽  
Friedo W Dekker ◽  
Carmine Zoccali ◽  
Aeilko H Zwinderman ◽  
...  

Abstract In nephrology, a great deal of information is measured repeatedly in patients over time, often alongside data on events of clinical interest. In this introductory article we discuss how these two types of data can be simultaneously analysed using the joint model (JM) framework, illustrated by clinical examples from nephrology. As classical survival analysis and linear mixed models form the two main components of the JM framework, we will also briefly revisit these techniques.
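For readers new to the two components, the sketch below fits them separately in Python (statsmodels for the longitudinal biomarker, lifelines for the event); a true joint model links the two sub-models through shared random effects, which these off-the-shelf fits do not do. File and column names are assumptions.

import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

long_df = pd.read_csv("egfr_long.csv")       # hypothetical: id, time, egfr
surv_df = pd.read_csv("esrd_survival.csv")   # hypothetical: id, futime, event, age

# Longitudinal component: linear mixed model with patient-specific intercepts and slopes.
lmm = smf.mixedlm("egfr ~ time", long_df, groups=long_df["id"], re_formula="~time").fit()
print(lmm.summary())

# Survival component: Cox proportional hazards model for the clinical event.
cph = CoxPHFitter()
cph.fit(surv_df[["futime", "event", "age"]], duration_col="futime", event_col="event")
cph.print_summary()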


Author(s):  
Judith Rösler ◽  
Stefan Georgiev ◽  
Anna L. Roethe ◽  
Denny Chakkalakal ◽  
Güliz Acker ◽  
...  

Abstract Exoscopic surgery promises alleviation of physical strain, improved intraoperative visualization and facilitation of the clinical workflow. In this prospective observational study, we investigate the clinical usability of a novel 3D4K-exoscope in routine neurosurgical interventions. Questionnaires on the use of the exoscope were administered, and exemplary cases were additionally video-documented. All participating neurosurgeons (n = 10) received initial device training. Changing to a conventional microscope was possible at all times. A linear mixed model was used to analyse the impact of time on the switchover rate. For further analysis, we dichotomized the surgeons into a frequent (n = 1) and an infrequent (n = 9) user group. A one-sample Wilcoxon signed-rank test was used to evaluate whether the number of surgeries differed between the two groups. Thirty-nine operations were included. No intraoperative complications occurred. In 69.2% of the procedures, the surgeon switched to the conventional microscope. While the conversion rate was 90% during the first half of the study, it decreased to 52.6% in the second half (p = 0.003). The number of interventions between the frequent and the infrequent user group differed significantly (p = 0.007). The main reasons for switching to ocular-based surgery were impaired hand–eye coordination and poor depth perception. The exoscope investigated in this study can be easily integrated into established neurosurgical workflows. Surgical ergonomics improved compared with standard microsurgical setups. Excellent image quality and precise control of the camera added to overall user satisfaction. For experienced surgeons, the incentive to switch from ocular-based to exoscopic surgery varies greatly.
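A minimal sketch of the one-sample Wilcoxon signed-rank comparison mentioned above, with invented placeholder counts (not study data):

from scipy.stats import wilcoxon

frequent_user_count = 12                          # hypothetical reference value, not study data
infrequent_counts = [1, 2, 3, 2, 4, 3, 2, 5, 5]   # hypothetical counts for the nine other surgeons

# One-sample test: signed ranks of the differences from the reference value.
stat, p = wilcoxon([c - frequent_user_count for c in infrequent_counts])
print(f"W = {stat}, p = {p:.3f}")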


Author(s):  
Amy L Petry ◽  
Nichole F Huntley ◽  
Michael R Bedford ◽  
John F Patience

Abstract In theory, supplementing xylanase in corn-based swine diets should improve nutrient and energy digestibility and fiber fermentability, but its efficacy is inconsistent. The experimental objective was to investigate the impact of xylanase on energy and nutrient digestibility, digesta viscosity, and fermentation when pigs are fed a diet high in insoluble fiber (>20% neutral detergent fiber; NDF) and given a 46-d dietary adaptation period. Three replicates of 20 growing gilts were blocked by initial body weight, individually housed, and assigned to 1 of 4 dietary treatments: a low-fiber control (LF) with 7.5% NDF; a 30% corn bran high-fiber control (HF; 21.9% NDF); HF+100 mg xylanase/kg [HF+XY (Econase XT 25P; AB Vista, Marlborough, UK)] providing 16,000 birch xylan units/kg; and HF+50 mg arabinoxylan-oligosaccharide (AXOS) product/kg [HF+AX (XOS 35A; Shandong Longlive Biotechnology, Shandong, China)] providing AXOS with 3-7 degrees of polymerization. Gilts were allowed ad libitum access to feed for 36 d. On d 36, pigs were housed in metabolism crates for a 10-d period, limit fed, and feces were collected. On d 46, pigs were euthanized and ileal, cecal, and colonic digesta were collected. Data were analyzed as a linear mixed model with block and replication as random effects and treatment as a fixed effect. Compared with LF, HF reduced the apparent ileal digestibility (AID), apparent cecal digestibility (ACED), apparent colonic digestibility (ACOD), and apparent total tract digestibility (ATTD) of dry matter (DM), gross energy (GE), crude protein (CP), acid detergent fiber (ADF), NDF, and hemicellulose (P<0.01). Relative to HF, HF+XY improved the AID of GE, CP, and NDF (P<0.05), and improved the ACED, ACOD, and ATTD of DM, GE, CP, NDF, ADF, and hemicellulose (P<0.05). Among treatments, pigs fed HF had increased hindgut DM disappearance (P=0.031). Relative to HF, HF+XY improved cecal disappearance of DM (162 vs. 98 g; P=0.008) and NDF (44 vs. 13 g; P<0.01). Pigs fed xylanase had a greater proportion of acetate in cecal digesta and butyrate in colonic digesta among treatments (P<0.05). Compared with LF, HF increased ileal, cecal, and colonic viscosity, but HF+XY decreased ileal viscosity compared with HF (P<0.001). In conclusion, increased insoluble corn-based fiber decreases digestibility, reduces cecal fermentation, and increases digesta viscosity, but supplementing xylanase partially mitigated these effects.
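A hedged sketch of the stated analysis model (treatment fixed; block and replication random), applied to a single digestibility outcome in Python's statsmodels rather than the authors' software; file and column names are hypothetical.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("xylanase_digestibility.csv")  # hypothetical: attd_ge, trt, block, rep

# Treatment as a fixed effect; replicate as a random intercept and weight block
# as a variance component nested within replicate.
model = smf.mixedlm(
    "attd_ge ~ C(trt)",
    df,
    groups=df["rep"],
    vc_formula={"block": "0 + C(block)"},
)
fit = model.fit()
print(fit.summary())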

