Functional Movement Assessments Are Not Associated with Risk of Injury During Military Basic Training

2019 ◽  
Vol 184 (11-12) ◽  
pp. e773-e780 ◽  
Author(s):  
Sarah J de la Motte ◽  
Daniel R Clifton ◽  
Timothy C Gribbin ◽  
Anthony I Beutler ◽  
Patricia A Deuster

Abstract Introduction Musculoskeletal injuries (MSK-I) in the U.S. military accounted for more than four million medical encounters in 2017. The Military Entrance Processing Screen to Assess Risk of Training (MEPSTART) was created to identify MSK-I risk during the first 180 days of military service. Methods Consenting active duty applicants to the United States Army, Navy, Air Force, and Marine Corps between February 2013 and December 2014 completed a behavioral and injury history questionnaire and the MEPSTART screen [Functional Movement Screen (FMS), Y-Balance Test (YBT), Landing Error Scoring System (LESS), and Overhead Squat assessment (OHS)] on the day they shipped to basic training. Male (n = 1,433) and female (n = 281) applicants were enrolled, and MSK-I were tracked for 180 days. Binomial logistic regression and multivariable Cox proportional hazards modeling were used to assess relationships between MEPSTART screens and MSK-I independent of age, BMI, sex, Service, injury history, and smoking status. Analyses were performed and finalized in 2017. Results The only functional screen related to injury was the LESS score. Compared to those with good LESS scores, applicants with poor LESS scores had lower odds of MSK-I (OR = 0.54, 95% CI = 0.30–0.97, p = 0.04) and a lower instantaneous risk of MSK-I during the first 180 days (HR = 0.58, 95% CI = 0.34–0.96, p = 0.04). However, secondary receiver operating characteristic (ROC) analyses revealed poor discriminative value (AUC = 0.49, 95% CI = 0.43–0.54). Conclusions Functional performance did not predict future injury risk during the first 180 days of service. Poor LESS scores were associated with lower injury risk, but ROC analyses revealed little predictive value and limited clinical usefulness. Comprehensive risk reduction strategies may be preferable for mitigating MSK-I in military training populations.
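The analysis pattern described here, adjusted logistic regression for odds of injury, Cox regression for instantaneous risk, and a ROC check of discriminative value, can be sketched as follows. This is a minimal illustration with hypothetical column names and input file, not the authors' code:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter
from sklearn.metrics import roc_auc_score

df = pd.read_csv("mepstart.csv")  # hypothetical file: one row per applicant

# Adjusted odds of MSK-I for poor vs. good LESS scores (binomial logistic
# regression, covariates as listed in the abstract).
logit = smf.logit(
    "injury ~ less_poor + age + bmi + C(sex) + C(service)"
    " + injury_history + smoker",
    data=df,
).fit()
print(np.exp(logit.params["less_poor"]))  # odds ratio (0.54 in the paper)

# Instantaneous risk of MSK-I over the first 180 days (Cox model).
cph = CoxPHFitter()
cph.fit(df[["days_to_injury", "injury", "less_poor", "age", "bmi"]],
        duration_col="days_to_injury", event_col="injury")

# Discriminative value: an AUC near 0.5, as reported, means the raw score
# does no better than chance at separating injured from uninjured applicants.
print(roc_auc_score(df["injury"], df["less_score"]))
```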

2019 ◽  
Vol 188 (11) ◽  
pp. 1977-1983 ◽  
Author(s):  
Tianshi David Wu ◽  
Chinedu O Ejike ◽  
Robert A Wise ◽  
Meredith C McCormack ◽  
Emily P Brigham

Abstract An obesity paradox in chronic obstructive pulmonary disease (COPD), whereby overweight/obese individuals have improved survival, has been well described. These studies have generally included smokers. It is unknown whether the paradox exists in individuals with COPD arising from factors other than smoking. Nonsmoking COPD is understudied yet represents some 25%–45% of the disease worldwide. To determine whether the obesity paradox differs between ever- and never-smokers with COPD, 1,723 adult participants with this condition were examined from 2 iterations of the National Health and Nutrition Examination Survey (1988–1994, 2007–2010), with mortality outcomes followed through December 2011. Using Cox proportional hazards models, adjusted for sociodemographic factors, lung function, and survey cycle, ever/never-smoking was found to modify the association between body mass index and hazard of death. Compared with normal-weight participants, overweight/obese participants had a lower hazard of death among ever-smokers (for overweight, adjusted hazard ratio (aHR) = 0.56, 95% confidence interval (CI): 0.43, 0.74; for obesity, aHR = 0.66, 95% CI: 0.48, 0.92), but not among never-smokers (overweight, aHR = 1.41, 95% CI: 0.66, 3.03; obesity, aHR = 1.29, 95% CI: 0.48, 3.48). An obesity paradox appeared to be absent among never-smokers with COPD. This, to our knowledge, novel finding might be explained by pathophysiological differences between smoking-related and nonsmoking COPD or by smoking-associated methodological biases.
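Effect modification of this kind is usually tested with product (interaction) terms in the survival model. A minimal sketch under assumed column names, using the lifelines library rather than the authors' actual code:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("nhanes_copd.csv")  # hypothetical analysis file

# 0/1 dummies for overweight and obese (normal weight as reference) and
# their products with ever-smoking; the product terms carry the test of
# whether smoking modifies the BMI-mortality association.
for cat in ("overweight", "obese"):
    df[f"{cat}_x_smoker"] = df[cat] * df["ever_smoker"]

cols = ["followup_years", "died", "overweight", "obese", "ever_smoker",
        "overweight_x_smoker", "obese_x_smoker", "age", "male",
        "fev1_pct_predicted", "survey_cycle"]
cph = CoxPHFitter().fit(df[cols], duration_col="followup_years",
                        event_col="died")
cph.print_summary()  # significant product terms -> effect modification
```

Stratum-specific aHRs like those reported are then obtained by refitting the model separately within ever- and never-smokers.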


PLoS ONE ◽  
2021 ◽  
Vol 16 (6) ◽  
pp. e0252719
Author(s):  
Achal P. Patel ◽  
Suril S. Mehta ◽  
Alexandra J. White ◽  
Nicole M. Niehoff ◽  
Whitney D. Arroyave ◽  
...  

Background Polycyclic aromatic hydrocarbons (PAHs) are ubiquitous organic compounds associated with chronic disease in epidemiologic studies, though the contribution of PAH exposure to fatal outcomes in the U.S. is largely unknown. Objectives We investigated associations of urinary hydroxylated PAH metabolites (OH-PAHs) with all-cause and cause-specific mortality in a representative sample of the U.S. population. Methods Study participants were ≥20 years old from the National Health and Nutrition Examination Survey 2001–2014. Concentrations (nmol/L) of eight OH-PAHs from four parent PAHs (naphthalene, fluorene, phenanthrene, pyrene) were measured in spot urine samples at examination. We identified all-cause, cancer-specific, and cardiovascular-specific deaths through 2015 using the National Death Index. We used Cox proportional hazards regression to estimate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs) for the association between ΣOH-PAHs and mortality endpoints. We assessed potential heterogeneity by age, gender, smoking status, poverty, and race/ethnicity. Additionally, we examined the overall mixture effect using quantile g-computation. Results In 9,739 eligible participants, there were 934 all-cause deaths, 159 cancer-specific deaths, and 108 cardiovascular-specific deaths (median follow-up 6.75 years). A log10 increase in ΣOH-PAHs was associated with higher all-cause mortality (HRadj = 1.39 [95% CI: 1.21, 1.61]), and possibly with cancer-specific mortality (HRadj = 1.15 [95% CI: 0.79, 1.69]) and cardiovascular-specific mortality (HRadj = 1.49 [95% CI: 0.94, 2.33]). We observed substantial effect modification by age, smoking status, gender, and race/ethnicity across mortality endpoints. Risk of cardiovascular mortality was higher for non-Hispanic blacks and those in poverty, indicating potential disparities. Quantile g-computation joint associations for a simultaneous quartile increase in OH-PAHs were HRadj = 1.15 [95% CI: 1.02, 1.31], HRadj = 1.41 [95% CI: 1.05, 1.90], and HRadj = 0.98 [95% CI: 0.66, 1.47] for all-cause, cancer-specific, and cardiovascular-specific mortalities, respectively. Discussion Our results support a role for total PAH exposure in all-cause and cause-specific mortality in the U.S. population.
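Quantile g-computation (implemented for R in the qgcomp package; the sketch below is a simplified Python analog with hypothetical column names) recodes each exposure into quantiles, fits one model containing all of them, and reads the joint mixture effect off the sum of the exposure coefficients:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("nhanes_ohpah.csv")  # hypothetical analysis file
ohpahs = ["oh_nap1", "oh_nap2", "oh_flu2", "oh_flu3",
          "oh_phe1", "oh_phe2", "oh_phe3", "oh_pyr1"]

# Recode each OH-PAH into quartiles scored 0-3.
for col in ohpahs:
    df[f"q_{col}"] = pd.qcut(df[col], 4, labels=False)

qcols = [f"q_{c}" for c in ohpahs]
cph = CoxPHFitter().fit(
    df[["followup_years", "died", "age", "male"] + qcols],
    duration_col="followup_years", event_col="died",
)

# Joint HR for a simultaneous one-quartile increase in every exposure:
# exp of the summed exposure coefficients.
print(np.exp(cph.params_[qcols].sum()))
```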


Author(s):  
Laurie Grieshober ◽  
Stefan Graw ◽  
Matt J. Barnett ◽  
Gary E. Goodman ◽  
Chu Chen ◽  
...  

Abstract Purpose The neutrophil-to-lymphocyte ratio (NLR) is a marker of systemic inflammation that has been reported to be associated with survival after chronic disease diagnoses, including lung cancer. We hypothesized that the inflammatory profile reflected by pre-diagnosis NLR, rather than the well-studied pre-treatment NLR at diagnosis, may be associated with increased mortality after lung cancer is diagnosed in high-risk heavy smokers. Methods We examined associations between pre-diagnosis methylation-derived NLR (mdNLR) and lung cancer-specific and all-cause mortality in 279 non-small cell lung cancer (NSCLC) and 81 small cell lung cancer (SCLC) cases from the β-Carotene and Retinol Efficacy Trial (CARET). Cox proportional hazards models were adjusted for age, sex, smoking status, pack years, and time between blood draw and diagnosis, and stratified by stage of disease. Models were run separately by histotype. Results Among SCLC cases, those with pre-diagnosis mdNLR in the highest quartile had 2.5-fold increased mortality compared to those in the lowest quartile. For each unit increase in pre-diagnosis mdNLR, we observed 22–23% increased mortality (SCLC-specific hazard ratio [HR] = 1.23, 95% confidence interval [CI]: 1.02, 1.48; all-cause HR = 1.22, 95% CI: 1.01, 1.46). SCLC associations were strongest for current smokers at blood draw (interaction Ps = 0.03). Increasing mdNLR was not associated with mortality among NSCLC overall, nor within adenocarcinoma (N = 148) or squamous cell carcinoma (N = 115) case groups. Conclusion Our findings suggest that increased mdNLR, representing a systemic inflammatory profile on average 4.5 years before a SCLC diagnosis, may be associated with mortality in heavy smokers who go on to develop SCLC, but not NSCLC.
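A stage-stratified Cox model with continuous mdNLR, fit within one histotype, might look like the sketch below (hypothetical column names; the CARET analysis code is not shown in the abstract):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("caret_mdnlr.csv")  # hypothetical analysis file
sclc = df[df["histotype"] == "SCLC"]

cph = CoxPHFitter()
cph.fit(
    sclc[["followup_years", "died", "mdnlr", "age", "male", "current_smoker",
          "pack_years", "years_blood_to_dx", "stage"]],
    duration_col="followup_years", event_col="died",
    strata=["stage"],  # each stage keeps its own baseline hazard
)
print(np.exp(cph.params_["mdnlr"]))  # HR per unit mdNLR (~1.22 reported)
```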


2021 ◽  
Vol 39 (15_suppl) ◽  
pp. 4530-4530
Author(s):  
Sarah Fleming ◽  
Dina Gifkins ◽  
Waleed Shalaby ◽  
Jianjun Gao ◽  
Philip Rosenberg ◽  
...  

4530 Background: Fibroblast growth factor receptor alterations (FGFRa) appear in approximately 15% of cases of metastatic urothelial carcinoma (mUC). Data on whether FGFRa in mUC have a prognostic impact or predictive benefit for particular treatments have been limited by small sample sizes. The objective of this study was to evaluate the association between tumor FGFRa and clinical outcomes of patients with advanced UC or mUC regardless of therapy type and status. Methods: A convenience sample of oncologists and urologists across the United States provided patient-level data on 400 patients with stage IIIb or IV UC via a standardized questionnaire over a 1-month period (August 17, 2020 – September 20, 2020). The study design enriched for FGFRa by requiring each physician to provide ≥1 FGFRa patient record. The questionnaire covered physician characteristics, patient demographic information, FGFR status, therapy given, response, and clinical and radiographic measures of progression. Patient records were eligible for inclusion if the patients were identified and treated between July 1, 2017, and June 30, 2019. Cox proportional hazards models were used to estimate adjusted risk of disease progression by FGFR status. Results: A total of 104 physicians (58.7% medical oncologists, 31.7% hematologic oncologists, and 9.6% urologic oncologists) contributed 414 patient records. Overall, 73.9% of the patients were male and the average age was 64.5 years (SD ±10.6). Median follow-up was 15 months. Of the 414 patients, 218 (52.7%) had FGFRa and 196 (47.3%) had FGFR wild-type (FGFRwt) mUC. Of the 218 patients with FGFRa, 47.2% were treated with front-line chemo, 27.5% with a programmed death-ligand 1 (PD-L1) inhibitor, 11.5% with chemo + a PD-L1 inhibitor, and 13.8% with other treatments. Of the 196 FGFRwt patients, 63.2% were treated with front-line chemo, 21.9% with a PD-L1 inhibitor, 12.2% with chemo + a PD-L1 inhibitor, and 2.6% with other treatments. There was no difference in response or progression status for those receiving front-line chemo (HR, 1.15; 95% CI, 0.86-1.55). Among the 97 patients (55 FGFRa and 42 FGFRwt) who received a PD-L1 inhibitor alone as front-line therapy, those with FGFRa had an adjusted risk of progression 2 times higher than their FGFRwt counterparts (HR, 2.12; 95% CI, 1.13-4.00). Conclusions: Patients with FGFRa mUC progressed earlier than FGFRwt patients when treated with front-line PD-L1 inhibitors; however, there was no difference in progression by FGFR status among patients treated with chemo. This real-world study using a survey design efficiently generated a relatively large FGFRa dataset, mitigating a core limitation of other studies assessing the patient population with FGFRa. Further work is warranted to validate these results and determine the optimal strategy for treating patients with FGFRa mUC. Gene expression profiling of FGFRa mUC samples from clinical trials will help determine the potential impact of subtype or other features that may associate with benefit from therapy.


2019 ◽  
Vol 166 (E) ◽  
pp. e3-e7 ◽  
Author(s):  
Rosalie Heller ◽  
H Stammers

Introduction The 1.5-mile best-effort run is used in the British Army to assess the fitness of all recruits and trained service personnel by means of the physical fitness assessment (PFA). The 1.5-mile run is a basic measure of fitness, and slower times have been associated with an increased risk of musculoskeletal injury (MSkI), particularly during the early stages of training. The aim of this study was to establish whether 1.5-mile run times were associated with subsequent MSkIs among female recruits during their 14-week basic training. Method Retrospective data were analysed from female recruits who had undertaken basic military training between June 2016 and October 2017. This included retrieving the results of their week 1 PFA; recording the type, cause and week of any MSkI sustained; and recording their outcome from basic training. Run times of 227 female recruits were analysed in relation to MSkI occurrence using binomial logistic regression with an alpha level of 0.05. Results The 1.5-mile run time predicted risk of MSkI (χ2(1)=12.91, p<0.0005) in female recruits. The mean run time for injury-free recruits was faster than for injured recruits (12 min 13 s compared with 12 min 43 s). Every 10 s increase in run time was associated with an 8.3% increase in risk of injury. Conclusion A slower 1.5-mile best-effort run time, as a surrogate of aerobic fitness, is associated with increased risk of MSkI in female recruits during basic training.
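The 8.3%-per-10-seconds figure maps directly onto the logistic model's run-time coefficient. The back-calculation below is inferred from the reported result, not taken from the paper:

```latex
\[
  \log\frac{p}{1-p} = \beta_0 + \beta_1 t
  \quad\Rightarrow\quad
  \mathrm{OR}_{+10\,\mathrm{s}} = e^{10\beta_1} = 1.083
  \quad\Rightarrow\quad
  \beta_1 = \frac{\ln 1.083}{10} \approx 0.0080 \text{ per second}
\]
```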


PLoS ONE ◽  
2021 ◽  
Vol 16 (11) ◽  
pp. e0259212
Author(s):  
Joungyoun Kim ◽  
Sang-Jun Shin ◽  
Hee-Taik Kang

Background The triglyceride-glucose (TyG) index is a reliable indicator of insulin resistance. We aimed to investigate the TyG index in relation to cardio-cerebrovascular diseases (CCVDs) and mortality. Methods This retrospective study included 114,603 subjects. The TyG index was categorized into four quartiles by sex: Q1, <8.249 and <8.063; Q2, 8.249‒<8.614 and 8.063‒<8.403; Q3, 8.614‒<8.998 and 8.403‒<8.752; and Q4, ≥8.998 and ≥8.752, in men and women, respectively. To calculate hazard ratios (HRs) and 95% confidence intervals (CIs) for the primary outcomes (CCVDs and all-cause mortality) and secondary outcomes (cardiovascular diseases [CVDs], cerebrovascular diseases [CbVDs], CCVD-related deaths, or all-cause deaths), Cox proportional hazards regression models were used. Results Compared to Q1, the HRs (95% CIs) for the primary outcomes of Q2, Q3, and Q4 were 1.062 (0.981‒1.150), 1.110 (1.024−1.204), and 1.151 (1.058−1.252) in men and 1.099 (0.986−1.226), 1.046 (0.938−1.166), and 1.063 (0.954−1.184) in women, respectively, after adjustment for age, smoking status, drinking status, physical activity, body mass index, systolic blood pressure, low-density lipoprotein cholesterol, economic status, and anti-hypertensive medications. Fully adjusted HRs (95% CIs) for CVDs of Q2, Q3, and Q4 were 1.114 (0.969−1.282), 1.185 (1.031−1.363), and 1.232 (1.068−1.422) in men and 1.238 (1.017−1.508), 1.183 (0.971−1.440), and 1.238 (1.018−1.505) in women, respectively. The adjusted HRs (95% CIs) for ischemic CbVDs of Q2, Q3, and Q4 were 1.005 (0.850−1.187), 1.225 (1.041−1.441), and 1.232 (1.039−1.460) in men and 1.040 (0.821−1.316), 1.226 (0.981−1.532), and 1.312 (1.054−1.634) in women, respectively, while the TyG index was negatively associated with hemorrhagic CbVDs in women but not in men. The TyG index was not significantly associated with CCVD-related death or all-cause death in either sex. Conclusions An elevated TyG index was positively associated with the primary outcomes (CCVDs and all-cause mortality) in men and predicted higher risk of CVDs and ischemic CbVDs in both sexes.
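The TyG index is conventionally computed as ln(fasting triglycerides [mg/dL] × fasting glucose [mg/dL] / 2). A minimal sketch of deriving the index and the sex-specific quartiles used above, with hypothetical column names:

```python
import numpy as np
import pandas as pd

df = pd.read_csv("cohort.csv")  # hypothetical analysis file

# TyG = ln(fasting triglycerides [mg/dL] x fasting glucose [mg/dL] / 2)
df["tyg"] = np.log(df["triglycerides_mgdl"] * df["glucose_mgdl"] / 2)

# Quartiles are formed within each sex, matching the sex-specific
# cutoffs quoted in the abstract (0 = Q1 ... 3 = Q4).
df["tyg_q"] = df.groupby("sex")["tyg"].transform(
    lambda s: pd.qcut(s, 4, labels=False)
)
```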


OBJECTIVE The challenges of posterior cervical fusions (PCFs) at the cervicothoracic junction (CTJ) are widely known, including the development of adjacent-segment disease by stopping fusions at C7. One solution has been to cross the CTJ (T1/T2) rather than stopping at C7. This approach may have undue consequences, including increased reoperations for symptomatic nonunion (operative nonunion). The authors sought to investigate if there is a difference in operative nonunion in PCFs that stop at C7 versus T1/T2. METHODS A retrospective analysis identified patients from the authors’ spine registry (Kaiser Permanente) who underwent PCFs with caudal fusion levels at C7 and T1/T2. Demographics, diagnoses, operative times, lengths of stay, and reoperations were extracted from the registry. Operative nonunion was adjudicated via chart review. Patients were followed until validated operative nonunion, membership termination, death, or end of study (March 31, 2020). Descriptive statistics and 2-year crude incidence rates and 95% confidence intervals for operative nonunion for PCFs stopping at C7 or T1/T2 were reported. Time-dependent crude and adjusted multivariable Cox proportional hazards models were used to evaluate operative nonunion rates. RESULTS The authors identified 875 patients with PCFs (beginning at C3, C4, C5, or C6) stopping at either C7 (n = 470) or T1/T2 (n = 405) with a mean follow-up time of 4.6 ± 3.3 years and a mean time to operative nonunion of 0.9 ± 0.6 years. There were 17 operative nonunions, and, after adjustment for age at surgery and smoking status, the cumulative incidence rates were similar between constructs stopping at C7 and those that extended to T1/T2 (C7: 1.91% [95% CI 0.88%–3.60%]; T1/T2: 1.98% [95% CI 0.86%–3.85%]). In the crude model and the model adjusted for age at surgery and smoking status, no difference in risk was found for constructs extended to T1/T2 compared to those stopping at C7 (adjusted HR 1.09 [95% CI 0.42–2.84], p = 0.86). CONCLUSIONS In one of the largest cohorts of patients with PCFs stopping at C7 or T1/T2, with an average follow-up of > 4 years, the authors found no statistically significant difference in reoperation rates for symptomatic nonunion (operative nonunion). This finding suggests that extending PCFs to T1/T2 rather than stopping at C7 adds no risk of operative nonunion.
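A 2-year crude incidence proportion with an exact (Clopper-Pearson) 95% CI can be computed as below. The event counts are back-calculated from the reported percentages (9/470 gives 1.91%) and are illustrative only; the registry analysis may have used a different estimator:

```python
from statsmodels.stats.proportion import proportion_confint

events, n = 9, 470  # back-calculated: operative nonunions among C7 constructs
rate = events / n
lo, hi = proportion_confint(events, n, alpha=0.05, method="beta")  # exact CI
print(f"{100 * rate:.2f}% (95% CI {100 * lo:.2f}%-{100 * hi:.2f}%)")
```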


Author(s):  
Howard G. Wilshire ◽  
Richard W. Hazlett ◽  
Jane E. Nielson

Since 1900, United States troops have fought in more foreign conflicts than any other nation on Earth. Most Americans supported those actions, believing that they would keep the scourge of war far from our homes. But the strategy seems to have failed—it certainly did not prevent terror attacks against the U.S. mainland. The savage Oklahoma City bombing in 1995 and the 11 September 2001 (9/11) attacks on New York and Washington, D.C. were not the first to inflict war damage in America’s 48 contiguous states, however—nor were they the first warlike actions to harm innocent citizens since the Civil War. Paradoxically, making war abroad has always required practicing warfare in our own back yards. Today’s large, mechanized military training exercises have degraded U.S. soils, water supplies, and wildlife habitats in the same ways that the real wars affected war-torn lands far away. The saddest fact of all is that the deadly components of some weapons in the U.S. arsenal never found use in foreign wars but have attacked U.S. citizens in their own homes and communities. The relatively egalitarian universal service of World War II left a whole generation of Americans with nostalgia and reverence for military service. Many of us, perhaps the majority, might argue that human and environmental sacrifices are the price we must be willing to pay to protect our interests and future security. A current political philosophy proposes that the United States must even start foreign wars to protect Americans and their homes. But Americans are not fully aware of all the past sacrifices—and what we don’t know can hurt us. Even decades-old impacts from military training still degrade land and contaminate air and water, particularly in the arid western states, and will continue to do so far into the future. Exploded and unexploded bombs, mines, and shells (“ordnance,” in military terms) and haphazard disposal sites still litter former training lands in western states. And large portions of the western United States remain playgrounds for war games, subject to large-scale, highly mechanized military operations for maintaining combat readiness and projecting American power abroad.


2019 ◽  
Vol 15 (1) ◽  
pp. 101-108 ◽  
Author(s):  
Guofen Yan ◽  
Jenny I. Shen ◽  
Rubette Harford ◽  
Wei Yu ◽  
Robert Nee ◽  
...  

Background and objectives In the United States, mortality rates for patients treated with dialysis differ by racial and/or ethnic (racial/ethnic) group. Mortality outcomes for patients undergoing maintenance dialysis in the United States territories may differ from those of patients in the United States 50 states. Design, setting, participants, & measurements This retrospective cohort study using US Renal Data System data included 1,547,438 adults with no prior transplantation and first dialysis treatment between April 1, 1995 and September 28, 2012. Cox proportional hazards regression was used to calculate hazard ratios (HRs) of death for the territories versus the 50 states for each racial/ethnic group using the whole cohort and covariate-matched samples. Covariates included demographics, year of dialysis initiation, cause of kidney failure, comorbid conditions, dialysis modality, and many others. Results Of 22,828 patients treated in the territories (American Samoa, Guam, Puerto Rico, Virgin Islands), 321 were white, 666 were black, 20,299 were Hispanic, and 1,542 were Asian. Of 1,524,610 patients in the 50 states, 838,736 were white, 444,066 were black, 182,994 were Hispanic, and 58,814 were Asian. The crude mortality rate (deaths per 100 patient-years) was lower for whites in the territories than in the 50 states (14 and 29, respectively), similar for blacks (18 and 17, respectively), higher for Hispanics (27 and 16, respectively), and higher for Asians (22 and 15, respectively). In matched analyses, greater risks of death remained for Hispanics (HR, 1.65; 95% confidence interval, 1.60 to 1.70; P<0.001) and Asians (HR, 2.01; 95% confidence interval, 1.78 to 2.27; P<0.001) living in the territories versus their matched 50-states counterparts. There were no significant differences in mortality among white or black patients in the territories versus the 50 states. Conclusions Mortality rates for patients undergoing dialysis in the United States territories differ substantially by race/ethnicity compared with the 50 states. After matching on comparable age and risk factors, mortality risk no longer differed for whites or blacks, but remained much greater for territory-dwelling Hispanics and Asians.
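A covariate-matched comparison of this kind can be sketched as exact matching on key covariates followed by a Cox model on the matched sample. A simplified illustration with hypothetical column names; the published analysis used a more careful matching procedure:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("usrds.csv")  # hypothetical analysis file
keys = ["age_group", "sex", "init_year", "cause_esrd", "diabetes", "modality"]

# Within one racial/ethnic group, pair territory patients with 50-states
# patients who have identical values on every matching key.
grp = df[df["race_ethnicity"] == "Hispanic"]
terr = grp[grp["territory"] == 1]
pool = grp[grp["territory"] == 0]

controls = []
for key, cases in terr.groupby(keys):
    match = pool[(pool[keys] == pd.Series(key, index=keys)).all(axis=1)]
    controls.append(match.sample(min(len(cases), len(match)), random_state=0))
matched = pd.concat([terr] + controls)

cph = CoxPHFitter().fit(
    matched[["followup_years", "died", "territory"]],
    duration_col="followup_years", event_col="died",
)
cph.print_summary()  # HR for territories vs. matched 50-states patients
```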


2016 ◽  
Vol 20 (1) ◽  
pp. 82-91 ◽  
Author(s):  
Giuseppe Grosso ◽  
Urszula Stepaniak ◽  
Agnieszka Micek ◽  
Denes Stefler ◽  
Martin Bobak ◽  
...  

Abstract Objective To test the association between coffee consumption and risk of all-cause, CVD and cancer death in a European cohort. Design Prospective cohort study. Cox proportional hazards models, with adjustment for potential confounders, were used to estimate multivariable hazard ratios (HR) and 95 % CI. Setting Czech Republic, Russia and Poland. Subjects A total of 28 561 individuals followed for 6·1 years. Results A total of 2121 deaths (43·1 % CVD and 35·7 % cancer mortality) occurred during the follow-up. Consumption of 3–4 cups coffee/d was associated with lower mortality risk in men (HR=0·83; 95 % CI 0·71, 0·99) and women (HR=0·63; 95 % CI 0·47, 0·84), while higher intake showed non-significant reduced risk estimates (HR=0·71; 95 % CI 0·49, 1·04 and HR=0·51; 95 % CI 0·24, 1·10 in men and women, respectively). Decreased risk of CVD mortality was also found in men (HR=0·71; 95 % CI 0·54, 0·93) for consumption of 3–4 cups coffee/d. Stratified analysis revealed that consumption of a similar amount of coffee was associated with decreased risk of all-cause (HR=0·61; 95 % CI 0·43, 0·87) and cancer mortality (HR=0·59; 95 % CI 0·35, 0·99) in non-smoking women, and with decreased risk of all-cause mortality for >4 cups coffee/d in men with no/moderate alcohol intake. Conclusions Coffee consumption was associated with decreased risk of mortality. The protective association was even stronger in analyses stratified by smoking status and alcohol intake.

