The Effectiveness of Neuromuscular Warmups for Lower Extremity Injury Prevention in Basketball: A Systematic Review

2021 · Vol 7 (1)
Author(s): Anna C. Davis, Nicholas P. Emptage, Dana Pounds, Donna Woo, Robert Sallis, et al.

Abstract Background Neuromuscular warmups have gained increasing attention as a means of preventing sports-related injuries, but data on their effectiveness in basketball are sparse. The objective of this systematic review was to evaluate evidence of the effectiveness of neuromuscular warmup-based strategies for preventing lower extremity injuries among basketball athletes. Methods PubMed and Cochrane Library databases were searched in February 2019. Studies were included if they were English-language randomized controlled, non-randomized comparative, or prospective cohort trials; tested neuromuscular and/or balance-focused warmup interventions among basketball players; and assessed at least one type of lower extremity injury as a primary outcome. Criteria developed by the US Preventive Services Task Force (USPSTF) were used to appraise study quality, and GRADE was used to appraise the body of evidence for each outcome. Due to heterogeneity in the included studies, meta-analyses could not be performed. Results In total, 825 titles and abstracts were identified. Of the 13 studies that met inclusion criteria, five were balance interventions (3 randomized controlled trials) and eight were multicomponent interventions involving multiple categories of dynamic neuromuscular warmup (5 randomized controlled trials). Authors of four of the studies were contacted to obtain outcome data specific to basketball athletes. Basketball-specific results from the studies suggest significant protective effects for the following lower extremity injuries: ankle injuries (significant in 4 of the 9 studies that assessed this outcome), ACL injuries (2 of 4 studies), knee injuries generally (1 of 5 studies), and overall lower extremity injuries (5 of 7 studies). All but one of the non-significant results were directionally favorable. Evidence was moderate for the effect of multicomponent interventions on lower extremity injuries generally. For all other outcomes, and for balance-based interventions, the quality of evidence was rated as low. Conclusion Overall, the evidence supports neuromuscular warmups for lower extremity injury prevention among basketball players. However, most studies were underpowered, some used lower-quality study designs, and outcome and exposure definitions varied. Due to the nature of the study designs, effects could not be attributed to specific intervention components. More research is needed to identify the most effective bundle of warmup activities.
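Since heterogeneity ruled out meta-analysis, the review's summary amounts to vote counting across studies. A minimal sketch of that tabulation, using only the counts quoted in the abstract; the code is illustrative, not the authors' analysis:

```python
# Vote-count summary of the basketball-specific results quoted above.
# Counts come straight from the abstract; the tabulation is illustrative.
outcomes = {
    "ankle injuries": (4, 9),               # (significant studies, studies assessing outcome)
    "ACL injuries": (2, 4),
    "knee injuries generally": (1, 5),
    "overall lower extremity injuries": (5, 7),
}

for name, (sig, total) in outcomes.items():
    print(f"{name}: {sig}/{total} studies significant ({sig / total:.0%})")
```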

2021 · Vol 9 (5) · pp. 232596712110034
Author(s): Toufic R. Jildeh, Fabien Meta, Jacob Young, Brendan Page, Kelechi R. Okoroha

Background: Impaired neuromuscular function after concussion has recently been linked to increased risk of lower extremity injuries in athletes. Purpose: To determine whether National Football League (NFL) athletes have an increased risk of sustaining an acute, noncontact lower extremity injury in the 90-day period after return to play (RTP) and whether on-field performance differs pre- and postconcussion. Study Design: Cohort study; Level of evidence, 3. Methods: NFL concussions in offensive players from the 2012-2013 to the 2016-2017 seasons were studied. Age, position, injury location/type, RTP, and athlete factors were noted. A 90-day RTP postconcussive period was analyzed for lower extremity injuries. Concussion and injury data were obtained from publicly available sources. Nonconcussed, offensive skill position NFL athletes from the same period were used as a control cohort, with the 2014 season as the reference season. Power rating performance metrics were calculated for ±1, ±2, and ±3 seasons pre- and postconcussion. Conditional logistic regression was used to determine associations between concussion and lower extremity injury as well as the relationship of concussion to on-field performance. Results: In total, 116 concussions were recorded in 108 NFL athletes during the study period. There was no statistically significant difference in the incidence of an acute, noncontact lower extremity injury between concussed and control athletes (8.5% vs 12.8%; P = .143), corresponding to an odds ratio of 0.573 (95% CI, 0.270-1.217). Days missed (66.4 ± 81.9 vs 45.1 ± 69.2; P = .423) and games missed (3.67 ± 3.0 vs 2.9 ± 2.7; P = .470) after a lower extremity injury were similar between concussed and control athletes. No significant changes in power ratings were noted in concussed athletes in the acute period (±1 season from injury) when comparing pre- and postconcussion performance. Conclusion: Concussed NFL offensive athletes did not demonstrate increased odds of acute, noncontact lower extremity injury in a 90-day RTP period when compared with nonconcussed controls. Immediate on-field performance of skill position players did not appear to be affected by concussion.
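For context on the headline estimate, a crude odds ratio can be computed from a 2×2 table. The sketch below uses hypothetical counts consistent with the quoted percentages (the abstract reports only 8.5% vs 12.8%); the study's conditional logistic regression accounts for matching, so this crude estimate will not reproduce the reported OR of 0.573:

```python
# Crude 2x2 odds-ratio sketch; counts are hypothetical placeholders.
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

#                  injured  not injured
table = np.array([[   10,       106],     # concussed exposures (~8.6% of 116)
                  [   60,       408]])    # control exposures (~12.8%, hypothetical n)

t = Table2x2(table)
print(f"crude OR = {t.oddsratio:.3f}")
print("95% CI =", tuple(round(x, 3) for x in t.oddsratio_confint()))
```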


Author(s): Alhassan Abass, Lawrence Quaye, Yussif Adams

Aim: This study aims to determine the pattern and severity of upper and lower extremity injuries from motorcycle accidents in the Tamale metropolis, Ghana. Methods: A retrospective hospital-based study comprising data on 190 motorcycle accident victims at the Accident and Emergency Centres of three major hospitals (Tamale Teaching Hospital, Central and West Hospitals) in the Tamale metropolis from February to April 2018. Demographic data, injury type, injury location, use of crash helmet, and injury outcomes were retrieved from the medical records registry. Data were analysed using SPSS version 23.0. Categorical variables were compared using the Chi-square test, and one-way ANOVA was used to compare groups. Results: Of the 190 victims, 78.9% were treated and discharged, 17.4% were disabled, and 3.7% died. Injury mechanism was significantly (F-test = 22.64, p = 0.00) linked with injury outcome. Victims who had a frontal impact collision and died (71.4%) significantly (p < 0.05) outnumbered those who were treated and discharged (26.7%). Accident victims with upper extremity injury who became disabled (18.2%) were more numerous (p < 0.05) than those treated and discharged (16.7%). Of the 190 victims, 64.2% were not wearing a crash helmet. There was a significant relationship (p < 0.05) between crash helmet use and injury outcome. Disability was more frequent (23.0%) among victims not wearing a crash helmet, and none (0.0%) of those who died had been wearing one. Conclusion: Upper and lower extremity injuries, as well as head and neck injuries, were high among motorcycle accident victims. The study recommends capacity building for healthcare professionals to manage head, neck, and upper and lower extremity injuries at the Accident and Emergency Centres. Regular training programs should be conducted by law enforcement authorities in northern Ghana to train motorcycle riders and educate them on road traffic regulations. Compliance with crash helmet use by motorcyclists should be strongly enforced. Further prospective studies are needed to delineate these injury patterns and ascertain the reasons behind non-use of crash helmets by motorcyclists in the metropolis.
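As an illustration of the chi-square comparison described in the methods, the sketch below tests helmet use against injury outcome. The cell counts are hypothetical placeholders chosen to match the abstract's marginals (190 victims: 150 discharged, 33 disabled, 7 died; 122 unhelmeted; no deaths among helmet wearers):

```python
# Chi-square test of helmet use vs injury outcome; hypothetical cell counts
# consistent with the abstract's totals.
import numpy as np
from scipy.stats import chi2_contingency

#                    discharged  disabled  died
observed = np.array([[  87,         28,      7],   # no helmet (n = 122)
                     [  63,          5,      0]])  # helmet (n = 68)

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```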


2020 · Vol 48 (9) · pp. 2287-2294
Author(s): Christina D. Mack, Richard W. Kent, Michael J. Coughlin, Kristin Y. Shiue, Leigh J. Weiss, et al.

Background: Lower extremity injuries are the most common injuries in professional sports and carry a high burden for players and teams in the National Football League (NFL). Injury prevention strategies can be refined by a foundational understanding of the occurrence and effect of these injuries on NFL players. Purpose: To determine the incidence of specific lower extremity injuries sustained by NFL players across 4 NFL seasons. Study Design: Descriptive epidemiology study. Methods: This retrospective, observational study included all time-loss lower extremity injuries that occurred in football-related activities during the 2015 through 2018 seasons. Injury data were collected prospectively through a leaguewide electronic health record (EHR) system and linked with NFL game statistics and player participation to calculate injury incidence per season and per 10,000 player-plays, for lower extremity injuries overall and for specific injuries. Days lost due to injury were estimated through 2018 for injuries occurring in the 2015 to 2017 seasons. Results: An average of 2006 time-loss lower extremity injuries were reported each season over this 4-year study, representing a 1-season risk of 41% for an NFL player. Incidence was stable from 2015 to 2018, with an estimated total missed-time burden each NFL season of approximately 56,700 player-days lost. Most (58.7%) of these injuries occurred during games, with a higher overall rate of injuries observed in the preseason than in the regular season (11.5 vs 9.4 injuries per 10,000 player-plays in games). The knee was the most commonly injured lower extremity region (29.3% of lower body injuries), followed by the ankle (22.4%), thigh (17.2%), and foot (9.1%). Hamstring strains were the most common lower extremity injury, followed by lateral ankle sprains, adductor strains, high ankle sprains, and medial collateral ligament tears. Conclusion: Lower extremity injuries affect a high number of NFL players, and the incidence did not decrease over the 4 seasons studied. Prevention and rehabilitation protocols for these injuries should continue to be prioritized.
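A small sketch of the rate arithmetic behind "injuries per 10,000 player-plays"; the play counts here are hypothetical placeholders, while the reported game rates (11.5 preseason vs 9.4 regular season) come from the study:

```python
# Injury incidence per 10,000 player-plays; play counts are hypothetical.
def rate_per_10k(injuries: int, player_plays: int) -> float:
    """Incidence per 10,000 player-plays."""
    return injuries / player_plays * 10_000

# e.g., 1,880 hypothetical game injuries over 2,000,000 player-plays
print(f"{rate_per_10k(1_880, 2_000_000):.1f} injuries per 10,000 player-plays")
```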


2019 · Vol 7 (3_suppl) · pp. 2325967119S0014
Author(s): Marissa Mastrocola, Amanda McCoy, Rubello Gleeson

PURPOSE: The purpose of this investigation is to examine the relationship between forced sedentariness attributed to the management of a lower extremity injury and change in BMI in pediatric patients. INTRODUCTION: Over the past 30 years, childhood obesity has significantly increased. Lack of physical activity and rising levels of sedentary living are associated with rising childhood obesity. As the effects of childhood obesity are long lasting, it is important to characterize childhood conditions that lead to increases in BMI. Traumatic lower extremity injuries in children commonly result in casting and forced immobilization, and children and adolescents who cease physical activity at a younger age are at higher risk for weight gain and inactive lifestyles as adults. Presently, data examining the short-term and longitudinal effects of injury on weight changes are lacking. METHODS: Eighty-eight subjects aged 5 to 18 years with lower extremity fractures managed with non-weight bearing for a minimum of 6 weeks were identified. This group was compared with a cohort of controls who presented to the pediatric orthopaedic clinic for upper extremity injuries. For each subject, BMI was calculated from height and weight data obtained at each clinic visit, with data available for a minimum of 6 months. Percent change in BMI was then calculated and compared between the lower extremity injury group and controls using ANOVA. Logistic regression was then performed to determine whether age, gender, and undergoing surgery were correlated with a BMI change of greater than 10%. RESULTS: The initial ANOVA demonstrated no difference in BMI trends between children with lower extremity injuries and controls. On average, both groups showed a modest increase in BMI over time (lower extremity injuries = 3.40%, controls = 3.23%). Logistic analysis of the lower extremity group revealed male gender to be associated with a BMI increase of greater than 10% during the study period, though this association was not significant (p = 0.153). CONCLUSIONS: Our investigation yielded a negative result. As there is limited literature on BMI trends in children, our power calculation may have underestimated the number of subjects needed to demonstrate a statistically significant difference between groups. SIGNIFICANCE: The increases observed in both groups may indicate that the experience of being injured is associated with increases in BMI regardless of the location of injury. Further investigation would involve assessing BMI trends prospectively to identify opportunities to obviate possible weight gain with injury.
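A minimal sketch of the two analyses described (ANOVA on percent BMI change, then logistic regression on a >10% BMI increase), using synthetic placeholder data rather than the study's records:

```python
# ANOVA on percent BMI change, then logistic regression on a >10% increase.
# All data are synthetic placeholders; group means echo the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
n = 88
df = pd.DataFrame({
    "pct_bmi_change": rng.normal(3.4, 5.0, n),   # lower extremity injury group
    "age": rng.integers(5, 19, n),
    "male": rng.integers(0, 2, n),
    "surgery": rng.integers(0, 2, n),
})
controls = rng.normal(3.2, 5.0, n)               # upper extremity controls

print(f_oneway(df["pct_bmi_change"], controls))  # between-group comparison

df["big_gain"] = (df["pct_bmi_change"] > 10).astype(int)
model = smf.logit("big_gain ~ age + male + surgery", data=df).fit(disp=0)
print(np.exp(model.params))                      # odds ratios
```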


2017 · Vol 52 (11) · pp. 1028-1034
Author(s): Robert C. Lynall, Timothy C. Mauntel, Ryan T. Pohlig, Zachary Y. Kerr, Thomas P. Dompier, et al.

Context:  Although an association between concussion and musculoskeletal injury has been described in collegiate and professional athletes, no researchers have investigated this association in younger athletes. Objective:  To determine whether concussion in high school athletes increased the risk for lower extremity musculoskeletal injury after return to activity. Design:  Observational cohort study. Setting:  One hundred ninety-six high schools across 26 states. Patients or Other Participants:  We used data from the National Athletic Treatment, Injury and Outcomes Network surveillance system. Athletic trainers provided information about sport-related concussions and musculoskeletal injuries in athletes in 27 sports, along with missed activity time due to these injuries. Main Outcome Measure(s):  Three generalized estimating equation (GEE) models were fit to predict the odds of sustaining (1) any lower extremity injury, (2) a time-loss lower extremity injury, or (3) a non–time-loss lower extremity injury after concussion. Predictors were the total number of previous injuries, number of previous concussions, number of previous lower extremity injuries, number of previous upper extremity injuries, and sport contact classification. Results:  The initial dataset contained data from 18 216 athletes (females = 39%, n = 6887) and 46 217 injuries. Lower extremity injuries accounted for most injuries (56.3%), and concussions for 4.3% of total injuries. For every previous concussion, the odds of sustaining a subsequent time-loss lower extremity injury increased 34% (odds ratio [OR] = 1.34; 95% confidence interval [CI] = 1.13, 1.60). The number of previous concussions had no effect on the odds of sustaining any subsequent lower extremity injury (OR = 0.97; 95% CI = 0.89, 1.05) or a non–time-loss injury (OR = 1.01; 95% CI = 0.92, 1.10). Conclusions:  Among high school athletes, concussion increased the odds of sustaining subsequent time-loss lower extremity injuries but not non–time-loss injuries. By definition, time-loss injuries may be considered more severe than non–time-loss injuries. The exact mechanism underlying the increased risk of lower extremity injury after concussion remains elusive and should be further explored in future research.
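A sketch of a binomial GEE in the spirit of the models described, clustering repeated records on athlete and predicting time-loss lower extremity injury from prior concussion count; all data are synthetic placeholders, with the simulated effect seeded near the reported OR of 1.34:

```python
# Binomial GEE predicting time-loss lower extremity injury from prior
# concussions, clustering on athlete. Synthetic placeholder data;
# log(1.34) ~= 0.29 seeds the simulated effect.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "athlete_id": np.arange(n) // 2,             # two records per athlete
    "prev_concussions": rng.poisson(0.2, n),
    "prev_injuries": rng.poisson(1.5, n),
    "contact_sport": rng.integers(0, 2, n),
})
logit = -1.5 + 0.29 * df["prev_concussions"]
df["timeloss_le_injury"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.gee(
    "timeloss_le_injury ~ prev_concussions + prev_injuries + contact_sport",
    groups="athlete_id", data=df, family=sm.families.Binomial(),
).fit()
print(np.exp(model.params))                      # odds ratios per predictor
```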


Author(s): John C. Garner, Lesley R. Parrish, Kimberly R. Shaw, Samuel J. Wilson, Paul T. Donahue

Background of Study: Females generally have a 6-8 times higher risk of lower extremity injury than male counterparts due to biomechanical differences and/or poor landing strategies. In recent years, a great deal of focus has been placed on the prevention and reduction of non-contact lower extremity injuries. This has spurred the development of assessment methods to determine how athletes move and tools with which those motions are measured. Efforts have been made to measure and quantify movement strategies, giving rise to multiple movement tests and measurement devices. One approach is the use of wearable technologies in conjunction with a movement screening. Objective: To demonstrate a practical approach to using wearable technologies to guide training regimens in a population of female athletes considered at risk for lower extremity injuries. Methods: A cohort of Division I female volleyball athletes was screened using wearable technology and then assigned an intervention based on the screening results. Injury rates during the intervention season were compared with those from previous seasons. Results: All lower extremity injury rates were reduced after the intervention was applied. Conclusions: The use of wearable technology helps quantify movement so that a strategic intervention can be assigned to reduce injuries in an at-risk athletic population.
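A minimal sketch of the season-over-season rate comparison described; all counts and exposure figures are hypothetical placeholders:

```python
# Season-over-season injury-rate comparison; all figures hypothetical.
def rate_per_1000(injuries: int, athlete_exposures: int) -> float:
    return injuries / athlete_exposures * 1_000

prior = rate_per_1000(24, 3_000)            # hypothetical prior season
post = rate_per_1000(14, 3_000)             # hypothetical intervention season
print(f"rate ratio = {post / prior:.2f}")   # < 1.0 indicates fewer injuries
```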


2018 · Vol 47 (1) · pp. 189-196
Author(s): Christina D. Mack, Elliott B. Hershman, Robert B. Anderson, Michael J. Coughlin, Andrew S. McNitt, et al.

Background: Biomechanical studies have shown that synthetic turf surfaces do not release cleats as readily as natural turf, and it has been hypothesized that the concomitant increased loading on the foot contributes to the incidence of lower body injuries. This study evaluates this hypothesis from an epidemiologic perspective, examining whether the lower extremity injury rate in National Football League (NFL) games is greater on contemporary synthetic turfs than on natural surfaces. Hypothesis: The incidence of lower body injury is higher on synthetic turf than on natural turf among elite NFL athletes playing on modern-generation surfaces. Study Design: Cohort study; Level of evidence, 3. Methods: Lower extremity injuries reported during 2012-2016 regular season games were included, with all 32 NFL teams reporting injuries under mandated, consistent data collection guidelines. Poisson models were used to construct crude and adjusted incidence rate ratios (IRRs) to estimate the influence of surface type on lower body injury groupings (all lower extremity, knee, ankle/foot) for any injury reported as causing a player to miss football participation, as well as for injuries resulting in ≥8 days missed. A secondary analysis was performed on noncontact/surface contact injuries. Results: Play on synthetic turf resulted in a 16% higher rate of lower extremity injuries per play than play on natural turf (IRR, 1.16; 95% CI, 1.10-1.23). This association between synthetic turf and injury remained when injuries were restricted to those resulting in ≥8 days missed, as well as when categorizations were narrowed to distal injuries anatomically closer to the playing surface (knee, ankle/foot). The higher rate of injury on synthetic turf was notably stronger when injuries were restricted to noncontact/surface contact injuries (IRRs, 1.20-2.03; all statistically significant). Conclusion: These results support the hypothesized biomechanical mechanism and add confidence to the conclusion that synthetic turf surfaces have a causal impact on lower extremity injury.
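A sketch of the Poisson rate model underlying an IRR of this kind, with log(plays) as an offset; the injury and play counts are hypothetical placeholders chosen so the crude IRR lands near the reported 1.16:

```python
# Poisson rate model with log(plays) offset; exp(coefficient) is the IRR.
# Counts are hypothetical placeholders (crude IRR = 580/500 = 1.16).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "surface": ["synthetic", "natural"],
    "injuries": [580, 500],
    "plays": [500_000, 500_000],
})

model = smf.glm(
    "injuries ~ surface", data=df,
    family=sm.families.Poisson(), offset=np.log(df["plays"]),
).fit()
print(np.exp(model.params["surface[T.synthetic]"]))  # IRR vs natural turf
```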


2019 · Vol 47 (12) · pp. 2853-2862
Author(s): Gary L. Helton, Kenneth L. Cameron, Rebecca A. Zifchock, Erin Miller, Donald L. Goss, et al.

Background: Running-related overuse injuries are very common among recreational runners, with reported annual injury rates ranging from 39% to 85%. Relatively few large prospective cohort studies have investigated injury risk associated with different running shoe characteristics, and the results of existing studies are often contradictory. Purpose/Hypothesis: The purpose was to investigate the relationship between running shoe characteristics and lower extremity musculoskeletal injury. It was hypothesized that the risk of injury would be higher in individuals wearing shoes with minimal torsional stiffness and heel height than in those wearing shoes with greater torsional stiffness and heel height. Study Design: Cohort study; Level of evidence, 2. Methods: The study included 1025 incoming cadets. Shoe torsional stiffness and heel height were calculated and recorded. Demographic data were recorded and analyzed as potential covariates. Lower extremity injuries sustained over 9 weeks of cadet basic training were documented using the Armed Forces Health Longitudinal Technology Application and the Cadet Illness and Injury Tracking System. Kaplan-Meier survival curves were estimated, with time to incident lower extremity injury as the primary outcome by level of the independent predictor variables. Risk factors or potential covariates were carried forward into multivariable Cox proportional hazards regression models. Absolute and relative risk reductions and numbers needed to treat were calculated. Results: Overall, 18.1% of participants incurred a lower extremity injury. Cadets wearing shoes with moderate lateral torsional stiffness were 49% less likely to incur any type of lower extremity injury and 52% less likely to incur an overuse lower extremity injury than cadets wearing shoes with minimal lateral torsional stiffness; both findings were statistically significant. Injury risk was similar among cadets wearing shoes with minimal and extreme lateral torsional stiffness. Conclusion: Shoes with mild to moderate lateral torsional stiffness may be appropriate for reducing the risk of lower extremity injury in cadets. Shoes with minimal lateral torsional stiffness should be discouraged in this population.
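A sketch of the survival-analysis workflow described (Kaplan-Meier estimation, then multivariable Cox regression), using the lifelines package and synthetic placeholder data rather than the cadet cohort:

```python
# Kaplan-Meier estimate followed by Cox proportional hazards regression,
# via the lifelines package. Synthetic placeholder data, not the cadet cohort.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

rng = np.random.default_rng(2)
n = 1025
df = pd.DataFrame({
    "days_to_injury": rng.integers(1, 64, n),    # 9-week observation window
    "injured": rng.binomial(1, 0.181, n),        # ~18.1% incurred an injury
    "moderate_stiffness": rng.integers(0, 2, n),
    "heel_height_mm": rng.normal(25, 5, n),
})

km = KaplanMeierFitter().fit(df["days_to_injury"], event_observed=df["injured"])
print(km.survival_function_.tail())              # survival near end of window

cph = CoxPHFitter().fit(df, duration_col="days_to_injury", event_col="injured")
cph.print_summary()                              # hazard ratios per covariate
```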


2021 · Vol 21 (1)
Author(s): Jodie Makara, Sijun Shen, Ann Nwosu, William Arnold, Gary Smith, et al.

Abstract Background Extremity injury is one of the most common injury types for bicyclists. Extremity injury can lead to long-term disability and contribute to adverse health-related quality of life and prolonged absence from work. Objectives The objectives of our study were to identify crash factors associated with bicyclist upper and lower extremity injury and to characterize the type of extremity injury by bicyclist age category. Methods We linked the 2013–2017 Ohio police accident report and hospital databases. Logistic regression was used to model the odds of sustaining an upper or lower extremity injury among bicyclists involved in bicycle-vehicle crashes. Bicyclist upper and lower extremity injuries were further described by detailed injured body region (e.g., forearm and elbow or lower leg) and nature of injury (e.g., superficial or fracture). Results Compared with bicyclists aged 25–44 years, bicyclists 65 years or older had higher odds (odds ratio [OR] = 1.46, 95% confidence interval [CI]: 1.03–2.08) of sustaining an upper extremity injury, while bicyclists aged 3–14 years (OR = 1.34, 95% CI: 1.09–1.66) and 15–24 years (OR = 1.24, 95% CI: 1.03–1.49) had higher odds of sustaining a lower extremity injury. In addition, colder weather, bicyclist sex, and intersection-related crashes were associated with bicyclists' odds of sustaining an upper or lower extremity injury. Compared with individuals under 65 years old, bicyclists 65 years or older had a higher percentage of injuries to the wrist, hand and finger, or knee, and a higher percentage of fractures. Conclusions Our study identified important factors associated with bicyclists' odds of sustaining an extremity injury. Based on these findings, targeted educational efforts and interventions can be implemented to protect bicyclists from these injuries.
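A sketch of an age-category logistic model of this kind, with the 25–44-year group as the reference level as in the abstract, extracting odds ratios with 95% CIs; the data are synthetic placeholders:

```python
# Logistic model with age category (reference: 25-44 years), reporting
# odds ratios with 95% CIs. Synthetic placeholder data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "age_cat": rng.choice(["3-14", "15-24", "25-44", "45-64", "65+"], n),
    "intersection": rng.integers(0, 2, n),
    "upper_ext_injury": rng.integers(0, 2, n),
})

model = smf.logit(
    "upper_ext_injury ~ C(age_cat, Treatment(reference='25-44')) + intersection",
    data=df,
).fit(disp=0)

out = np.exp(model.conf_int())                   # CI bounds on the OR scale
out["OR"] = np.exp(model.params)
print(out)
```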


2016 · Vol 51 (11) · pp. 905-918
Author(s): Scott D. Carow, Eric M. Haniuk, Kenneth L. Cameron, Darin A. Padua, Stephen W. Marshall, et al.

Context: Specific movement patterns have been identified as possible risk factors for noncontact lower extremity injuries. The Dynamic Integrated Movement Enhancement (DIME) was developed to modify these movement patterns to decrease injury risk. Objective: To determine if the DIME is effective for preventing lower extremity injuries in US Military Academy (USMA) cadets. Design: Cluster-randomized controlled trial. Setting: Cadet Basic Training at USMA. Patients or Other Participants: Participants were 1313 cadets (1070 men, 243 women). Intervention(s): Participants were cluster-randomized to 3 groups. The active warm-up (AWU) group performed standard Army warm-up exercises. The DIME groups were assigned to a DIME cadre-supervised (DCS) group or a DIME expert-supervised (DES) group; the former consisted of cadet supervision and the latter combined cadet and health professional supervision. Groups performed exercises 3 times weekly for 6 weeks. Main Outcome Measure(s): Cumulative risk of lower extremity injury was the primary outcome. We gathered data during Cadet Basic Training and for 9 months during the subsequent academic year. Risk ratios and 95% confidence intervals (CIs) were calculated to compare groups. Results: No differences were seen between the AWU and the combined DIME (DCS and DES) groups during Cadet Basic Training or the academic year. During the academic year, lower extremity injury risk in the DES group decreased 41% (relative risk [RR] = 0.59; 95% CI = 0.38, 0.93; P = .02) compared with the DCS group; a nonsignificant 25% (RR = 0.75; 95% CI = 0.49, 1.14; P = .18) decrease occurred in the DES group compared with the AWU group. Finally, there was a nonsignificant 27% (RR = 1.27; 95% CI = 0.90, 1.78; P = .17) increase in injury risk during the academic year in the DCS group compared with the AWU group. Conclusions: We observed no differences in lower extremity injury risk between the AWU and combined DIME groups. However, the magnitude and direction of the risk ratios in the DES group compared with the AWU group, although not statistically significant, indicate that professional supervision may be a factor in the success of injury-prevention programs.
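A worked sketch of a two-group risk ratio with a Wald confidence interval on the log scale, the standard computation behind estimates like the DES-vs-DCS comparison; the group sizes below are hypothetical placeholders chosen to reproduce the reported RR = 0.59 (95% CI, 0.38-0.93):

```python
# Two-group risk ratio with a Wald CI on the log scale. Group sizes are
# hypothetical (26/220 injured vs 44/220), landing on the reported RR of 0.59.
import numpy as np

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk of group 1 relative to group 2, with a 95% Wald CI."""
    rr = (a / n1) / (b / n2)
    se = np.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo, hi = np.exp(np.log(rr) + np.array([-z, z]) * se)
    return rr, lo, hi

print("RR = %.2f (95%% CI, %.2f-%.2f)" % risk_ratio_ci(26, 220, 44, 220))
```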

