The Impact of the Helmet-Lowering Rule on Regular Season NFL Injuries

2020
Vol 8 (7_suppl6)
pp. 2325967120S0040
Author(s):  
Erik Stapleton ◽  
Randy Cohn ◽  
Colin Burgess

Objectives: The National Football League (NFL) has come under growing public scrutiny due to the apparent rise in concussions and head injuries and their subsequent deleterious effects. To address these concerns, the NFL implemented a new “helmet-lowering” rule prior to the 2018-2019 season, which makes it “a foul if a player lowers his head to initiate and make contact with his helmet against an opponent.” The purpose of this paper was to compare the incidence of injuries in NFL players before and after implementation of this new rule. Methods: NFL injury data were retrospectively reviewed from public league records for all players in regular season games played during the 2017 and 2018 NFL seasons. An injury was defined as any player listed on a team’s injury report who was not documented on the team’s report in the week preceding the index injury. Injury rates were reported as the number of injuries per 1000 athletic exposures (AEs), with athletic exposures defined as the total number of NFL regular-season games played. Relative risk (with 95% CI) was calculated from the number of injuries per 1000 AEs for the seasons before and after the rule’s implementation, and risk reduction was then calculated for overall, upper extremity, lower extremity, and head injuries. Results: A total of 2,774 injuries were identified over the 2 seasons. After rule implementation at the beginning of the 2018 season, the overall relative risk (RR) of injury was 0.91 (95% CI 0.88 to 0.95, p<0.0001), an injury risk reduction of 8.73%. Upper extremity injuries had an RR of 0.76 (95% CI 0.65 to 0.87, p=0.0005) and a risk reduction of 24.10%. Lower extremity injuries had an RR of 0.91 (95% CI 0.87 to 0.96, p=0.0005) and a risk reduction of 8.63%. Concussions and head injuries had an overall RR of 0.55 (95% CI 0.44 to 0.69, p<0.0001), an injury risk reduction of 45.10%. Wide receivers and linebackers were the most commonly injured players on offense and defense, respectively. Conclusion: Implementation of the new helmet-lowering rule appears to have played a role in significantly decreasing NFL athletes’ risk of injury across all measures, most notably for concussions and head injuries.
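As a minimal sketch of the relative-risk arithmetic described above, the snippet below computes an RR with a log-scale Wald 95% CI and the corresponding percentage risk reduction. The injury counts and exposure totals are hypothetical placeholders chosen only to land near the reported magnitudes; they are not the study's data.

```python
import math

def relative_risk(injuries_after, ae_after, injuries_before, ae_before, z=1.96):
    """Relative risk of injury after vs. before the rule change,
    with a Wald 95% CI computed on the log scale."""
    risk_after = injuries_after / ae_after      # injuries per athletic exposure
    risk_before = injuries_before / ae_before
    rr = risk_after / risk_before
    # Standard error of log(RR) for two independent samples
    se = math.sqrt(1 / injuries_after - 1 / ae_after
                   + 1 / injuries_before - 1 / ae_before)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, (lo, hi), (1 - rr) * 100         # RR, 95% CI, % risk reduction

# Hypothetical counts for illustration only (not the study's data):
rr, ci, reduction = relative_risk(injuries_after=1320, ae_after=16256,
                                  injuries_before=1454, ae_before=16256)
print(f"RR = {rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}, risk reduction {reduction:.1f}%")
```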

2021
Vol 9 (5)
pp. 232596712110034
Author(s):  
Toufic R. Jildeh ◽  
Fabien Meta ◽  
Jacob Young ◽  
Brendan Page ◽  
Kelechi R. Okoroha

Background: Impaired neuromuscular function after concussion has recently been linked to an increased risk of lower extremity injuries in athletes. Purpose: To determine whether National Football League (NFL) athletes have an increased risk of sustaining an acute, noncontact lower extremity injury in the 90-day period after return to play (RTP) and whether on-field performance differs pre- and postconcussion. Study Design: Cohort study; Level of evidence, 3. Methods: NFL concussions in offensive players from the 2012-2013 to the 2016-2017 seasons were studied. Age, position, injury location/type, RTP, and athlete factors were noted. The 90-day post-RTP period was analyzed for lower extremity injuries. Concussion and injury data were obtained from publicly available sources. Nonconcussed offensive skill position NFL athletes from the same period were used as a control cohort, with the 2014 season as the reference season. Power rating performance metrics were calculated for ±1, ±2, and ±3 seasons pre- and postconcussion. Conditional logistic regression was used to determine associations between concussion and lower extremity injury as well as the relationship of concussion to on-field performance. Results: In total, 116 concussions were recorded in 108 NFL athletes during the study period. There was no statistically significant difference in the incidence of an acute, noncontact lower extremity injury between concussed and control athletes (8.5% vs 12.8%; P = .143), corresponding to an odds ratio of 0.573 (95% CI, 0.270-1.217). Days missed (66.4 ± 81.9 vs 45.1 ± 69.2; P = .423) and games missed (3.67 ± 3.0 vs 2.9 ± 2.7; P = .470) after a lower extremity injury were similar between concussed and control athletes. No significant changes in power ratings were noted in concussed athletes in the acute period (±1 season) when comparing pre- and postconcussion performance. Conclusion: Concussed NFL offensive athletes did not demonstrate increased odds of acute, noncontact lower extremity injury in the 90-day RTP period compared with nonconcussed controls. Immediate on-field performance of skill position players did not appear to be affected by concussion.
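For orientation, here is a sketch of an unadjusted odds ratio from a 2×2 table with a Woolf-type (log-scale) 95% CI. The study itself used conditional logistic regression on matched data, which this toy calculation does not reproduce, and the counts below are hypothetical, chosen only to roughly echo the reported proportions.

```python
import math

def odds_ratio(exposed_cases, exposed_noncases,
               control_cases, control_noncases, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-scale) 95% CI."""
    or_ = (exposed_cases * control_noncases) / (exposed_noncases * control_cases)
    se = math.sqrt(1 / exposed_cases + 1 / exposed_noncases
                   + 1 / control_cases + 1 / control_noncases)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Hypothetical 2x2 counts: ~8.6% of concussed exposures injured vs
# ~12.8% of controls (illustration only, not the study's data):
or_, ci = odds_ratio(exposed_cases=10, exposed_noncases=106,
                     control_cases=60, control_noncases=410)
print(f"OR = {or_:.3f}, 95% CI {ci[0]:.3f}-{ci[1]:.3f}")
```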


Author(s):  
Richard N Puzzitiello ◽  
Coleen F Rizzo ◽  
Kirsten D Garvey ◽  
Elizabeth G Matzkin ◽  
Matthew J Salzler

Year-round intensive, single-sport training beginning at a young age is an increasingly common trend among youth athletes. Early sport specialisation may be ineffective for long-term athletic success and may contribute to an increased risk of physical injury and burn-out. The medical community has noted that repetitive movement patterns can arise from non-diversified activity, which may contribute to overuse injury in young athletes. Studies have begun to identify an association between early sport specialisation and lower extremity injuries in youth athletes that is independent of training volume. Recent literature suggests that sport diversification, not specialisation, is the better path to long-term athletic success and to minimising lower extremity injury risk.


2020
Vol 48 (9)
pp. 2287-2294
Author(s):  
Christina D. Mack ◽  
Richard W. Kent ◽  
Michael J. Coughlin ◽  
Kristin Y. Shiue ◽  
Leigh J. Weiss ◽  
...  

Background: Lower extremity injuries are the most common injuries in professional sports and impose a high burden on players and teams in the National Football League (NFL). Injury prevention strategies can be refined by a foundational understanding of the occurrence and effect of these injuries on NFL players. Purpose: To determine the incidence of specific lower extremity injuries sustained by NFL players across 4 NFL seasons. Study Design: Descriptive epidemiology study. Methods: This retrospective, observational study included all time-loss lower extremity injuries that occurred during football-related activities in the 2015 through 2018 seasons. Injury data were collected prospectively through a leaguewide electronic health record (EHR) system and linked with NFL game statistics and player participation to calculate injury incidence per season and per 10,000 player-plays, both for lower extremity injuries overall and for specific injuries. Days lost due to injury were estimated through 2018 for injuries occurring in the 2015 to 2017 seasons. Results: An average of 2006 time-loss lower extremity injuries were reported each season over this 4-year study, representing a 1-season risk of 41% for an NFL player. Incidence was stable from 2015 to 2018, with an estimated missed-time burden each NFL season of approximately 56,700 player-days lost. Most (58.7%) of these injuries occurred in games, with a higher rate observed in preseason than in regular-season games (11.5 vs 9.4 injuries per 10,000 player-plays). The knee was the most commonly injured lower extremity region (29.3% of lower body injuries), followed by the ankle (22.4%), thigh (17.2%), and foot (9.1%). Hamstring strains were the most common lower extremity injury, followed by lateral ankle sprains, adductor strains, high ankle sprains, and medial collateral ligament tears. Conclusion: Lower extremity injuries affect a high number of NFL players, and their incidence did not decrease over the 4 seasons studied. Prevention and rehabilitation protocols for these injuries should continue to be prioritized.
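The per-10,000-player-plays figures quoted above are simple exposure-normalized rates. A tiny illustration follows; the play and injury counts are invented solely to reproduce the reported magnitudes and are not the study's data.

```python
def rate_per_10k(injury_count, player_plays):
    """Injuries per 10,000 player-plays."""
    return injury_count / player_plays * 10_000

# Hypothetical counts chosen to land on the reported rates:
print(rate_per_10k(460, 400_000))      # preseason games: ~11.5
print(rate_per_10k(2256, 2_400_000))   # regular-season games: ~9.4
```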


2018
Vol 4 (1)
pp. e000311
Author(s):  
Anu M Räisänen ◽  
Kati Pasanen ◽  
Tron Krosshaug ◽  
Tommi Vasankari ◽  
Pekka Kannus ◽  
...  

Background/aim: Poor frontal plane knee control can manifest as increased dynamic knee valgus during athletic tasks. The purpose of this study was to investigate the association between frontal plane knee control and the risk of acute lower extremity injuries. In addition, we wanted to study whether the single-leg squat (SLS) test can be used as a screening tool to identify athletes with an increased injury risk. Methods: A total of 306 basketball and floorball players participated in the baseline SLS test and a 12-month injury registration follow-up. Acute lower extremity time-loss injuries were registered. Frontal plane knee projection angles (FPKPAs) during the SLS were calculated using two-dimensional video analysis. Results: Athletes displaying a high FPKPA were 2.7 times more likely to sustain a lower extremity injury (adjusted OR 2.67, 95% CI 1.23 to 5.83) and 2.4 times more likely to sustain an ankle injury (OR 2.37, 95% CI 1.13 to 4.98). There was no statistically significant association between FPKPA and knee injury (OR 1.49, 95% CI 0.56 to 3.98). Receiver operating characteristic curve analyses indicated poor combined sensitivity and specificity when FPKPA was used as a screening test for lower extremity injuries (area under the curve 0.59) and ankle injuries (area under the curve 0.58). Conclusions: Athletes displaying a large FPKPA in the SLS test had an elevated risk of acute lower extremity and ankle injuries. However, the SLS test is not sensitive and specific enough to be used as a screening tool for future injury risk.
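To illustrate the screening analysis, the sketch below scores a simulated FPKPA measurement against simulated injury outcomes using scikit-learn's roc_auc_score. All data here are synthetic, with an association deliberately kept weak to echo the modest discrimination (AUC ≈ 0.59) the abstract reports.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical cohort: 306 athletes with a baseline FPKPA (degrees)
# and a binary indicator of acute lower extremity injury at follow-up.
fpkpa = rng.normal(10, 8, size=306)

# Weak simulated association between FPKPA and injury risk:
p_injury = 1 / (1 + np.exp(-(-2.0 + 0.04 * fpkpa)))
injured = rng.binomial(1, p_injury)

# AUC near 0.5 means the continuous measure barely separates
# injured from uninjured athletes, as the study found.
print(f"AUC = {roc_auc_score(injured, fpkpa):.2f}")
```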


2018
Vol 47 (1)
pp. 189-196
Author(s):  
Christina D. Mack ◽  
Elliott B. Hershman ◽  
Robert B. Anderson ◽  
Michael J. Coughlin ◽  
Andrew S. McNitt ◽  
...  

Background: Biomechanical studies have shown that synthetic turf surfaces do not release cleats as readily as natural turf, and it has been hypothesized that the concomitant increased loading on the foot contributes to the incidence of lower body injuries. This study evaluates that hypothesis from an epidemiologic perspective, examining whether the lower extremity injury rate in National Football League (NFL) games is greater on contemporary synthetic turfs than on natural surfaces. Hypothesis: The incidence of lower body injury is higher on synthetic turf than on natural turf among elite NFL athletes playing on modern-generation surfaces. Study Design: Cohort study; Level of evidence, 3. Methods: Lower extremity injuries reported during 2012-2016 regular season games were included, with all 32 NFL teams reporting injuries under mandated, consistent data collection guidelines. Poisson models were used to construct crude and adjusted incidence rate ratios (IRRs) estimating the influence of surface type on lower body injury groupings (all lower extremity, knee, ankle/foot), both for any injury reported as causing a player to miss football participation and for injuries resulting in ≥8 days missed. A secondary analysis was performed on noncontact/surface contact injuries. Results: Play on synthetic turf resulted in a 16% higher rate of lower extremity injuries per play than play on natural turf (IRR, 1.16; 95% CI, 1.10-1.23). This association between synthetic turf and injury remained when injuries were restricted to those resulting in ≥8 days missed, as well as when categorizations were narrowed to distal injuries anatomically closer to the playing surface (knee, ankle/foot). The higher rate of injury on synthetic turf was notably stronger when injuries were restricted to noncontact/surface contact injuries (IRRs, 1.20-2.03; all statistically significant). Conclusion: These results support the hypothesized biomechanical mechanism and add confidence to the conclusion that synthetic turf surfaces have a causal impact on lower extremity injury.
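A Poisson model with a log-exposure offset, of the kind this study used, yields incidence rate ratios as exponentiated coefficients. The sketch below shows that pattern with statsmodels on invented team-game aggregates; it is not the study's model specification or data, and the "synthetic" indicator and counts are placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical team-game aggregates: injury counts, plays, surface type.
df = pd.DataFrame({
    "injuries":  [12, 9, 15, 8, 11, 7],
    "plays":     [1300, 1250, 1400, 1200, 1350, 1280],
    "synthetic": [1, 0, 1, 0, 1, 0],   # 1 = synthetic turf, 0 = natural
})

# Poisson regression with log(plays) as the exposure offset.
X = sm.add_constant(df[["synthetic"]])
model = sm.GLM(df["injuries"], X,
               family=sm.families.Poisson(),
               offset=np.log(df["plays"])).fit()

# exp(coefficient) on the surface indicator is the incidence rate ratio.
print(f"IRR (synthetic vs natural) = {np.exp(model.params['synthetic']):.2f}")
```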


2019
Vol 47 (12)
pp. 2853-2862
Author(s):  
Gary L. Helton ◽  
Kenneth L. Cameron ◽  
Rebecca A. Zifchock ◽  
Erin Miller ◽  
Donald L. Goss ◽  
...  

Background: Running-related overuse injuries are very common among recreational runners, with reported annual injury rates ranging from 39% to 85%. Relatively few large prospective cohort studies have investigated the injury risk associated with different running shoe characteristics, and the results of existing studies are often contradictory. Purpose/Hypothesis: The purpose was to investigate the relationship between running shoe characteristics and lower extremity musculoskeletal injury. It was hypothesized that the risk of injury would be increased in individuals wearing shoes with minimal torsional stiffness and heel height compared with those wearing shoes with greater torsional stiffness and heel height. Study Design: Cohort study; Level of evidence, 2. Methods: The study included 1025 incoming cadets. Shoe torsional stiffness and heel height were calculated and recorded. Demographic data were recorded and analyzed as potential covariates. Lower extremity injuries sustained over 9 weeks of cadet basic training were documented by use of the Armed Forces Health Longitudinal Technology Application and the Cadet Illness and Injury Tracking System. Kaplan-Meier survival curves were estimated, with time to incident lower extremity injury as the primary outcome, by level of the independent predictor variables. Risk factors and potential covariates were carried forward into multivariable Cox proportional hazards regression models. Absolute and relative risk reductions and numbers needed to treat were calculated. Results: Approximately 18.1% of participants incurred a lower extremity injury. Cadets wearing shoes with moderate lateral torsional stiffness were 49% less likely to incur any type of lower extremity injury and 52% less likely to incur an overuse lower extremity injury than cadets wearing shoes with minimal lateral torsional stiffness; both observations were statistically significant. Injury risk was similar among cadets wearing shoes with minimal and extreme lateral torsional stiffness. Conclusion: Shoes with mild to moderate lateral torsional stiffness may be appropriate for reducing the risk of lower extremity injury in cadets. Shoes with minimal lateral torsional stiffness should be discouraged in this population.
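A minimal sketch of the survival-analysis step, assuming the lifelines library: a Cox proportional hazards model fit to invented time-to-injury data with a hypothetical binary shoe-stiffness covariate. The exponentiated coefficient is the hazard ratio for that covariate; none of this reproduces the study's covariates or estimates.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical survival data: days until first lower extremity injury
# during 9 weeks (63 days) of basic training, censored at 63 if uninjured.
df = pd.DataFrame({
    "days_to_injury":     [12, 63, 40, 63, 25, 63, 55, 63, 18, 63],
    "injured":            [1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
    "moderate_stiffness": [0, 1, 0, 1, 0, 1, 1, 1, 0, 0],  # shoe category
})

# Fit the Cox model; remaining columns are treated as covariates.
cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_injury", event_col="injured")
cph.print_summary()  # exp(coef) is the hazard ratio for the shoe category
```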


2021
Vol 7 (1)
Author(s):  
Corrine Munoz-Plaza ◽  
Dana Pounds ◽  
Anna Davis ◽  
Stacy Park ◽  
Robert Sallis ◽  
...  

Background: While participation in sports-related activities results in improved health outcomes, high school athletes are at risk for lower extremity injuries, especially ankle, knee, and thigh injuries. Efforts are needed to promote the adoption and implementation of evidence-driven approaches to reduce injury risk among school-aged athletes. However, there is limited research on the perceived barriers, facilitators, and adherence factors that may influence the successful implementation of effective warm-up routines in this population. Methods: We conducted a qualitative study using semi-structured interviews and focus groups to assess high school basketball coaches’ and players’ current practices, knowledge, and perspectives regarding warm-ups and lower extremity injuries (LEIs). We interviewed coaches (n = 12) and players (n = 30) from May to October 2019. Participants were recruited from public high schools in a joint school district in Southern California. Multiple coders employed thematic analysis of the data using validated methods. Results: Coaches and players reported regular (e.g., daily) engagement in warm-up routines, but the time dedicated (range 5–45 min), types of exercises, and order varied substantially. Players often co-led the warm-up with the coach or assistant coach. Despite regular engagement in warm-ups, players and coaches reported multiple challenges, including (1) limited time and space to warm up effectively at games, (2) a perception that young players are not prone to injury, (3) competing demands on coaches’ time during practice, and (4) coaches’ lack of knowledge. Coaches and players perceive that warming up before practice results in fewer injuries, and many players are motivated to warm up by their personal injury experience; however, they want guidance on the ideal exercises for preventing injury and training on the proper form for each exercise. Conclusions: Regular involvement in basketball warm-up routines is common among high school teams, but the methods and time dedicated to these practices varied. Players and coaches are eager for more information on warm-up programs shown to reduce LEIs.


2016
Vol 51 (11)
pp. 905-918
Author(s):  
Scott D. Carow ◽  
Eric M. Haniuk ◽  
Kenneth L. Cameron ◽  
Darin A. Padua ◽  
Stephen W. Marshall ◽  
...  

Context: Specific movement patterns have been identified as possible risk factors for noncontact lower extremity injuries. The Dynamic Integrated Movement Enhancement (DIME) was developed to modify these movement patterns to decrease injury risk. Objective: To determine if the DIME is effective for preventing lower extremity injuries in US Military Academy (USMA) cadets. Design: Cluster-randomized controlled trial. Setting: Cadet Basic Training at USMA. Patients or Other Participants: Participants were 1313 cadets (1070 men, 243 women). Intervention(s): Participants were cluster randomized to 3 groups. The active warm-up (AWU) group performed standard Army warm-up exercises. The DIME groups were assigned to a DIME cadre-supervised (DCS) group or a DIME expert-supervised (DES) group; the former consisted of cadet supervision and the latter combined cadet and health professional supervision. Groups performed exercises 3 times weekly for 6 weeks. Main Outcome Measure(s): Cumulative risk of lower extremity injury was the primary outcome. We gathered data during Cadet Basic Training and for 9 months during the subsequent academic year. Risk ratios and 95% confidence intervals (CIs) were calculated to compare groups. Results: No differences were seen between the AWU and the combined DIME (DCS and DES) groups during Cadet Basic Training or the academic year. During the academic year, lower extremity injury risk in the DES group decreased 41% (relative risk [RR] = 0.59; 95% CI = 0.38, 0.93; P = .02) compared with the DCS group; a nonsignificant 25% (RR = 0.75; 95% CI = 0.49, 1.14; P = .18) decrease occurred in the DES group compared with the AWU group. Finally, there was a nonsignificant 27% (RR = 1.27; 95% CI = 0.90, 1.78; P = .17) increase in injury risk during the academic year in the DCS group compared with the AWU group. Conclusions: We observed no differences in lower extremity injury risk between the AWU and combined DIME groups. However, the magnitude and direction of the risk ratios in the DES group compared with the AWU group, although not statistically significant, indicate that professional supervision may be a factor in the success of injury-prevention programs.

