Epidemiological Patterns of Initial and Subsequent Injuries in Collegiate Football Athletes

2017
Vol 45 (5)
pp. 1171-1178
Author(s):
Jacob Z. Williams
Bhavna Singichetti
Hongmei Li
Henry Xiang
Kevin E. Klingele
...

Background: A body of epidemiological studies has examined football injuries and associated risk factors among collegiate athletes. However, few existing studies specifically analyzed injury risk in terms of initial or subsequent injuries. Purpose: To determine athlete-exposures (AEs) and rates of initial and subsequent injury among collegiate football athletes. Study Design: Descriptive epidemiological study. Methods: Injury and exposure data collected from collegiate football players from two Division I universities (2007-2011) were analyzed. Rate of initial injury was calculated as the number of initial injuries divided by the total number of AEs for initial injuries, while the rate for subsequent injury was calculated as the number of subsequent injuries divided by the total number of AEs for subsequent injury. Poisson regression was used to determine injury rate ratio (subsequent vs initial injury), with adjustment for other covariates. Results: The total AEs during the study period were 67,564, resulting in an overall injury rate of 35.2 per 10,000 AEs. Rates for initial and subsequent injuries were 31.7 and 45.3 per 10,000 AEs, respectively, with a rate ratio (RR) of 1.4 for rate of subsequent injury vs rate of initial injury (95% CI, 1.1-1.9). Rate of injury appeared to increase with each successive injury. RR during games was 1.8 (95% CI, 1.1-3.0). The rate of subsequent injuries to the head, neck, and face was 10.9 per 10,000 AEs, nearly double the rate of initial injuries to the same sites (RR = 2.0; 95% CI, 1.1-3.5). For wide receivers, the rate of subsequent injuries was 2.2 times the rate of initial injuries (95% CI, 1.3-3.8), and for defensive linemen, the rate of subsequent injuries was 2.1 times the rate of initial injuries (95% CI, 1.1-3.9). Conclusion: The method used in this study allows for a more accurate determination of injury risk among football players who have already been injured at least once. 
Further research is warranted to better identify which specific factors contribute to this increased risk for subsequent injury.
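The rate arithmetic described in the methods (injuries divided by athlete-exposures, scaled per 10,000 AEs) can be sketched in a few lines. The injury counts below are hypothetical, chosen only so the resulting rates resemble the magnitudes reported; the abstract does not give raw counts:

```python
def injury_rate(injuries: int, exposures: int, per: int = 10_000) -> float:
    """Injuries per `per` athlete-exposures (AEs)."""
    return injuries / exposures * per

def rate_ratio(rate_num: float, rate_den: float) -> float:
    """Ratio of two injury rates, e.g. subsequent vs initial."""
    return rate_num / rate_den

# Hypothetical counts (not from the study): subsequent and initial
# injuries with their respective AE denominators.
subsequent_rate = injury_rate(68, 15_000)   # ~45.3 per 10,000 AEs
initial_rate = injury_rate(167, 52_564)     # ~31.8 per 10,000 AEs
print(round(rate_ratio(subsequent_rate, initial_rate), 1))  # 1.4
```

The same two helpers cover every per-10,000-AE rate and rate ratio quoted across these abstracts; only the counts and denominators change.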

2021
pp. bjsports-2020-103555
Author(s):
Francesco Della Villa
Martin Hägglund
Stefano Della Villa
Jan Ekstrand
Markus Waldén

Background: Studies on subsequent anterior cruciate ligament (ACL) ruptures and career length in male professional football players after ACL reconstruction (ACLR) are scarce. Aim: To investigate the second ACL injury rate, potential predictors of second ACL injury and the career length after ACLR. Study Design: Prospective cohort study. Setting: Men’s professional football. Methods: 118 players with index ACL injury were tracked longitudinally for subsequent ACL injury and career length over 16.9 years. Multivariable Cox regression analysis with HR was carried out to study potential predictors for subsequent ACL injury. Results: Median follow-up was 4.3 (IQR 4.6) years after ACLR. The second ACL injury rate after return to training (RTT) was 17.8% (n=21), with 9.3% (n=11) to the ipsilateral and 8.5% (n=10) to the contralateral knee. Significant predictors for second ACL injury were a non-contact index ACL injury (HR 7.16, 95% CI 1.63 to 31.22) and an isolated index ACL injury (HR 2.73, 95% CI 1.06 to 7.07). In total, 11 of 26 players (42%) with a non-contact isolated index ACL injury suffered a second ACL injury. RTT time was not an independent predictor of second ACL injury, even though there was a tendency for a risk reduction with longer time to RTT. Median career length after ACLR was 4.1 (IQR 4.0) years and 60% of players were still playing at preinjury level 5 years after ACLR. Conclusions: Almost one out of five top-level professional male football players sustained a second ACL injury following ACLR and return to football, with a considerably increased risk for players with a non-contact or isolated index injury.


2017
Vol 12 (3)
pp. 393-401
Author(s):
Shane Malone
Mark Roe
Dominic A. Doran
Tim J. Gabbett
Kieran D. Collins

Purpose: To examine the association between combined session rating of perceived exertion (RPE) workload measures and injury risk in elite Gaelic footballers. Methods: Thirty-seven elite Gaelic footballers (mean ± SD age 24.2 ± 2.9 y) from 1 elite squad were involved in a single-season study. Weekly workload (session RPE multiplied by duration) and all time-loss injuries (including subsequent-wk injuries) were recorded during the period. Rolling weekly sums and wk-to-wk changes in workload were measured, enabling the calculation of the acute:chronic workload ratio by dividing acute workload (ie, 1-weekly workload) by chronic workload (ie, rolling-average 4-weekly workload). Workload measures were then modeled against data for all injuries sustained using a logistic-regression model. Odds ratios (ORs) were reported against a reference group. Results: High 1-weekly workloads (≥2770 arbitrary units [AU], OR = 1.63–6.75) were associated with significantly higher risk of injury than in a low-training-load reference group (<1250 AU). When exposed to spikes in workload (acute:chronic workload ratio >1.5), players with 1 y of experience had a higher risk of injury (OR = 2.22), while players with 2–3 y (OR = 0.20) and 4–6 y (OR = 0.24) of experience had a lower risk of injury. Players with poorer aerobic fitness (estimated from a 1-km time trial) had a higher injury risk than those with higher aerobic fitness (OR = 1.50–2.50). An acute:chronic workload ratio of ≥2.0 demonstrated the greatest risk of injury. Conclusions: These findings highlight an increased risk of injury for elite Gaelic football players with high (>2.0) acute:chronic workload ratios and high weekly workloads. A high aerobic capacity and playing experience appear to offer protection against rapid changes in workload and high acute:chronic workload ratios. Moderate workloads, coupled with moderate to high changes in the acute:chronic workload ratio, appear to be protective for Gaelic football players.
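The acute:chronic workload ratio defined in the methods (the 1-weekly acute load divided by the rolling 4-weekly average) reduces to a short function. The weekly loads below are hypothetical session-RPE totals in arbitrary units, chosen for illustration:

```python
def acwr(weekly_loads: list[float]) -> float:
    """Acute:chronic workload ratio for the most recent week.

    acute   = last week's load (session RPE x duration, in AU)
    chronic = rolling average of the most recent 4 weeks
    """
    if len(weekly_loads) < 4:
        raise ValueError("need at least 4 weeks of load data")
    acute = weekly_loads[-1]
    chronic = sum(weekly_loads[-4:]) / 4
    return acute / chronic

# Hypothetical weekly session-RPE loads in arbitrary units (AU):
loads = [1400, 1500, 1600, 2770]   # a spike in the most recent week
print(round(acwr(loads), 2))       # 1.52, i.e. above the >1.5 spike threshold
```

Note that the acute week is itself included in the 4-week chronic average here (a "coupled" ratio); some practitioners compute an uncoupled variant that excludes it.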


2011
Vol 46 (6)
pp. 648-654
Author(s):
Ellen Shanley
Mitchell J. Rauh
Lori A. Michener
Todd S. Ellenbecker

Context: Participation in high school sports has grown 16.1% over the last decade, but few studies have compared the overall injury risks in girls' softball and boys' baseball. Objective: To examine the incidence of injury in high school softball and baseball players. Design: Cohort study. Setting: Greenville, South Carolina, high schools. Patients or Other Participants: Softball and baseball players (n = 247) from 11 high schools. Main Outcome Measure(s): Injury rates, locations, types; initial or subsequent injury; practice or game setting; positions played; seasonal trends. Results: The overall injury rate was 4.5/1000 athlete-exposures (AEs), with more injuries overall in softball players (5.6/1000 AEs) than in baseball players (4.0/1000 AEs). Baseball players had a higher initial injury rate (75.9/1000 AEs) than softball players (66.4/1000 AEs): rate ratio (RR) = 0.88, 95% confidence interval (CI) = 0.4, 1.7. The initial injury rate was higher than the subsequent injury rate for the overall sample (P < .0001) and for softball (P < .0001) and baseball (P < .001) players. For both sports, the injury rate during games (4.6/1000 AEs) was similar to that during practices (4.1/1000 AEs), RR = 1.22, 95% CI = 0.7, 2.2. Softball players were more likely to be injured in a game than were baseball players (RR = 1.92, 95% CI = 0.8, 4.3). Most injuries (77%) were mild (3.5/1000 AEs). The upper extremity accounted for the highest proportion of injuries (63.3%). The incidence of injury for pitchers was 37.3% and for position players was 15.3%. The rate of injury was highest during the first month of the season (7.96/1000 AEs). Conclusions: The incidence of injury was low for both softball and baseball. Most injuries were minor and affected the upper extremity. Injury rates were highest in the first month of the season, so prevention strategies should focus on minimizing injuries and monitoring players early in the season.


2012
Vol 47 (3)
pp. 264-272
Author(s):
Gary B. Wilkerson
Jessica L. Giles
Dustin K. Seibel

Context: Poor core stability is believed to increase vulnerability to uncontrolled joint displacements throughout the kinetic chain between the foot and the lumbar spine. Objective: To assess the value of preparticipation measurements as predictors of core or lower extremity strains or sprains in collegiate football players. Design: Cohort study. Setting: National Collegiate Athletic Association Division I Football Championship Subdivision football program. Patients or Other Participants: All team members who were present for a mandatory physical examination on the day before preseason practice sessions began (n = 83). Main Outcome Measure(s): Preparticipation administration of surveys to assess low back, knee, and ankle function; documentation of knee and ankle injury history; determination of body mass index; 4 different assessments of core muscle endurance; and measurement of step-test recovery heart rate. All injuries were documented throughout the preseason practice period and 11-game season. Receiver operating characteristic analysis and logistic regression analysis were used to identify dichotomized predictive factors that best discriminated injured from uninjured status. The 75th and 50th percentiles were evaluated as alternative cutpoints for dichotomization of injury predictors. Results: Players with ≥2 of 3 potentially modifiable risk factors related to core function had 2 times greater risk for injury than those with <2 factors (95% confidence interval = 1.27, 4.22), and adding a high level of exposure to game conditions increased the injury risk to 3 times greater (95% confidence interval = 1.95, 4.98). Prediction models that used the 75th and 50th percentile cutpoints yielded results that were very similar to those for the model that used receiver operating characteristic-derived cutpoints.
Conclusions: Low back dysfunction and suboptimal endurance of the core musculature appear to be important modifiable football injury risk factors that can be identified on preparticipation screening. These predictors need to be assessed in a prospective manner with a larger sample of collegiate football players.


2019
Vol 11 (4)
pp. 332-342
Author(s):
Benjamin L. Brett
Daniel L. Huber
Alexa Wild
Lindsay D. Nelson
Michael A. McCrea

Background: Although some studies have observed a relationship between age of first exposure (AFE) to American football and long-term outcomes, recent findings in collegiate athletes did not observe a relationship between AFE and more intermediate outcomes at early adulthood. This, however, requires independent replication. Hypothesis: There will be no association between AFE to football and behavioral, cognitive, emotional/psychological, and physical functioning in high school and collegiate athletes. Study Design: Cross-sectional study. Level of Evidence: Level 3. Methods: Active high school and collegiate football players (N = 1802) underwent a comprehensive preseason evaluation on several clinical outcome measures. Demographic and health variables that significantly differed across AFE groups were identified as potential covariates. General linear models (GLMs) with AFE as the independent variable were performed for each clinical outcome variable. Similar GLMs that included identified covariates, with AFE as the predictor, were subsequently performed for each clinical outcome variable. Results: After controlling for covariates of age, concussion history, race, and a diagnosis of ADHD, earlier AFE (<12 vs ≥12 years) did not significantly predict poorer performance on any clinical outcome measures (all P > 0.05). A single statistically significant association between AFE group and somatization score was recorded, with AFE <12 years exhibiting lower levels of somatization. Conclusion: In a large cohort of active high school and collegiate football student-athletes, AFE before the age of 12 years was not associated with worse behavioral, cognitive, psychological, and physical (oculomotor functioning and postural stability) outcomes. 
Clinical Relevance: The current findings suggest that timing of onset of football exposure does not result in poorer functioning in adolescents and young adults and may contribute to resilience through decreased levels of physically related psychological distress.


2019
Vol 54 (7)
pp. 780-786
Author(s):
Katherine M. Lee
Melissa C. Kay
Kristen L. Kucera
William E. Prentice
Zachary Y. Kerr

Context: Cervical muscle strains are an often-overlooked injury, with neck- and spine-related research typically focusing on spinal cord and vertebral injuries. Objective: To examine the rates and distributions of cervical muscle strains in collegiate and high school football athletes. Design: Descriptive epidemiology study. Setting: Collegiate and high school football teams. Patients or Other Participants: The National Collegiate Athletic Association Injury Surveillance Program (NCAA-ISP) collected data from collegiate football athletes. The High School National Athletic Treatment, Injury and Outcomes Network (HS NATION) and High School Reporting Information Online (HS RIO) collected data from high school football athletes. Data from the 2011–2012 through 2013–2014 academic years were used. Main Outcome Measure(s): Athletic trainers collected injury and exposure data for football players. Injury counts, injury rates per 10 000 athlete-exposures (AEs), and injury rate ratios with 95% confidence intervals (CIs) were calculated. Results: The NCAA-ISP reported 49 cervical muscle strains (rate = 0.96/10 000 AEs), of which 28 (57.1%) were time-loss (TL) injuries (rate = 0.55/10 000 AEs). HS NATION reported 184 cervical muscle strains (rate = 1.66/10 000 AEs), of which 33 (17.9%) were TL injuries (rate = 0.30/10 000 AEs). HS RIO, which collects only TL injuries, reported 120 TL cervical muscle strains (rate = 0.51/10 000 AEs). The overall injury rate was lower in the NCAA-ISP than in HS NATION (injury rate ratio = 0.58; 95% CI = 0.42, 0.79); when restricted to TL injuries, the overall injury rate was higher in the NCAA-ISP (injury rate ratio = 1.83; 95% CI = 1.11, 3.03). No differences were found when comparing TL injuries in HS RIO and the NCAA-ISP. Cervical muscle-strain rates were higher during competitions than during practices across all 3 surveillance systems for all injuries. Most cervical muscle strains were due to player contact (NCAA-ISP = 85.7%, HS NATION = 78.8%, HS RIO = 85.8%). Conclusions: The incidence of cervical muscle strains in football players was low compared with other injuries. Nonetheless, identifying and implementing interventions, particularly those aimed at reducing unsafe player contact, are essential to further decrease the risk of injury and associated adverse outcomes.


Neurology
2018
Vol 91 (23 Supplement 1)
pp. S26.1-S26
Author(s):
Matthew Hoch
Nicole Curry
Emily Hartley-Gabriel
Nicholas Heebner
Johanna Hoch

Athletes with a history of concussion (HC) are at an increased risk of sustaining lower extremity injuries. It is unclear if these individuals exhibit dynamic postural control deficits associated with lower extremity injury risk. The purpose of this study was to determine if collegiate athletes with a HC demonstrate differences in Y-Balance Test (YBT) performance compared to athletes with no history of concussion (NHC). A total of 116 varsity and club athletes from a Division-I university participated. Forty participants reported a HC (female/male: 31/9, age: 20.0 ± 1.4 years, height: 169.3 ± 13.1 cm, mass: 68.4 ± 14.0 kg) while 76 reported NHC (female/male: 60/16, age: 20.0 ± 1.7 years, height: 168.5 ± 12.9 cm, mass: 68.7 ± 14.6 kg). Individuals with a current concussion or lower extremity injury, or a history of lower extremity surgery were excluded. Participants completed the YBT anterior reach direction barefoot on both limbs. The YBT was completed by maximally reaching anteriorly, maintaining balance, and returning to the starting position without errors. Participants completed 4 practice trials and 3 test trials. Reach distances were averaged and normalized to leg length. Between-limb asymmetry was calculated as the absolute difference between the left and right limbs. Separate independent t-tests examined group differences in normalized reach distances and asymmetry. The proportion of participants in each group with >4 cm of asymmetry was compared using a χ2 test. Alpha was set at 0.05 for all analyses. No group differences were identified in normalized reach distances for the left (HC: 61.4% ± 9.2%, NHC: 60.8% ± 6.2%, p = 0.88, ES = 0.08) or right (HC: 61.4% ± 6.2%, NHC: 60.2% ± 6.8%, p = 0.51, ES = 0.17) limbs. However, a greater proportion of HC participants demonstrated >4 cm asymmetry (HC: 40.0%, NHC: 19.7%; p = 0.02) and these participants exhibited greater asymmetry (HC: 3.87 ± 3.69 cm, NHC: 2.40 ± 2.13 cm, p = 0.03; ES = 0.53). 
Athletes with a HC exhibited greater asymmetry compared to athletes with NHC. Anterior reach asymmetries of >4 cm are associated with greater lower extremity injury risk. The YBT may provide a clinical technique to further explore the relationship between concussion and lower extremity injury.
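The two YBT quantities used above, reach normalized to leg length and between-limb asymmetry, reduce to small formulas. The reach and leg-length values below are hypothetical, chosen for illustration:

```python
def normalized_reach(reach_cm: float, leg_length_cm: float) -> float:
    """Anterior reach distance expressed as a percentage of leg length."""
    return reach_cm / leg_length_cm * 100

def between_limb_asymmetry(left_cm: float, right_cm: float) -> float:
    """Absolute difference between left and right reach distances (cm)."""
    return abs(left_cm - right_cm)

# Hypothetical averaged trial values for one athlete:
print(round(normalized_reach(55.3, 90.0), 1))     # 61.4 (% of leg length)
print(between_limb_asymmetry(55.3, 60.0) > 4.0)   # True: flagged, >4 cm asymmetry
```

The >4 cm asymmetry cutoff applied here is the one the study itself used for its chi-square comparison.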


2019
Vol 47 (9)
pp. 2225-2231
Author(s):
Jordan J. Stares
Brian Dawson
Peter Peeling
Jarryd Heasman
Brent Rogalski
...

Background: The risk of sustaining a subsequent injury is elevated in the weeks after return to play (RTP) from an index injury. However, little is known about the magnitude, duration, and nature by which subsequent injury risk is increased. Purpose: To quantify and describe the risk of injury in a 12-week period after RTP from an index injury in Australian football players. Study Design: Cohort study; Level of evidence, 2. Methods: Injury data were collected from 79 players over 5 years at 1 Australian Football League club. Injuries were classified with the Orchard Sports Injury Classification System and by side of the body. Furthermore, injury severity was classified as time loss (resulting in ≥1 matches being missed) or non–time loss (no matches missed). Subsequent injury was categorized with the SIC-2.0 model and applied to the data set via an automated script. The probability of a time loss subsequent injury was calculated for in-season index injuries for each week of a 12-week period after RTP via a mixed effect logistic regression model. Results: Subsequent injury risk was found to be highest in the week of RTP for both time loss injuries (9.4%) and non–time loss injuries (6.9%). Risk decreased with each week survived after RTP; however, it did not return to baseline risk of participation (3.6%). Conclusion: These findings demonstrate that athletes returning to play are at an increased risk of injury for a number of weeks, thus indicating the requirement for tertiary prevention strategies to ensure that they survive this period.


2018
Vol 2018
pp. 1-11
Author(s):
Han Byul Cho
Charles Elliott Bueler
Jennifer DiMuzio
Charlie Hicks-Little
Erin McGlade
...

A number of studies have suggested that sports-related concussion (SRC) may place individuals at increased risk for depression and negative outcomes including suicide. However, the mechanisms underlying a potential relationship between brain integrity and mood remain unclear. The current study examined the association between amygdala shape, mood state, and postconcussion symptoms in collegiate football players. Thirty members of 1 football team completed the Profile of Mood States (POMS), the Postconcussion Symptom Scale (PCSS), and an MRI protocol during preseason camp. T1-weighted images were acquired, and three-dimensional amygdala and probabilistic maps were created for shape analysis. Correlation analyses between POMS and PCSS and the relationship between POMS and amygdala shape were completed. In the amygdala, the left laterobasal subregion showed a positive relationship with the POMS total score and subscale scores. No significant relationship between PCSS and amygdala shape was found. Significant positive correlations were found between POMS subscales and PCSS. These results indicate that amygdala structure may be more closely associated with negative mood states than postconcussion symptoms. These findings suggest that premorbid individual differences in affect may provide critical insight into the relationship between negative mood and outcomes in collegiate football players with SRC.


2019
Vol 54 (12)
pp. 731-738
Author(s):
Laura Bowen
Aleksander Stephan Gross
Mo Gimpel
Stewart Bruce-Low
Francois-Xavier Li

Objectives: We examined the relation between global positioning system (GPS)-derived workloads and injury in English Premier League football players (n=33) over three seasons. Methods: Workload and injury data were collected over three consecutive seasons. Cumulative (1-weekly, 2-weekly, 3-weekly and 4-weekly) loads, in addition to acute:chronic workload ratios (ACWR; acute workload (1-week workload) divided by chronic workload (previous 4-week average acute workload)), were classified into discrete ranges by z-scores. Relative risk (RR) for each range was then calculated between injured and non-injured players using specific GPS variables: total distance, low-intensity distance, high-speed running distance, sprint distance, accelerations and decelerations. Results: The greatest non-contact injury risk was when the chronic exposure to decelerations was low (<1731) and the ACWR was >2.0 (RR=6.7). Non-contact injury risk was also 5–6 times higher for accelerations and low-intensity distance when the chronic workloads were categorised as low and the ACWR was >2.0 (RR=5.4–6.6), compared with ACWRs below this. When all chronic workloads were included, an ACWR >2.0 was associated with a significant but lesser injury risk for the same metrics, plus total distance (RR=3.7–3.9). Conclusions: We recommend that practitioners involved in planning training for performance and injury prevention monitor the ACWR, increase chronic exposure to load and avoid spikes that approach or exceed 2.0.
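Classifying a workload value into discrete ranges by z-score, as the methods describe, can be sketched as follows. The ±1 SD band boundaries and the example loads are assumptions for illustration; the paper's actual bin boundaries are not reproduced here:

```python
import statistics

def zscore_band(value: float, history: list[float]) -> str:
    """Label a workload relative to a player's historical distribution."""
    mean = statistics.mean(history)
    sd = statistics.pstdev(history)   # population SD of the history
    z = (value - mean) / sd
    if z < -1.0:
        return "low"
    if z <= 1.0:
        return "moderate"
    return "high"

# Hypothetical weekly total-distance loads (m) for one player:
history = [24_000, 25_000, 26_000, 25_000]
print(zscore_band(32_000, history))  # "high": well above +1 SD
```

Banding against each player's own history, rather than fixed absolute thresholds, is what lets "low chronic exposure" mean different raw workloads for different players.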

