The Association Between Neurocognitive Reaction Time and Injury Risk in Adolescent Football Players

2015 ◽  
Vol 47 ◽  
pp. 11
Author(s):  
Vincent C. Nittoli ◽  
Tracy A. Dierks ◽  
Michael D. Justiss ◽  
Gary B. Wilkerson
2017 ◽  
Vol 26 (1) ◽  
pp. 26-34 ◽  
Author(s):  
Gary B. Wilkerson ◽  
Kevin A. Simpson ◽  
Ryan A. Clark

Context: Neurocognitive reaction time has been associated with musculoskeletal injury risk, but visuomotor reaction time (VMRT) derived from tests that present greater challenges to visual stimulus detection and motor response execution may have a stronger association. Objective: To assess VMRT as a predictor of injury and the extent to which improvement may result from VMRT training. Design: Cohort study. Setting: University athletic performance center. Participants: 76 National Collegiate Athletic Association Division-I FCS football players (19.5 ± 1.4 y, 1.85 ± 0.06 m, 102.98 ± 19.06 kg). Interventions: Preparticipation and postseason assessments. A subset of players with the slowest VMRT relative to the cohort’s postseason median value participated in a 6-wk training program. Main Outcome Measures: Injury occurrence was related to preparticipation VMRT, represented both as the number of target hits in 60 s and as the average elapsed time between hits (ms). Receiver operating characteristic analysis identified the optimum cut point for a binary injury risk classification. A nonparametric repeated-measures analysis of ranks procedure was used to compare posttraining VMRT values for slow players who completed at least half of the training sessions (n = 15) with those for untrained fast players (n = 27). Results: A preparticipation cut point of ≤85 hits (≥705 ms) discriminated injured from noninjured players with an odds ratio of 2.30 (90% confidence interval, 1.05–5.06). Slow players who completed the training exhibited significant improvement in visuomotor performance compared with baseline (standardized response mean = 2.53), whereas untrained players exhibited a small performance decrement (group × trial interaction effect, L2 = 28.74; P < .001). Conclusions: Slow VMRT appears to be an important and modifiable injury risk factor for college football players. More research is needed to refine visuomotor reaction-time screening and training methods and to determine the extent to which improved performance values can reduce injury incidence.
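The abstract does not describe analysis code, but the two statistical steps it names (ROC-based cut-point selection and an odds ratio with a 90% confidence interval) can be illustrated with a short Python sketch. Everything below, including the hit counts, injury labels, and the Youden-index rule for choosing the cut point, is hypothetical and only meant to show the general shape of such an analysis, not the authors' actual method.

```python
# Illustrative sketch only: hypothetical data standing in for the 76 players'
# preparticipation hit counts and injury outcomes described in the abstract.
import numpy as np

rng = np.random.default_rng(0)
hits = rng.integers(70, 110, size=76)          # hypothetical 60-s target-hit counts
injured = rng.random(76) < 1 / (1 + np.exp(0.15 * (hits - 85)))  # hypothetical labels

def youden_cut_point(scores, outcome):
    """Pick the cut point maximizing sensitivity + specificity - 1,
    treating LOW scores (few hits) as the 'positive' (at-risk) test result."""
    best_j, best_cut = -1.0, None
    for cut in np.unique(scores):
        test_pos = scores <= cut
        sens = np.mean(test_pos[outcome])        # injured players correctly flagged
        spec = np.mean(~test_pos[~outcome])      # uninjured players correctly cleared
        if sens + spec - 1 > best_j:
            best_j, best_cut = sens + spec - 1, cut
    return best_cut

def odds_ratio_ci(exposed, outcome, z=1.645):    # z = 1.645 -> 90% CI, as in the abstract
    a = np.sum(exposed & outcome)                # at-risk, injured
    b = np.sum(exposed & ~outcome)               # at-risk, uninjured
    c = np.sum(~exposed & outcome)               # not at-risk, injured
    d = np.sum(~exposed & ~outcome)              # not at-risk, uninjured
    or_ = (a * d) / (b * c)
    se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of the log odds ratio
    return or_, np.exp(np.log(or_) - z * se), np.exp(np.log(or_) + z * se)

cut = youden_cut_point(hits, injured)
print(cut, odds_ratio_ci(hits <= cut, injured))
```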


2017 ◽  
Vol 12 (3) ◽  
pp. 393-401 ◽  
Author(s):  
Shane Malone ◽  
Mark Roe ◽  
Dominic A. Doran ◽  
Tim J. Gabbett ◽  
Kieran D. Collins

Purpose: To examine the association between combined session rating of perceived exertion (RPE) workload measures and injury risk in elite Gaelic footballers. Methods: Thirty-seven elite Gaelic footballers (mean ± SD age 24.2 ± 2.9 y) from 1 elite squad were involved in a single-season study. Weekly workload (session RPE multiplied by duration) and all time-loss injuries (including subsequent-wk injuries) were recorded during the period. Rolling weekly sums and wk-to-wk changes in workload were measured, enabling calculation of the acute:chronic workload ratio by dividing acute workload (ie, 1-weekly workload) by chronic workload (ie, rolling-average 4-weekly workload). Workload measures were then modeled against data for all injuries sustained using a logistic-regression model. Odds ratios (ORs) were reported against a reference group. Results: High 1-weekly workloads (≥2770 arbitrary units [AU], OR = 1.63–6.75) were associated with significantly higher risk of injury than a low-training-load reference group (<1250 AU). When exposed to spikes in workload (acute:chronic workload ratio >1.5), players with 1 y of experience had a higher risk of injury (OR = 2.22), whereas players with 2–3 y (OR = 0.20) and 4–6 y (OR = 0.24) of experience had a lower risk of injury. Players with poorer aerobic fitness (estimated from a 1-km time trial) had a higher injury risk than those with higher aerobic fitness (OR = 1.50–2.50). An acute:chronic workload ratio of ≥2.0 was associated with the greatest risk of injury. Conclusions: These findings highlight an increased risk of injury for elite Gaelic football players with high (>2.0) acute:chronic workload ratios and high weekly workloads. High aerobic capacity and playing experience appear to offer injury protection against rapid changes in workload and high acute:chronic workload ratios. Moderate workloads, coupled with moderate to high changes in the acute:chronic workload ratio, appear to be protective for Gaelic football players.
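A minimal sketch of the workload bookkeeping described in the Methods (session RPE × duration, 1-week acute load, 4-week rolling-average chronic load, and their ratio), assuming a simple per-session training log; the data, column names, and pandas-based approach are illustrative, not the authors' actual pipeline.

```python
# Minimal sketch of the session-RPE workload and acute:chronic workload ratio
# (ACWR) calculation described above; the data below are purely illustrative.
import pandas as pd

# Hypothetical training log: one row per session.
log = pd.DataFrame({
    "week":    [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
    "rpe":     [6, 7, 5, 8, 6, 7, 9, 6, 7, 8],      # session rating of perceived exertion
    "minutes": [60, 75, 60, 90, 70, 60, 90, 75, 80, 60],
})
log["load_au"] = log["rpe"] * log["minutes"]          # session RPE x duration (AU)

weekly = log.groupby("week")["load_au"].sum()          # acute load = 1-weekly workload
chronic = weekly.rolling(4, min_periods=4).mean()      # chronic load = 4-wk rolling average
acwr = weekly / chronic                                # acute:chronic workload ratio

print(pd.DataFrame({"acute": weekly, "chronic": chronic, "acwr": acwr}))
# In the study, spikes (ACWR > 1.5) and very high ratios (>= 2.0) were the
# exposure categories modeled against injury with logistic regression.
```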


2012 ◽  
Vol 47 (3) ◽  
pp. 264-272 ◽  
Author(s):  
Gary B. Wilkerson ◽  
Jessica L. Giles ◽  
Dustin K. Seibel

Context: Poor core stability is believed to increase vulnerability to uncontrolled joint displacements throughout the kinetic chain between the foot and the lumbar spine. Objective: To assess the value of preparticipation measurements as predictors of core or lower extremity strains or sprains in collegiate football players. Design: Cohort study. Setting: National Collegiate Athletic Association Division I Football Championship Subdivision football program. Patients or Other Participants: All team members who were present for a mandatory physical examination on the day before preseason practice sessions began (n = 83). Main Outcome Measure(s): Preparticipation administration of surveys to assess low back, knee, and ankle function; documentation of knee and ankle injury history; determination of body mass index; 4 different assessments of core muscle endurance; and measurement of step-test recovery heart rate. All injuries were documented throughout the preseason practice period and 11-game season. Receiver operating characteristic analysis and logistic regression analysis were used to identify dichotomized predictive factors that best discriminated injured from uninjured status. The 75th and 50th percentiles were evaluated as alternative cutpoints for dichotomization of injury predictors. Results: Players with ≥2 of 3 potentially modifiable risk factors related to core function had 2 times greater risk for injury than those with <2 factors (95% confidence interval = 1.27, 4.22), and adding a high level of exposure to game conditions increased the injury risk to 3 times greater (95% confidence interval = 1.95, 4.98). Prediction models that used the 75th and 50th percentile cutpoints yielded results that were very similar to those for the model that used receiver operating characteristic-derived cutpoints. Conclusions: Low back dysfunction and suboptimal endurance of the core musculature appear to be important modifiable football injury risk factors that can be identified on preparticipation screening. These predictors need to be assessed in a prospective manner with a larger sample of collegiate football players.
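As a rough illustration of the percentile-based dichotomization and risk estimation described above, the following Python sketch flags hypothetical predictors at a percentile cutpoint, counts risk factors per player, and computes a risk ratio with a 95% confidence interval. All variables, values, and cutpoint directions are invented for illustration and are not the study's actual measures.

```python
# Rough sketch of percentile-cutpoint dichotomization: each candidate predictor
# is split at a percentile, players with >= 2 of 3 flagged factors form the
# high-risk group, and a risk ratio with 95% CI is computed. Hypothetical data.
import numpy as np

rng = np.random.default_rng(1)
n = 83                                           # cohort size from the abstract
core_endurance = rng.normal(120, 30, n)          # hypothetical core-endurance time (s)
back_survey = rng.normal(80, 10, n)              # hypothetical low-back function score
bmi = rng.normal(29, 4, n)                       # hypothetical body mass index
injured = rng.random(n) < 0.35                   # hypothetical injury outcomes

def flag(values, percentile, high_is_risky):
    cut = np.percentile(values, percentile)
    return values >= cut if high_is_risky else values <= cut

factors = (
    flag(core_endurance, 50, high_is_risky=False).astype(int)
    + flag(back_survey, 50, high_is_risky=False).astype(int)
    + flag(bmi, 75, high_is_risky=True).astype(int)
)
high_risk = factors >= 2                         # >= 2 of 3 flagged factors

def risk_ratio_ci(exposed, outcome, z=1.96):
    r1, r0 = outcome[exposed].mean(), outcome[~exposed].mean()
    rr = r1 / r0
    se = np.sqrt((1 - r1) / outcome[exposed].sum() + (1 - r0) / outcome[~exposed].sum())
    return rr, np.exp(np.log(rr) - z * se), np.exp(np.log(rr) + z * se)

print(risk_ratio_ci(high_risk, injured))
```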


2017 ◽  
Vol 01 (02) ◽  
pp. E69-E73
Author(s):  
Nikolas Knudsen ◽  
Thomas Andersen

Abstract: The purpose of this study was to evaluate 3 different starting techniques from the staggered stance with regard to sprint time, reaction time, linear impulse, and power. Eleven male amateur American football players volunteered to participate in a testing session consisting of twelve 5-m sprints, 4 in each technique (normal [NORM], backwards false step [BFS], and forwards false step [FFS]) in random order. Sprint starts were performed on force plates to investigate ground reaction forces, reaction time, and total sprint time. Analysis showed significant differences in sprint times, with NORM (1.77 ± 0.10 s) being faster than FFS (1.81 ± 0.12 s) and BFS (2.01 ± 0.13 s), and FFS being faster than BFS, although no differences were found in reaction time. In terms of mean force and power, NORM (331.1 ± 39.2 N, 542.2 ± 72.3 W) and FFS (320.8 ± 43.2 N, 550.9 ± 81.4 W) were significantly larger than BFS (256.9 ± 36.2 N, 443.5 ± 61.1 W). This indicates that when starting from a staggered stance, the BFS is inferior to the other techniques and should be avoided. However, since the force profiles of the NORM and the FFS were similar, the differences in sprint time could arise from a technique bias towards the NORM start.
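The abstract reports linear impulse, mean force, and mean power derived from force-plate recordings. The sketch below shows one conventional way to obtain these quantities from a sampled ground-reaction-force trace; the signal, sampling rate, and player mass are synthetic assumptions, not the study's data or processing pipeline.

```python
# Illustrative sketch: deriving linear impulse, mean force, and mean power from
# a sampled horizontal ground-reaction-force trace. All values are synthetic.
import numpy as np

fs = 1000                                        # sampling rate (Hz), assumed
dt = 1 / fs
t = np.arange(0, 0.6, dt)                        # 0.6-s push-off window, assumed
force = 300 * np.sin(np.pi * t / 0.6)            # synthetic horizontal GRF (N)
mass = 85.0                                      # player mass (kg), assumed

impulse = np.sum(force) * dt                     # linear impulse (N*s)
mean_force = force.mean()                        # mean force over the window (N)
velocity = np.cumsum(force) * dt / mass          # velocity via impulse-momentum theorem
mean_power = np.mean(force * velocity)           # mean power (W)

print(f"impulse={impulse:.1f} N*s, mean force={mean_force:.1f} N, mean power={mean_power:.1f} W")
```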


2020 ◽  
Vol 52 (8) ◽  
pp. 1745-1751 ◽  
Author(s):  
Nikki Rommers ◽  
Roland Rössler ◽  
Evert Verhagen ◽  
Florian Vandecasteele ◽  
Steven Verstockt ◽  
...  

2016 ◽  
Vol 18 (1) ◽  
pp. 65-72 ◽  
Author(s):  
Doug A. King ◽  
Patria A. Hume ◽  
Conor Gissane ◽  
Trevor N. Clark

OBJECTIVE Direct impact with the head and the inertial loading of the head have been postulated as major mechanisms of head-related injuries, such as concussion. METHODS This descriptive observational study was conducted to quantify the head impact acceleration characteristics in under-9-year-old junior rugby union players in New Zealand. The impact magnitude, frequency, and location were collected with a wireless head impact sensor worn by 14 junior rugby players who participated in 4 matches. RESULTS A total of 721 impacts > 10g were recorded. The median number of impacts per player was 46 (interquartile range [IQR] 37–58), resulting in 10 (IQR 4–18) impacts to the head per player per match. The median impact magnitudes recorded were 15g (IQR 12g–21g) for linear acceleration and 2296 rad/s² (IQR 1352–4152 rad/s²) for rotational acceleration. There were 121 impacts (16.8%) above the rotational injury risk limit and 1 impact (0.1%) above the linear injury risk limit. CONCLUSIONS The acceleration magnitude and number of head impacts in junior rugby union players were higher than those previously reported in similar age-group sports participants. The median linear acceleration for the under-9-year-old rugby players was similar to that of 7- to 8-year-old American football players but lower than that of 9- to 12-year-old youth American football players. The median rotational accelerations measured were higher than the medians and 95th percentiles reported in youth, high school, and collegiate American football players.
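The reported summary statistics (medians, IQRs, and counts above injury-risk limits) can be reproduced from a flat array of per-impact peak values, as in this minimal sketch; the values and thresholds below are placeholders, not the study's data or risk limits.

```python
# Minimal sketch of the summary statistics reported above, assuming flat arrays
# of per-impact peak values; all numbers here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(2)
linear_g = rng.lognormal(mean=np.log(15), sigma=0.4, size=721)       # peak linear acc. (g)
rotational = rng.lognormal(mean=np.log(2300), sigma=0.6, size=721)   # peak rotational acc. (rad/s^2)

def median_iqr(x):
    q1, q2, q3 = np.percentile(x, [25, 50, 75])
    return q2, (q1, q3)

print("linear:", median_iqr(linear_g))
print("rotational:", median_iqr(rotational))

# Counts above whatever injury-risk limits are adopted (the thresholds below
# are placeholders, not the limits used in the study).
print(np.sum(linear_g > 80), np.sum(rotational > 6000))
```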


2017 ◽  
Vol 51 (4) ◽  
pp. 392.1-392
Author(s):  
Kathrin Steffen ◽  
Agnethe Nilstad ◽  
Tron Krosshaug ◽  
Kati Pasanen ◽  
Roald Bahr

Author(s):  
David A. K. Kalapa

Football players experience impacts to the head, some of which cause mild traumatic brain injuries known as concussions. Players wear helmets to reduce injury risk, and this study compares two helmets to determine their effectiveness in reducing potential concussions. The helmets analyzed are a “classic” type made of large foam pad pieces and a “new” type made of small honeycomb pads. Both helmets share the same external polycarbonate shell and padding materials. Three helmet-to-helmet collisions are studied: case one, “classic on classic”; case two, “classic on new”; and case three, “new on new.” Using the finite element analysis method, stresses and contact pressures are calculated. For the three collisions with the same applied forces, a player in case one experiences 0.96 MPa contact pressure at the skull, while a player in case three experiences 0.87 MPa. In case two, the player wearing the “new” helmet is exposed to 0.9 MPa at the skull, while the player wearing the “classic” helmet is exposed to 0.95 MPa. It is concluded that if a player uses a “new” instead of a “classic” helmet, pressure on the skull is reduced by 9.4%, reducing the risk of that player sustaining a concussion.
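The 9.4% figure follows directly from the reported like-on-like skull contact pressures (case one versus case three):

\[
\frac{0.96\,\text{MPa} - 0.87\,\text{MPa}}{0.96\,\text{MPa}} \approx 0.094 = 9.4\%
\]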

