Assessment and Training of Visuomotor Reaction Time for Football Injury Prevention

2017, Vol 26 (1), pp. 26-34
Author(s): Gary B. Wilkerson, Kevin A. Simpson, Ryan A. Clark

Context: Neurocognitive reaction time has been associated with musculoskeletal injury risk, but visuomotor reaction time (VMRT) derived from tests that present greater challenges to visual stimulus detection and motor response execution may have a stronger association. Objective: To assess VMRT as a predictor of injury and the extent to which improvement may result from VMRT training. Design: Cohort study. Setting: University athletic performance center. Participants: 76 National Collegiate Athletic Association Division I FCS football players (19.5 ± 1.4 y, 1.85 ± 0.06 m, 102.98 ± 19.06 kg). Interventions: Preparticipation and postseason assessments. A subset of players who exhibited the slowest VMRT relative to the cohort’s postseason median value participated in a 6-wk training program. Main Outcome Measures: Injury occurrence was related to preparticipation VMRT, which was represented by both the number of target hits in 60 s and the average elapsed time between hits (ms). Receiver operating characteristic analysis identified the optimum cut point for a binary injury-risk classification. A nonparametric repeated-measures analysis-of-ranks procedure was used to compare posttraining VMRT values for slow players who completed at least half of the training sessions (n = 15) with those for untrained fast players (n = 27). Results: A preparticipation cut point of ≤85 hits (≥705 ms) discriminated injured from noninjured players with odds ratio = 2.30 (90% confidence interval, 1.05–5.06). Slow players who completed the training exhibited significant improvement in visuomotor performance compared with baseline (standardized response mean = 2.53), whereas untrained players exhibited a small performance decrement (group × trial interaction effect, L2 = 28.74; P < .001). Conclusions: Slow VMRT appears to be an important and modifiable injury risk factor for college football players. More research is needed to refine visuomotor reaction-time screening and training methods and to determine the extent to which improved performance values can reduce injury incidence.
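For readers who want to trace the reported risk classification, the sketch below reproduces the standard 2 × 2 table arithmetic behind an odds ratio with a 90% confidence interval (Woolf log method). The cell counts are hypothetical placeholders, not the study's data.

```python
import math

# Hypothetical 2 x 2 table crossing the binary VMRT classification
# (slow vs fast at the <=85 hits / >=705 ms cut point) with injury status.
# These cell counts are illustrative placeholders, not the study's data.
slow_injured, slow_uninjured = 18, 20
fast_injured, fast_uninjured = 12, 26

odds_ratio = (slow_injured * fast_uninjured) / (slow_uninjured * fast_injured)

# Woolf (log) method for the interval; z = 1.645 gives a 90% CI,
# matching the interval style reported in the abstract.
z = 1.645
se_log_or = math.sqrt(1 / slow_injured + 1 / slow_uninjured
                      + 1 / fast_injured + 1 / fast_uninjured)
lower = math.exp(math.log(odds_ratio) - z * se_log_or)
upper = math.exp(math.log(odds_ratio) + z * se_log_or)

print(f"OR = {odds_ratio:.2f}, 90% CI ({lower:.2f}, {upper:.2f})")
```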

2012, Vol 17 (6), pp. 4-9
Author(s): Gary B. Wilkerson

Context: Prevention of a lower extremity sprain or strain requires some basis for predicting that an individual athlete will sustain such an injury unless a modifiable risk factor is addressed. Objective: To assess the possible existence of an association between reaction time measured during completion of a computerized neurocognitive test battery and subsequent occurrence of a lower extremity sprain or strain. Design: Prospective cohort study. Setting: Preparticipation screening conducted in a computer laboratory on the day prior to initiation of preseason practice sessions. Participants: 76 NCAA Division I FCS football players. Main Outcome Measures: Lower extremity sprains and strains sustained between initiation of preseason practice sessions and the end of an 11-game season. Receiver operating characteristic analysis identified the optimal reaction-time cut point for discrimination between injured and noninjured status. Stratified analyses were performed to evaluate any differential influence of reaction time on injury incidence between starters and nonstarters. Results: A total of 29 lower extremity sprains and strains were sustained by 23 of the 76 players. A reaction-time cut point of ≥.545 s provided good discrimination between injured and noninjured cases: 74% sensitivity, 51% specificity, relative risk = 2.17 (90% CI: 1.10, 4.30), and odds ratio = 2.94 (90% CI: 1.19, 7.25). Conclusions: Neurocognitive reaction time appears to be an indicator of elevated risk for lower extremity sprains and strains among college football players, which may be modifiable through performance of exercises designed to accelerate neurocognitive processing of visual input.
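The optimal cut point in this kind of screening is typically found by scanning candidate thresholds along an ROC curve. A minimal sketch of that search, maximizing Youden's J (sensitivity + specificity − 1) over simulated reaction times, is shown below; none of the values are the study's.

```python
import numpy as np

# Simulated reaction times (s) for injured and noninjured players; the study's
# raw data are not in the abstract, so these values are for illustration only.
rng = np.random.default_rng(0)
reaction_time = np.concatenate([rng.normal(0.56, 0.04, 23),   # injured
                                rng.normal(0.53, 0.04, 53)])  # noninjured
injured = np.concatenate([np.ones(23, dtype=bool), np.zeros(53, dtype=bool)])

best = (-1.0, None, None, None)  # (Youden J, cut point, sensitivity, specificity)
for cut in np.unique(reaction_time):
    screened_positive = reaction_time >= cut  # "slow" at or above the candidate cut point
    sens = (screened_positive & injured).sum() / injured.sum()
    spec = (~screened_positive & ~injured).sum() / (~injured).sum()
    j = sens + spec - 1
    if j > best[0]:
        best = (j, cut, sens, spec)

j, cut, sens, spec = best
print(f"cut point >= {cut:.3f} s: sensitivity {sens:.2f}, specificity {spec:.2f}, J = {j:.2f}")
```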


2014, Vol 16 (1), pp. 17-28
Author(s): Martin Campbell

Purpose – The purpose of this paper is to measure nurses’ knowledge of the Adult Support and Protection (Scotland) Act 2007 before and after a one-day training course using participants’ favoured methods of training activities. Design/methodology/approach – A repeated measures design was used to evaluate the impact of a one-day Adult Support and Protection training course on the pre-training knowledge of community nurses across one NHS area. Participants’ favoured methods of training activities were used in the training. Participants were community nurses working in learning disability, mental health, older people's services, acute services, substance misuse, and accident and emergency. All completed a training needs analysis and training preferences study. Individual and group scores on an Adult Support and Protection knowledge questionnaire were analysed pre- and post-training. Findings – There was a statistically significant increase in scores post-training (Wilcoxon’s signed-ranks test). Individual increases ranged from 2.5 to 27.5 per cent, with a mean increase of 15 per cent. Evaluation of the impact of nationally approved Adult Support and Protection training is needed, and training should take account of participants’ existing knowledge and preferred methods of training delivery to improve the transfer of learning into practice. Research limitations/implications – Participants were self-selecting. Existing knowledge was not controlled for in the sample. No longitudinal follow-up to measure retention of any improvements in knowledge. No control group. Training methods used were based on the expressed preferences of 40 nursing staff, but only 18 of these staff participated in the training day. Originality/value – There is a dearth of research evaluating the impact of adult protection training on staff knowledge and understanding. Designing training activities and content to take account of participant preferences, and of areas where knowledge is weakest, may enhance the effectiveness of training in this area. This research was funded as a Queen’s Nursing Institute Community Project. It builds on a pilot project.
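The pre/post comparison is a standard paired nonparametric test. A minimal sketch, assuming paired percent-correct questionnaire scores (the values below are invented, not the study's), could look like this:

```python
from scipy.stats import wilcoxon

# Paired percent-correct scores on the Adult Support and Protection knowledge
# questionnaire, before and after the training day. Values are invented purely
# to show the shape of the analysis.
pre  = [55.0, 60.0, 47.5, 70.0, 65.0, 52.5, 58.0, 62.5, 50.0, 67.5]
post = [72.5, 75.0, 60.0, 82.5, 77.5, 70.0, 73.0, 80.0, 65.0, 80.0]

stat, p_value = wilcoxon(pre, post)  # Wilcoxon signed-ranks test on paired scores
print(f"W = {stat:.1f}, p = {p_value:.4f}")
```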


2015, Vol 24 (3), pp. 293-299
Author(s): Kazem Malmir, Gholam Reza Olyaei, Saeed Talebian, Ali Ashraf Jamshidi

Context: Cyclic movements and muscle fatigue may result in musculoskeletal injuries by inducing changes in neuromuscular control. Ankle frontal-plane neuromuscular control has rarely been studied in spite of its importance. Objective: To compare the effects of peroneal muscle fatigue and a cyclic passive-inversion (CPI) protocol on ankle neuromuscular control during a lateral hop. Design: Quasi-experimental, repeated measures. Setting: University laboratory. Participants: 22 recreationally active, healthy men with no history of ankle sprain or giving way. Interventions: Participants performed a lateral hop before and after 2 interventions on a Biodex dynamometer. They were randomly assigned to intervention order, and the interventions were 1 wk apart. The passive intervention included 40 CPIs at 5°/s through 80% of maximum range of motion, and the fatigue intervention involved an isometric eversion at 40% of the maximal voluntary isometric contraction until the torque decreased to 50% of its initial value. Main Outcome Measures: Median frequency of the peroneus longus during the fatigue protocol, energy absorption by the viscoelastic tissues during the CPI protocol, and feedforward onset and reaction time of the peroneus longus during landing. Results: A significant fall in median frequency (P < .05) and a significant decrease in energy absorption (P < .05) confirmed fatigue and a change in viscoelastic behavior, respectively. There was a significant main effect of condition on feedforward onset and reaction time (P < .05). No significant main effect of intervention or intervention × condition interaction was noted (P > .05). There was a significant difference between pre- and postintervention measures (P < .0125), but no significant difference was found between postintervention measures (P > .0125). Conclusions: Both fatigue and the CPI may similarly impair ankle neuromuscular control. Thus, in prolonged sports competitions and exercises, the ankle may be injured due to either fatigue or changes in the biomechanical properties of the viscoelastic tissues.
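The median-frequency criterion for fatigue is conventionally computed from an EMG power spectrum. The sketch below illustrates one common approach (a Welch spectral estimate, then the frequency that splits cumulative power in half); it is not the authors' processing pipeline, and the signals are simulated.

```python
import numpy as np
from scipy.signal import welch

def median_frequency(emg, fs=1000):
    """Frequency that splits the EMG power spectrum into two equal halves;
    a downward shift across a sustained contraction is the usual fatigue marker."""
    freqs, psd = welch(emg, fs=fs, nperseg=256)
    cumulative = np.cumsum(psd)
    return freqs[np.searchsorted(cumulative, cumulative[-1] / 2)]

# Simulated 1-s epochs of peroneus longus EMG at 1 kHz: the "late" epoch is
# low-pass filtered noise, so its median frequency comes out lower.
rng = np.random.default_rng(1)
early_epoch = rng.normal(size=1000)
late_epoch = np.convolve(rng.normal(size=1000), np.ones(5) / 5, mode="same")

print(median_frequency(early_epoch), median_frequency(late_epoch))
```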


2017, Vol 26 (6), pp. 536-543
Author(s): Brandon M. Ness, Kory Zimney, William E. Schweinle

Context: Injury risk factors and relevant assessments have been identified in women’s soccer athletes. Other tests assess fitness (eg, the Gauntlet Test [GT]). However, little empirical support exists for the utility of the GT to predict time-loss injury. Objectives: To examine the GT as a predictor of injury in intercollegiate Division I female soccer athletes. Design: Retrospective, nonexperimental descriptive cohort study. Setting: College athletic facilities. Participants: 71 female Division I soccer athletes (age 19.6 ± 1.24 y, BMI 23.0 ± 2.19). Main Outcome Measures: GT, demographic, and injury data were collected over 3 consecutive seasons. GT trials were administered by coaching staff each preseason. Participation in team-based activities (practices, matches) was restricted until a successful GT trial. Soccer-related injuries that resulted in time loss from participation were recorded. Results: 71 subjects met the inclusion criteria, with 12 lower-body time-loss injuries sustained. Logistic regression models indicated that with each unsuccessful GT attempt, the odds of sustaining an injury increased by a factor of 3.5 (P < .02). The Youden index identified a cut point of 2 GT trials to success, at which sensitivity = .92 and specificity = .46. For successive GT trials before success (1, 2, or 3), the predicted probabilities of injury were .063, .194, and .463, respectively. Conclusions: The GT appears to be a convenient and predictive screen for potential lower-body injuries among female soccer athletes in this cohort. Further investigation into the appropriate application of the GT for injury prediction is warranted given the scope of this study.
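The predicted probabilities follow directly from the logistic model's odds scale. The sketch below back-calculates them from the reported odds ratio of 3.5 per unsuccessful attempt; the baseline odds value is an assumption chosen to roughly match the abstract's figures, not a coefficient from the paper.

```python
# Back-of-the-envelope reconstruction of the reported logistic relationship:
# each unsuccessful Gauntlet Test attempt multiplies the odds of injury by ~3.5,
# and probability follows from p = odds / (1 + odds). The baseline odds value is
# an assumption chosen to roughly reproduce the abstract's probabilities, not a
# coefficient taken from the paper.
odds_ratio_per_attempt = 3.5
baseline_odds = 0.019  # assumed odds of injury with zero unsuccessful attempts

for attempts in (1, 2, 3):
    odds = baseline_odds * odds_ratio_per_attempt ** attempts
    prob = odds / (1 + odds)
    print(f"{attempts} unsuccessful attempt(s): predicted injury probability {prob:.3f}")
```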


2015, Vol 47, pp. 11
Author(s): Vincent C. Nittoli, Tracy A. Dierks, Michael D. Justiss, Gary B. Wilkerson

2017, Vol 12 (6), pp. 749-755
Author(s): Nick B. Murray, Tim J. Gabbett, Andrew D. Townshend

Objectives: To investigate the relationship between the proportion of preseason training sessions completed and load and injury during the ensuing Australian Football League season. Design: Single-cohort, observational study. Methods: Forty-six elite male Australian football players from 1 club participated. Players were divided into 3 equal groups based on the amount of preseason training completed (high [HTL], >85% of sessions completed; medium [MTL], 50–85% of sessions completed; and low [LTL], <50% of sessions completed). Global positioning system (GPS) technology was used to record training and game loads, with all injuries recorded and classified by club medical staff. Differences between groups were analyzed using a 2-way (group × training/competition phase) repeated-measures ANOVA, along with magnitude-based inferences. Injury incidence was expressed as injuries per 1000 h. Results: The HTL and MTL groups completed a greater proportion of in-season training sessions (81.1% and 74.2%) and matches (76.7% and 76.1%) than the LTL group (56.9% and 52.7%). Total distance and player load were significantly greater during the first half of the in-season period for the HTL (P = .03, ES = 0.88) and MTL (P = .02, ES = 0.93) groups than for the LTL group. The relative risk of injury for the LTL group (26.8/1000 h) was 1.9 times that for the HTL group (14.2/1000 h) (χ2 = 3.48, df = 2, P = .17). Conclusions: Completing a greater proportion of preseason training resulted in higher training loads and greater participation in training and competition during the competitive phase of the season.
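The incidence and relative-risk figures follow simple exposure-adjusted arithmetic, sketched below. The injury counts and exposure hours are placeholders back-calculated to match the reported rates; they are not the study's actual figures.

```python
# Sketch of the incidence and relative-risk arithmetic in the abstract. The
# injury counts and exposure hours are placeholders back-calculated to match
# the reported rates; they are not the study's actual figures.
def incidence_per_1000h(injuries, exposure_hours):
    return injuries / exposure_hours * 1000

htl_rate = incidence_per_1000h(injuries=9, exposure_hours=634)   # ~14.2 / 1000 h
ltl_rate = incidence_per_1000h(injuries=12, exposure_hours=448)  # ~26.8 / 1000 h

relative_risk = ltl_rate / htl_rate
print(f"LTL {ltl_rate:.1f}/1000 h vs HTL {htl_rate:.1f}/1000 h, RR = {relative_risk:.1f}")
```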


2017, Vol 26 (1), pp. 8-14
Author(s): Samantha E. Scarneo, Hayley J. Root, Jessica C. Martinez, Craig Denegar, Douglas J. Casa, ...

Context: Neuromuscular training programs (NTPs) improve landing technique and decrease vertical ground-reaction forces (VGRFs), resulting in injury-risk reduction. NTPs in an aquatic environment may elicit the same improvements as land-based programs with reduced joint stress. Objective: To examine the effects of an aquatic NTP on landing technique, as measured by the Landing Error Scoring System (LESS) and VGRFs, immediately and 4 mo after the intervention. Design and Setting: Repeated measures; pool and laboratory. Participants: Fifteen healthy, recreationally active women (age 21 ± 2 y, mass 62.02 ± 8.18 kg, height 164.74 ± 5.97 cm) who demonstrated poor landing technique (LESS-Real Time > 4). Interventions: All participants completed an aquatic NTP 3 times/wk for 6 wk. Main Outcome Measures: Participants’ landing technique was evaluated using a jump-landing task immediately before (PRE), immediately after (POST), and 4 mo after (RET) the intervention period. A single rater, blinded to time point, graded all videos using the LESS, which is a valid and reliable movement-screening tool. Peak VGRFs were measured during the stance phase of the jump-landing test. Repeated-measures analyses of variance with planned comparisons were performed to explore differences between time points. Results: LESS scores were lower at POST (4.46 ± 1.69 errors) and at RET (4.2 ± 1.72 errors) than at PRE (6.30 ± 1.78 errors) (P < .01). No significant differences were observed between POST and RET (P > .05). Participants also landed with significantly lower peak VGRFs (P < .01) from PRE (2.69 ± .72 N) to POST (2.23 ± .66 N). Conclusions: The findings introduce evidence that an aquatic NTP improves landing technique and suggest that improvements are retained over time. These results show promise for using an aquatic NTP when there is a desire to reduce joint loading, such as in the early stages of rehabilitation, to improve biomechanics and reduce injury risk.
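A minimal sketch of the analysis structure (a repeated-measures ANOVA across PRE, POST, and RET with planned paired comparisons) is given below; the LESS scores are simulated around the reported means, and the statsmodels AnovaRM routine stands in for whatever software the authors actually used.

```python
import numpy as np
import pandas as pd
from scipy.stats import ttest_rel
from statsmodels.stats.anova import AnovaRM

# LESS scores simulated around the reported means (PRE ~6.3, POST ~4.5, RET ~4.2)
# for 15 participants, purely to show the structure of the repeated-measures
# ANOVA with planned comparisons; these are not the study's data.
rng = np.random.default_rng(2)
n = 15
data = pd.DataFrame({
    "subject": np.tile(np.arange(n), 3),
    "time": np.repeat(["PRE", "POST", "RET"], n),
    "less": np.concatenate([rng.normal(6.3, 1.8, n),
                            rng.normal(4.5, 1.7, n),
                            rng.normal(4.2, 1.7, n)]),
})

print(AnovaRM(data, depvar="less", subject="subject", within=["time"]).fit())

# Planned comparisons on the paired scores: PRE vs POST and POST vs RET.
pre = data.loc[data.time == "PRE", "less"].to_numpy()
post = data.loc[data.time == "POST", "less"].to_numpy()
ret = data.loc[data.time == "RET", "less"].to_numpy()
print(ttest_rel(pre, post), ttest_rel(post, ret))
```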


2015, Vol 24 (4), pp. 349-352
Author(s): Giuliamarta Bottoni, Dieter Heinrich, Philipp Kofler, Michael Hasler, Werner Nachbauer

Context: During sport activity, knee proprioception might worsen. This decrease in proprioceptive acuity negatively influences motor control and therefore may increase injury risk. Hiking is a common activity characterized by a higher-intensity-exercise phase during uphill walking and a lower-intensity-exercise phase during downhill walking. Pain and injuries are reported in hiking, especially during the downhill phase. Objective: To examine the effect of a hiking-fatigue protocol on joint-position sense. Design: Repeated measures. Setting: University research laboratory. Participants: 24 nonprofessional sportswomen without knee injuries. Main Outcome Measures: Joint-position sense was tested at the beginning, after 30 min uphill walking, and after 30 min downhill walking on a treadmill (continuous protocol). Results: After downhill walking, joint-position sense was significantly worse than in the test at the beginning (P = .035, α = .05). After uphill walking, no differences were observed in comparison with the test at the beginning (P = .172, α = .05) or the test after downhill walking (P = .165, α = .05). Conclusion: Downhill walking causes impairment in knee-joint-position sense. Considering these results, injury-prevention protocols for hiking should focus on maintaining and improving knee proprioception during the descending phase.


2019, Vol 14 (4), pp. 498-506
Author(s): Cameron S Dyer, Robin Callister, Colin E Sanctuary, Suzanne J Snodgrass

Research is limited as to whether Functional Movement Screen scores relate to non-contact injury risk in rugby league players. This cohort study investigates whether the Functional Movement Screen score predicts non-contact injuries in elite adolescent rugby league players. Australian adolescent rugby league players (n = 52; mean age 16.0 ± 1.0 years) from one club participated in this study. Functional Movement Screen scores, height, and mass were collected at the beginning of the preseason. Training, match exposure, and injury incidence data (non-contact match and training injuries with three levels of severity) were recorded for each individual athlete throughout the season. Linear and logistic regression analyses were conducted to investigate the association between Functional Movement Screen score (continuous score, ≤ 14 or > 14, and three subscores) and injury risk, whilst controlling for exposure time. The mean Functional Movement Screen score for the sample was 13.4 (95% CI: 11.0–14.0). A total of 72 non-contact injuries were recorded (incidence rate: 18.7 per 1000 exposure hours; 95% CI: 11.6–24.8). There were no statistically significant associations between non-contact injury and Functional Movement Screen score for any of the analyses conducted. Our results suggest that the Functional Movement Screen does not reflect non-contact injury risk in elite adolescent rugby league players. Further research should investigate whether a more sport-specific movement screen in the preseason can more effectively predict injury risk in this population.
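As a rough illustration of the incidence-rate calculation, the sketch below attaches an exact Poisson 95% CI to a rate per 1000 exposure hours. The exposure figure is back-calculated from the reported rate, and the paper's interval was likely computed differently, so the printed numbers will not match the abstract exactly.

```python
from scipy.stats import chi2

# Injury-incidence arithmetic with an exact Poisson 95% CI. The exposure figure
# is a placeholder back-calculated from the reported rate (72 injuries at
# ~18.7 per 1000 h); the paper's interval was likely computed differently, so
# the numbers printed here will not match the abstract exactly.
injuries = 72
exposure_hours = injuries / 18.7 * 1000  # ~3850 h implied by the reported rate

rate = injuries / exposure_hours * 1000
lower = chi2.ppf(0.025, 2 * injuries) / 2 / exposure_hours * 1000
upper = chi2.ppf(0.975, 2 * (injuries + 1)) / 2 / exposure_hours * 1000
print(f"{rate:.1f} per 1000 exposure hours (95% CI {lower:.1f}-{upper:.1f})")
```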


2017, Vol 26 (5), pp. 386-395
Author(s): Candice Martin, Benita Olivier, Natalie Benjamin

Context: The Functional Movement Screen (FMS) has been found to be a valid preparticipation screening tool for the prediction of injury among various athletes in different sports. Its validity for the prediction of injury among adolescent cricketers is yet to be established. Objective: To determine whether a preseason FMS total score is a valid predictor of in-season injury among adolescent pace bowlers. Design: Prospective observational quantitative study. Setting: Bowlers performed the FMS before the start of the season. Injury incidence was monitored monthly throughout the season. The Student t test and Fisher’s exact test were used to compare the FMS scores of the injured and noninjured bowlers, as well as of the injured and noninjured bowlers who scored ≤ 14. Participants: 27 injury-free, male, adolescent pace bowlers. Main Outcome Measures: The FMS (scoring criteria and score sheet) and a standardized self-administered injury questionnaire. Results: There was no difference in FMS scores between the noninjured group (16.55 ± 2.57) and the injured group (16.1 ± 2.07). There was no significant difference between injured and noninjured bowlers who scored ≤ 14. A total FMS score of 14 does not provide the sensitivity needed to assess injury risk among adolescent pace bowlers, and no other accurate cut-off score could be calculated. Conclusion: The observed preseason total FMS score is a poor predictor of in-season injury among adolescent pace bowlers. Further research should be conducted to determine whether a specific FMS test would be a more valid predictor of injury.
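A minimal sketch of the two reported comparisons (an independent-samples t test on total FMS scores and Fisher's exact test at the ≤14 cut point) appears below; all scores and counts are illustrative stand-ins, not the study's raw data.

```python
from scipy.stats import fisher_exact, ttest_ind

# Illustrative stand-in FMS totals for 10 injured and 17 noninjured bowlers;
# the study's raw scores are not reported in the abstract.
injured_scores    = [14, 16, 17, 15, 18, 16, 17, 15, 18, 15]
noninjured_scores = [17, 16, 19, 15, 18, 16, 17, 14, 18, 17, 19, 16, 15, 17, 18, 16, 19]

t_stat, t_p = ttest_ind(injured_scores, noninjured_scores)

# 2 x 2 table at the <=14 cut point: rows = FMS <= 14 / FMS > 14,
# columns = injured / noninjured.
cut = 14
table = [[sum(s <= cut for s in injured_scores), sum(s <= cut for s in noninjured_scores)],
         [sum(s > cut for s in injured_scores),  sum(s > cut for s in noninjured_scores)]]
odds_ratio, fisher_p = fisher_exact(table)

print(f"t = {t_stat:.2f} (p = {t_p:.3f}); Fisher OR = {odds_ratio:.2f} (p = {fisher_p:.3f})")
```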

