Between-Seasons Test-Retest Reliability of Clinically Measured Reaction Time in National Collegiate Athletic Association Division I Athletes

2011 ◽  
Vol 46 (4) ◽  
pp. 409-414 ◽  
Author(s):  
James T. Eckner ◽  
Jeffrey S. Kutcher ◽  
James K. Richardson

Context: Reaction time is typically impaired after concussion. A clinical test of reaction time (RTclin) that does not require a computer to administer may be a valuable tool to assist in concussion diagnosis and management. Objective: To determine the test-retest reliability of RTclin measured over successive seasons in competitive collegiate athletes and to compare these results with a computerized measure of reaction time (RTcomp). Design: Case series with repeated measures. Setting: Preparticipation physical examinations for the football, women's soccer, and wrestling teams at a single university. Patients or Other Participants: 102 National Collegiate Athletic Association Division I athletes. Intervention(s): The RTclin was measured using a measuring stick embedded in a weighted rubber disk that was released and caught as quickly as possible. The RTcomp was measured using the simple reaction time component of CogState Sport. Main Outcome Measure(s): Data were collected at 2 time points, 1 season apart, during preparticipation physical examinations. Outcomes were mean simple RTclin and RTcomp. Results: The intraclass correlation coefficient estimates from season 1 to season 2 were 0.645 for RTclin (n = 102, entire sample) and 0.512 for RTcomp (n = 62 athletes who had 2 consecutive valid baseline CogState Sport test sessions). Conclusions: The test-retest reliability of RTclin over consecutive seasons compared favorably with that of a concurrently tested computerized measure of reaction time and with literature-based estimates of computerized reaction time measures. This finding supports the potential use of RTclin as part of a multifaceted concussion assessment battery. Further prospective study is warranted.
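The season-to-season agreement statistics reported above can be computed for any paired baseline data with a two-way random-effects intraclass correlation. A minimal sketch of ICC(2,1), with made-up reaction-time data for illustration (the study's raw scores are not published):

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random-effects, absolute-agreement, single-measure
    intraclass correlation. data has shape (n_subjects, k_sessions)."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()   # between-subjects SS
    ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()   # between-sessions SS
    ss_err = ((data - grand) ** 2).sum() - ss_rows - ss_cols # residual SS
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical season-1 / season-2 reaction times (ms) for five athletes
rt = np.array([[203, 199], [215, 220], [190, 196], [230, 224], [205, 214]])
print(round(icc_2_1(rt), 3))  # 0.896
```

Values near 1 indicate strong test-retest agreement; the 0.645 and 0.512 coefficients reported above are often interpreted as moderate reliability.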

2021 ◽  
pp. 1-9
Author(s):  
Adam J. Wells ◽  
Bri-ana D.I. Johnson

Context: The Dynavision D2™ Mode A test (ModeA) is a 1-minute reaction time (RT) test commonly used in sports science research and clinical rehabilitation. However, there are limited data regarding the effect of repeated testing (ie, training) or subsequent periods of no testing (ie, detraining) on test–retest reliability and RT performance. Therefore, the purpose of this study was to examine the test–retest reliability, training, and detraining effects associated with the D2™ ModeA test. Design: Repeated measures/reliability. Methods: Twenty-four recreationally active men and women completed 15 training sessions consisting of 2 ModeA tests per session (30 tests). The participants were then randomized to either 1 or 2 weeks of detraining prior to completing 15 retraining sessions (30 tests). The training and retraining periods were separated into 10 blocks for analysis (3 tests per block). The number of hits (hits) and the average RT per hit (AvgRT) within each block were used to determine RT performance. Intraclass correlation coefficients, SEM, and minimum difference were used to determine reliability. Repeated-measures analysis of variance/analysis of covariance were used to determine training and detraining effects, respectively. Results: The ModeA variables demonstrated excellent test–retest reliability (ICC2,3 > .93). Significant improvements in hits and AvgRT were noted within training blocks 1 to 5 (P < .05). No further improvements in RT performance were noted between training blocks 6 through 10. There was no effect of detraining period on RT. The RT performance was not different between blocks during retraining. Conclusions: It appears that 15 tests are necessary to overcome the training effect and establish reliable baseline performance for the ModeA test. Detraining for 1 to 2 weeks did not impact RT performance.
The authors recommend that investigators and clinicians utilize the average of 3 tests when assessing RT performance using the D2 ModeA test.


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Kathleen A. Holoyda ◽  
Daniel P. Donato ◽  
David A. Magno-Padron ◽  
Andrew M. Simpson ◽  
Jayant P. Agarwal

Abstract Background The rates, severity and consequences of hand and wrist injuries sustained by National Collegiate Athletic Association athletes are not well characterized. This study describes the epidemiology of hand and wrist injuries among collegiate athletes competing in different divisions. Methods The National Collegiate Athletic Association Injury Surveillance Program (NCAA-ISP) was accessed from 2004 to 2015 for the following sports: baseball, basketball, football, ice hockey, lacrosse, soccer, wrestling, field hockey, gymnastics, softball and volleyball. The data were used to identify all hand and wrist injuries, the specific injury diagnosis, mean time loss of activity following injury, and need for surgery following injury. These were then stratified by gender. Descriptive statistics were performed to examine the association between sports, event type and division. Student's t test was used to calculate p-values for independent variables. The chi-square test was used to calculate odds ratios. P < 0.05 was considered significant. Results 103,098 hand and wrist injuries were reported in the studied NCAA sports from 2004 to 2015. Male athletes sustained 72,423 injuries (6.01/10,000 athlete exposures) and female athletes sustained 30,675 injuries (4.13/10,000 athlete exposures). Division I athletes sustained significantly more injuries compared to divisions II and III. Overall, 3.78% of hand and wrist injuries required surgical intervention. A significantly higher percentage of division I athletes (both male and female) underwent surgical intervention compared to divisions II and III. The mean time lost due to hand and wrist injury was 7.14 days for all athletes. Division I athletes missed the fewest days due to injury at 6.29 days, though this difference was not significant. Conclusions Hand and wrist injuries are common among collegiate athletes.
Division I athletes sustain higher rates of injuries and higher surgical intervention rates, while tending to miss fewer days due to injury. Improved characterization of divisional differences in hand and wrist injuries can assist injury management and prevention.
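The exposure-adjusted rates quoted above follow the standard definition of injuries per 10,000 athlete-exposures. A small sketch of that calculation (the implied exposure count is back-calculated from the reported figures, not taken from the paper):

```python
def rate_per_10k(injuries, athlete_exposures):
    """Injury rate per 10,000 athlete-exposures (AE), the unit used above."""
    return injuries / athlete_exposures * 10_000

# The reported male figures (72,423 injuries at 6.01/10,000 AE) imply
# roughly 120 million athlete-exposures over the 2004-2015 window:
implied_ae = 72_423 / 6.01 * 10_000
print(round(implied_ae / 1e6, 1))  # ≈ 120.5 (million)
```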


2010 ◽  
Vol 45 (4) ◽  
pp. 327-332 ◽  
Author(s):  
James T. Eckner ◽  
Jeffrey S. Kutcher ◽  
James K. Richardson

Abstract Context: Evidence suggests that concussion prolongs reaction time (RT). We have developed a simple, reliable clinical tool for measuring reaction time that may be of value in the assessment of concussion in athletes. Objective: To compare baseline values of clinical RT (RTclin) obtained using the new clinical reaction time apparatus with computerized RT (RTcomp) obtained using a validated computerized neuropsychological test battery. Design: Cross-sectional study. Setting: Data were collected during a National Collegiate Athletic Association Division I collegiate football team's preparticipation physical examination session. Patients or Other Participants: Ninety-four Division I collegiate football players. Main Outcome Measure(s): The RTclin was measured using a 1.3-m measuring stick embedded in a weighted rubber disk that was released and caught as quickly as possible. The RTcomp was measured using the simple RT component of CogState Sport. Results: For the 68 athletes whose CogState Sport tests passed the program's integrity check, RTclin and RTcomp were correlated (r = 0.445, P < .001). Overall, mean RTclin was shorter and less variable than mean RTcomp (203 ± 20 milliseconds versus 268 ± 44 milliseconds; P < .001). When RTclin and RTcomp were compared between those athletes with (n = 68) and those without (n = 26) valid CogState Sport test sessions, mean RTclin was similar (202 ± 19 milliseconds versus 207 ± 23 milliseconds; P = .390), but mean RTcomp was different (258 ± 35 milliseconds versus 290 ± 55 milliseconds; P = .009). Conclusions: The RTclin was positively correlated with RTcomp and yielded more consistent reaction time values during baseline testing. Given that RTclin is easy to measure using simple, inexpensive equipment, further prospective study is warranted to determine its clinical utility in the assessment of concussion in athletes.
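The drop-stick apparatus converts the distance the stick falls before the catch into a reaction time through the kinematics of free fall, t = √(2d/g). This conversion is standard physics rather than the authors' published code; a sketch:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def fall_distance_to_rt_ms(distance_m):
    """Convert the catch distance on a freely falling stick to reaction time (ms),
    assuming release from rest: d = 0.5 * g * t^2  =>  t = sqrt(2d/g)."""
    return math.sqrt(2.0 * distance_m / G) * 1000.0

# A catch ~20 cm down the stick corresponds to roughly 200 ms,
# consistent with the mean RTclin of 203 ms reported above.
print(round(fall_distance_to_rt_ms(0.20)))  # 202
```

Note that the full 1.3-m stick length corresponds to about 515 ms, so the apparatus comfortably spans the range of plausible human simple reaction times.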


2021 ◽  
Vol 9 (2) ◽  
pp. 232596712098228 ◽  
Author(s):  
Biagio Zampogna ◽  
Sebastiano Vasta ◽  
Guglielmo Torre ◽  
Akhil Gupta ◽  
Carolyn M. Hettrich ◽  
...  

Background: Anterior cruciate ligament (ACL) tears are common in collegiate athletes. The rate of return to the preinjury level of sport activities after ACL reconstruction continues to evolve. Purpose/Hypothesis: The purpose was to determine the return-to-sport rate after ACL reconstruction in a cohort of National Collegiate Athletic Association Division I athletes in different sports. It was hypothesized that, with intensive supervision of rehabilitation, the return-to-sport rate would be optimal. Study Design: Case series; Level of evidence, 4. Methods: We retrospectively reviewed the records of 75 collegiate athletes from a single institution who had undergone unilateral or bilateral ACL reconstruction between 2001 and 2013 and participated in an extensive supervised rehabilitation program. Prospectively collected athlete data as well as data about preinjury exposure, associated lesions, surgical technique, time lost to injury, number of games missed, time to return to full sport activity or retire, and subsequent surgical procedures were extracted from the medical and athletic trainer records. Results: The 75 patients (40 male, 35 female; mean age, 20.1 years) underwent 81 reconstruction procedures (73 primary, 8 revision). The mean follow-up was 19.3 months. The overall return-to-sport rate was 92%. After reconstruction, 9 athletes (12%) retired from collegiate sports, but 3 of them returned to sport activities after graduation. Overall, 8 athletes (11%) experienced an ACL graft retear. Conclusion: The return-to-sport rate in our National Collegiate Athletic Association Division I athletes compared favorably with that reported in other studies in the literature. The strict follow-up by the surgeon, together with the high-profile, almost daily technical and psychological support given mainly by the athletic trainers during the recovery period, may have contributed to preparing the athletes for a competitive rate of return to sport at their preinjury level.


Sports ◽  
2018 ◽  
Vol 6 (4) ◽  
pp. 133 ◽  
Author(s):  
Christopher Sole ◽  
Timothy Suchomel ◽  
Michael Stone

The purpose of this analysis was to construct a preliminary scale of reference values for reactive strength index-modified (RSImod). Countermovement jump data from 151 National Collegiate Athletic Association (NCAA) Division I collegiate athletes (male n = 76; female n = 75) were analyzed. Using percentiles, scales for both male and female samples were constructed. For further analysis, athletes were separated into four performance groups based on RSImod, and comparisons of jump height (JH) and time to takeoff (TTT) were performed. RSImod values ranged from 0.208 to 0.704 and 0.135 to 0.553 in males and females, respectively. Males had greater RSImod (p < 0.001, d = 1.15) and JH (p < 0.001, d = 1.41) as compared to females. No statistically significant difference was observed for TTT between males and females (p = 0.909, d = 0.02). Only JH was found to be statistically different between all performance groups. For TTT, no statistical differences were observed when comparing the top two and middle two groups for males and the top two, bottom two, and middle two groups for females. Similarities in TTT between sexes and across performance groups suggest JH is a primary factor contributing to differences in RSImod. The results of this analysis provide practitioners with additional insight as well as a scale of reference values for evaluating RSImod scores in collegiate athletes.
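RSImod is conventionally computed as jump height divided by time to takeoff, which is why the authors decompose group differences into JH and TTT. A sketch of the calculation and a percentile-based reference scale of the kind described (the scores below are illustrative, not the study's data):

```python
import numpy as np

def rsi_mod(jump_height_m, time_to_takeoff_s):
    """Reactive strength index-modified: jump height / time to takeoff."""
    return jump_height_m / time_to_takeoff_s

# Illustrative: a 0.35 m countermovement jump with a 0.75 s time to takeoff
print(round(rsi_mod(0.35, 0.75), 3))  # 0.467

# Percentile-based reference scale built from a sample of RSImod scores
# (illustrative values spanning roughly the male range reported above)
scores = np.array([0.21, 0.28, 0.33, 0.38, 0.42, 0.47, 0.55, 0.63, 0.70])
quartile_cuts = np.percentile(scores, [25, 50, 75])
print(quartile_cuts)  # cut points separating four performance groups
```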


2021 ◽  
Author(s):  
Kathleen Holoyda ◽  
Daniel Donato ◽  
David Magno-Padron ◽  
Andrew Simpson ◽  
Jayant Agarwal

Abstract Background: The rates, severity and consequences of hand and wrist injuries sustained by NCAA athletes are not well characterized. This study describes the epidemiology of hand and wrist injuries among collegiate athletes competing in different divisions. Methods: The National Collegiate Athletic Association Injury Surveillance Program (NCAA-ISP) was accessed for various sports from 2004 to 2015. Data were stratified by hand and wrist injuries sustained, mean loss of activity time following the injury, male and female sport, need for surgery following injury, and division. Descriptive statistics were performed to examine the association between sports, event type and gender. P < 0.05 was considered significant. Results: 103,098 hand and wrist injuries were reported in all NCAA sports from 2004 to 2015. Male athletes sustained 72,423 injuries (6.01/10,000 athlete exposures) and female athletes sustained 30,675 injuries (4.13/10,000 athlete exposures). Division I athletes sustained significantly more injuries compared to divisions II and III. Overall, 3.78% of hand and wrist injuries required surgical intervention. A significantly higher percentage of division I athletes (both male and female) underwent surgical intervention compared to divisions II and III. The mean time lost due to hand and wrist injury was 7.14 days for all athletes. Division I athletes tended to miss fewer days due to injury, though this difference was not significant. Conclusions: Hand and wrist injuries are common among collegiate athletes. Division I athletes sustain higher rates of injuries and higher surgical intervention rates, while tending to miss fewer days due to injury. Improved characterization of divisional differences in hand and wrist injuries can assist injury management and prevention.


2021 ◽  
Vol 6 (1) ◽  
pp. 4
Author(s):  
W. Guy Hornsby ◽  
Abigail L. Tice ◽  
Jason D. Stone ◽  
Justin J. Merrigan ◽  
Joshua Hagen ◽  
...  

The purpose of this longitudinal, descriptive study was to observe changes in maximal strength measured via isometric clean grip mid-thigh pull and home runs (total and home runs per game) across three years of training and three competitive seasons for four National Collegiate Athletic Association (NCAA) Division 1 baseball players. A one-way repeated measures analysis of variance (ANOVA) was performed, revealing significant univariate effects of time for peak force (PF) (p = 0.003) and peak force allometrically scaled (PFa) (p = 0.002). Increases in PF were noted from season 1 to season 2 (p = 0.031) and season 3 (p = 0.004), but season 2 was not significantly different than season 3 (p = 0.232). Additionally, increases in PFa were noted from season 1 to season 2 (p = 0.010) and season 3 (p < 0.001), but season 2 was not significantly different than season 3 (p = 0.052). Home runs per game rose from the 2009 (0.32) to 2010 season (1.35) and dropped during the 2011 season (1.07). A unique aspect of the study involves 2010 being the season in which ball-bat coefficient of restitution (BBCOR) bats were introduced to the NCAA competition.
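Allometric scaling normalizes peak force to body mass raised to a theoretical exponent, conventionally 2/3, so that strength can be compared across athletes of different sizes. The players' body masses are not reported above, so the numbers below are illustrative only:

```python
def allometric_scale(peak_force_n, body_mass_kg, exponent=2/3):
    """Allometrically scaled peak force: PF / mass^b, with b = 2/3 by convention
    (force scales with muscle cross-sectional area, ~mass^(2/3))."""
    return peak_force_n / body_mass_kg ** exponent

# Illustrative: a 3000 N isometric mid-thigh pull by a 90 kg player
print(round(allometric_scale(3000.0, 90.0), 1))  # ≈ 149.4 N per kg^(2/3)
```

Scaling in this way lets a season-over-season gain in PFa be read as a genuine strength improvement rather than an artifact of added body mass.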


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Yanzhi Bi ◽  
Xin Hou ◽  
Jiahui Zhong ◽  
Li Hu

Abstract Pain perception is a subjective experience that is highly variable across time. Brain responses evoked by nociceptive stimuli are strongly associated with pain perception and also show considerable variability. To date, the test–retest reliability of laser-evoked pain perception and its associated brain responses across sessions remains unclear. Here, an experiment with a within-subject repeated-measures design was performed in 22 healthy volunteers. Radiant-heat laser stimuli were delivered on subjects' left-hand dorsum in two sessions separated by 1–5 days. We observed that laser-evoked pain perception declined significantly across sessions, coupled with decreased brain responses in the bilateral primary somatosensory cortex (S1), right primary motor cortex, supplementary motor area, and middle cingulate cortex. Intraclass correlation coefficients between the two sessions showed "fair" to "moderate" test–retest reliability for pain perception and brain responses. Additionally, we observed lower resting-state brain activity in the right S1 and lower resting-state functional connectivity between right S1 and dorsolateral prefrontal cortex in the second session than in the first. Altogether, being possibly influenced by changes of baseline mental state, laser-evoked pain perception and brain responses showed considerable across-session variability. This phenomenon should be considered when designing experiments for laboratory studies and evaluating pain abnormalities in clinical practice.


Author(s):  
Kyeongtak Song ◽  
Johanna M. Hoch ◽  
Carolina Quintana ◽  
Nicholas R. Heebner ◽  
Matthew C. Hoch

Author(s):  
Janet E. Simon ◽  
Mallory Lorence ◽  
Carrie L. Docherty

Context The effect of athletic participation on lifelong health among elite athletes has received increasing attention, as sport-related injuries can have a substantial influence on long-term health. Objective To determine the current health-related quality of life (HRQoL) of former National Collegiate Athletic Association Division I athletes compared with noncollegiate athletes 5 years after an initial assessment. Design Cohort study. Setting Online survey. Patients or Other Participants From the former Division I athletes, 193 responses were received (response rate = 83.2%; 128 men, 65 women; age = 58.47 ± 6.17 years), and from the noncollegiate athletes, 169 surveys were returned (response rate = 75.1%; 80 men, 89 women; age = 58.44 ± 7.28 years). Main Outcome Measure(s) The independent variables were time (baseline, 5 years later) and group (former Division I athlete, noncollegiate athlete). Participants completed 7 Patient-Reported Outcomes Measurement Information System scales: sleep disturbance, anxiety, depression, fatigue, pain interference, physical function, and satisfaction with participation in social roles. Results Sleep disturbance, depression, fatigue, pain, and physical function were significant for a time × group interaction (P < .05), with the largest differences seen in pain and physical function between groups at time point 2 (22.19 and 13.99 points, respectively). Former Division I athletes had worse scores for depression, fatigue, pain, and physical function at follow-up (P < .05), with the largest differences seen on the depression, fatigue, and physical function scales (8.33, 6.23, and 6.61 points, respectively). Conclusions Because of the competitive nature of sport, the long-term risks of diminished HRQoL need to become a priority for health care providers and athletes during their athletic careers.
Additionally, physical activity transition programs need to be explored to help senior student-athletes transition from highly structured and competitive collegiate athletics to lifestyle physical activity, as it appears that individuals in the noncollegiate athlete cohort engaged in more physical activity, weighed less, and had increased HRQoL.

