Evaluating Performance of National Hockey League Players After a Concussion Versus Lower Body Injury

2019
Vol 54 (5)
pp. 534-540
Author(s):
Kathryn L. Van Pelt
Andrew P. Lapointe
Michelle C. Galdys
Lauren A. Dougherty
Thomas A. Buckley
...

Context: Concussions elicit changes in brain function that may extend well beyond clinical symptom recovery. Whether these changes produce meaningful deficits outside the laboratory environment is unclear. The results of player performance postconcussion within professional sports have been mixed.
Objective: To determine whether National Hockey League (NHL) players with concussions performed worse after returning to sport than players with lower body injuries or uninjured players.
Design: Cohort study.
Setting: Publicly available Web sites that compiled injury and player statistics of NHL players.
Patients or Other Participants: Male NHL players who missed games due to a concussion (n = 22), lower body injury (n = 21), or noninjury (ie, personal reason or season break; n = 13) during the 2013–2014 and 2014–2015 regular seasons. Data on concussed athletes were used to identify similar players with lower body injury and noninjury based on (1) position, (2) time loss, (3) time on the ice, and (4) team.
Main Outcome Measure(s): The primary performance metric was a modified plus-minus statistic, calculated by weighting each player's plus-minus metric by his team's simple rating system to account for varying team performance. Linear mixed models assessed the relationship between injury type (concussion, lower body, or noninjury) and performance (plus-minus score).
Results: We observed a quadratic effect for a time² × group interaction (χ²(2) = 8.85, P = .01). This interaction revealed that the concussion and lower body injury groups had similar patterns of an initial decrease in performance (ie, 2 weeks after return to play), followed by an increase compared with the uninjured group in weeks 5 and 6. Meanwhile, the uninjured group had an initial increase in performance. We observed no group × linear time interaction (P = .47) and no overall group effect (P = .57).
Conclusions: The NHL players in the concussion and lower body injury groups displayed similar performance impairments. Both injured cohorts experienced an initial decrease in performance at weeks 1 to 2 after return to play, followed by improved performance at weeks 5 to 6, suggesting that the performance implications of concussion may be short lived.
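The analysis described above, a linear mixed model with a quadratic time term and a group interaction, can be illustrated with a short sketch. This is a minimal example on synthetic data, assuming a random intercept per player and a patsy-style formula; the column names and toy trajectories are invented for illustration and are not the study's data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for group in ("concussion", "lower_body", "uninjured"):
    for player in range(15):
        for week in range(1, 7):
            # Toy trajectories: injured groups dip in weeks 1-2, then recover.
            dip = -0.5 if group != "uninjured" and week <= 2 else 0.1 * week
            rows.append({"player": f"{group}_{player}", "group": group,
                         "week": week, "mod_pm": dip + rng.normal(0, 1)})
df = pd.DataFrame(rows)

# Random intercept per player; I(week**2) supplies the quadratic time term
# whose interaction with group is the effect reported in the abstract.
model = smf.mixedlm("mod_pm ~ group * (week + I(week**2))",
                    data=df, groups=df["player"])
print(model.fit().summary())
```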

1988
Vol 255 (1)
pp. R149-R156
Author(s):
K. Sander-Jensen
J. Mehlsen
C. Stadeager
N. J. Christensen
J. Fahrenkrug
...

Progressive central hypovolemia is characterized by a normotensive, tachycardic stage followed by a reversible, hypotensive stage with slowing of the heart rate (HR). We investigated circulatory changes and arterial hormone concentrations in response to lower-body negative pressure (LBNP) in six volunteers before and after atropine administration. LBNP of 55 mmHg initially resulted in an increase in HR from 55 ± 4 to 90 ± 5 beats/min and decreases in mean arterial pressure (MAP) from 94 ± 4 to 81 ± 5 mmHg, in central venous pressure from 7 ± 1 to −3 ± 1 mmHg, and in cardiac output from 6.1 ± 0.5 to 3.7 ± 0.1 l/min. Concomitantly, epinephrine and norepinephrine levels increased. After 8.2 ± 2.3 min of LBNP, the MAP had decreased to 41 ± 7 mmHg and HR had decreased to 57 ± 3 beats/min. Vasopressin increased from 1.2 ± 0.3 to 137 ± 45 pg/ml, and renin activity increased from 1.45 ± 4.0 to 3.80 ± 1.0 ng·ml⁻¹·h⁻¹, with no further changes in epinephrine, norepinephrine, or vasoactive intestinal polypeptide. A tardy rise in pancreatic polypeptide indicated increased vagal activity. After atropine, LBNP also caused an initial increase in HR, which, however, remained elevated during the subsequent decrease in MAP to 45 ± 6 mmHg occurring after 8.1 ± 2.4 min. (ABSTRACT TRUNCATED AT 250 WORDS)


2012
Vol 41 (1)
pp. 107-110
Author(s):
Andre Jakoi
Craig O’Neill
Christopher Damsgaard
Keith Fehring
James Tom

Background: Athletic pubalgia is a complex injury that results in loss of play in competitive athletes, especially hockey players. The number of reported sports hernias has been increasing, and their proper management is vital. No studies have reported whether athletes can return to play at preinjury levels.
Purpose: To evaluate the productivity of professional hockey players before an established athletic pubalgia diagnosis and contrast it with their productivity after sports hernia repair.
Study Design: Cohort study; Level of evidence, 3.
Methods: Professional National Hockey League (NHL) players who were reported to have a sports hernia and who underwent surgery from 2001 to 2008 were identified. Statistics were gathered on each player's previous 2 full seasons and compared with his statistics for the 2 full seasons after surgery. Data were gathered on games played, goals, assists, average time on ice, and time of productivity. Players were divided into 3 groups: group A incorporated all players, group B comprised players with 6 or fewer seasons of play, and group C consisted of players with 7 or more seasons of play. A control group was chosen to account for player deterioration or improvement over a career; each player selected for the study had a corresponding control player with the same career tenure and position during the same years.
Results: Forty-three hockey players were identified as having had sports hernia repairs from 2001 to 2008; ultimately, 80% returned to play 2 or more full seasons. Group A had statistically significant decreases in games played, goals scored, and assists. Compared with the control group, the decreases in games played and assists remained significant. Group C showed statistically significant decreases in games played, goals scored, assists, and average time on ice over the following 2 seasons, which also held in comparison with the control group. Group B (16 players) showed a statistically significant decrease only in games played versus the control group.
Conclusion: Players who undergo sports hernia surgery return to play and often perform at a level similar to their presurgery level. Players with 7 or more full seasons return but with significant decreases in their overall performance levels. Less veteran players were able to return to play without any statistically significant decrease in performance and are likely the best candidates for repair once the injury is incurred.
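The matched pre/post design lends itself to a brief sketch. The abstract does not name the statistical test used, so the paired comparisons, sample sizes per arm, and toy numbers below are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical per-player totals over 2 seasons (not study data):
# pre/post surgery for injured players, and the same pair of seasons
# for tenure- and position-matched controls.
pre_injured = rng.normal(30, 8, size=43)
post_injured = pre_injured - rng.normal(3, 4, size=43)
pre_control = rng.normal(30, 8, size=43)
post_control = pre_control - rng.normal(1, 4, size=43)

# Within-player change, then injured change vs matched-control change.
t_within, p_within = stats.ttest_rel(pre_injured, post_injured)
t_vs_ctrl, p_vs_ctrl = stats.ttest_rel(post_injured - pre_injured,
                                       post_control - pre_control)
print(f"pre vs post (injured): t = {t_within:.2f}, p = {p_within:.3f}")
print(f"change vs controls:    t = {t_vs_ctrl:.2f}, p = {p_vs_ctrl:.3f}")
```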


2017
Vol 10 (2)
pp. 169-174
Author(s):
Moin Khan
Kim Madden
M. Tyrrell Burrus
Joseph P. Rogowski
Jeff Stotts
...

Background: Professional basketball players in the National Basketball Association (NBA) subject their lower extremities to significant repetitive loading during both regular-season and off-season training. Little is known about the incidence of lower extremity bony stress injuries and their impact on return to play and performance in these athletes.
Hypothesis: Stress injuries of the lower extremity will have a significant impact on performance.
Study Design: Case series.
Level of Evidence: Level 4.
Methods: All bony stress injuries from 2005 to 2015 were identified from the NBA. The number of games missed due to injury and performance statistics were collected from 2 years prior to injury to 2 years after the injury. A linear regression analysis was performed to determine the impact of injury for players who returned to sport.
Results: A total of 76 lower extremity bony stress injuries involving 75 NBA players (mean age, 25.4 ± 4.1 years) were identified. Fifty-five percent (42/76) involved the foot, and most injuries occurred during the regular season (82.9%, 63/76), with half occurring within the first 6 weeks. Among players who sustained a fifth metatarsal stress fracture, 42.9% were unable to return to professional play. Players who sustained stress injuries had reduced play performance, specifically in number of games played (P = 0.014) and number of steals per game (P = 0.004). Players who had surgery had significantly better performance at 2 years than those who were managed nonoperatively, independent of the type of injury (β = 4.561; 95% CI, 1.255-7.868).
Conclusion: Lower extremity bony stress injuries may significantly affect both short- and long-term player performance and career length. Stress injuries result in decreased player performance, and surgical intervention results in improved performance metrics compared with conservative treatment.
Clinical Relevance: Stress injuries result in decreased player performance, and surgical intervention results in improved performance metrics.
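The regression mentioned in the Methods can be sketched briefly. The data frame, its columns, and the surgery coding below are hypothetical stand-ins; the abstract reports only the resulting β and its 95% CI.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 76
df = pd.DataFrame({
    # 1 = surgical management, 0 = nonoperative (hypothetical coding)
    "surgery": rng.integers(0, 2, size=n),
    "baseline_perf": rng.normal(15, 5, size=n),
})
# Toy outcome: performance 2 years post-injury, with a surgery benefit.
df["perf_2yr"] = (0.8 * df["baseline_perf"] + 4.5 * df["surgery"]
                  + rng.normal(0, 4, size=n))

fit = smf.ols("perf_2yr ~ baseline_perf + surgery", data=df).fit()
print(fit.params["surgery"])          # beta for the surgery term
print(fit.conf_int().loc["surgery"])  # its 95% CI, as the study reports
```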


Neurology
2019
Vol 93 (14 Supplement 1)
pp. S13.2-S14
Author(s):
Adam Harrison
Steven Broglio
R. Davis Moore
Andrew Lapointe
Michael McCrea

Objective: To longitudinally assess recovery following concussion in male athletes with a family history of neurodegenerative disease (fhNDD).
Background: Research suggests that a family history of neurodegenerative disease may predispose an athlete to abnormal recovery following brain injury. However, no one has longitudinally assessed recovery following concussion in male athletes with fhNDD.
Design/Methods: Data from the NCAA-DOD Grand Alliance: Concussion Assessment, Research, and Education (CARE) Consortium were used to compare male athletes with (n = 51) and without (n = 102) a family history of neurodegenerative disease (Parkinson's, Alzheimer's, non-Alzheimer's dementia, and mild cognitive impairment). All athletes completed baseline ImPACT assessments prior to the beginning of their sporting season. Athletes who sustained a concussion were then re-evaluated 24–48 hours post-injury, prior to unrestricted return to play (RTP), and again 6 months post-injury. Athletes without fhNDD were double matched based on age, body mass index, sport category, and concussion history.
Results: Repeated-measures ANCOVA models were used to evaluate performance at each post-injury timepoint while controlling for baseline performance. A group × time interaction was observed for visual memory performance. Post hoc univariate analyses revealed that male athletes with fhNDD demonstrated significantly poorer visual memory performance 24–48 hours post-injury compared with controls (p ≤ 0.005). Additionally, we found a main effect of group for impulse control, indicating that male athletes with fhNDD demonstrated an increased number of impulse errors at all three post-injury evaluations (p ≤ 0.004). We did not observe any other group differences (p's > 0.05).
Conclusions: Our results suggest that male athletes with a family history of neurodegenerative disease may exhibit greater post-injury cognitive deficits compared with controls. Additionally, some deficits may persist for at least 6 months post-injury. Further research is warranted to investigate the interaction between family history of neurodegenerative disease and concussion.
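For readers unfamiliar with baseline-adjusted group comparisons, a single-timepoint ANCOVA can be sketched as below. The study itself used repeated-measures ANCOVA models across all post-injury timepoints; the synthetic scores and column names here are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_fhndd, n_ctrl = 51, 102
df = pd.DataFrame({
    "group": ["fhNDD"] * n_fhndd + ["control"] * n_ctrl,
    "baseline_vis_mem": rng.normal(80, 10, size=n_fhndd + n_ctrl),
})
# Toy 24-48 h post-injury scores: the fhNDD group drops slightly more.
drop = np.where(df["group"] == "fhNDD", 8.0, 3.0)
df["acute_vis_mem"] = df["baseline_vis_mem"] - drop + rng.normal(0, 5, len(df))

# ANCOVA: group effect on the acute score, controlling for baseline.
fit = smf.ols("acute_vis_mem ~ group + baseline_vis_mem", data=df).fit()
print(fit.summary().tables[1])
```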


2018
Vol 13 (6)
pp. 789-794
Author(s):
Neil D. Clarke
Darren L. Richardson
James Thie
Richard Taylor

Context: Caffeine, often in the form of coffee, is frequently used as a supplement by athletes in an attempt to facilitate improved performance during exercise.
Purpose: To investigate the effectiveness of coffee ingestion as an ergogenic aid prior to a 1-mile (1609-m) race.
Methods: In a double-blind, randomized, crossover, placebo-controlled design, 13 trained male runners completed a 1-mile race 60 minutes following the ingestion of 0.09 g·kg−1 coffee (COF), 0.09 g·kg−1 decaffeinated coffee (DEC), or a placebo (PLA). All trials were dissolved in 300 mL of hot water.
Results: Race completion time was 1.3% faster following the ingestion of COF (04:35.37 [00:10.51] min:s.ms) compared with DEC (04:39.14 [00:11.21] min:s.ms; P = .018; 95% confidence interval [CI], −0.11 to −0.01; d = 0.32) and 1.9% faster compared with PLA (04:41.00 [00:09.57] min:s.ms; P = .006; 95% CI, −0.15 to −0.03; d = 0.51). A large trial × time interaction for salivary caffeine concentration was observed (P < .001), with a very large increase (6.40 [1.57] μg·mL−1; 95% CI, 5.5-7.3; d = 3.86) following the ingestion of COF. However, only a trivial difference between DEC and PLA was observed (P = .602; 95% CI, −0.09 to 0.03; d = 0.17). Furthermore, only trivial differences between trials were observed for blood glucose (P = .839), lactate (P = .096), and maximal heart rate (P = .286).
Conclusions: The results of this study show that 60 minutes after the ingestion of 0.09 g·kg−1 of caffeinated coffee, 1-mile race performance was enhanced by 1.9% and 1.3% compared with placebo and decaffeinated coffee, respectively, in trained male runners.
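Because the dose is prescribed per kilogram of body mass, a quick worked example may help; the 70-kg runner and the helper function below are illustrative assumptions.

```python
def coffee_dose_g(body_mass_kg: float, dose_g_per_kg: float = 0.09) -> float:
    """Grams of coffee to dissolve in 300 mL of hot water."""
    return body_mass_kg * dose_g_per_kg

# e.g., a 70-kg runner would ingest 0.09 * 70 = 6.3 g of coffee
print(coffee_dose_g(70.0))  # ~6.3
```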


2017
Vol 12 (5)
pp. 634-641
Author(s):
Dean Ritchie
Will G. Hopkins
Martin Buchheit
Justin Cordy
Jonathan D. Bartlett

Context: Training volume, intensity, and distribution are important factors during periods of return to play.
Purpose: To quantify the effect of injury on training load (TL) before and after return to play (RTP) in professional Australian Rules football.
Methods: Perceived training load (RPE-TL) for 44 players was obtained for all indoor and outdoor training sessions, while field-based training was monitored via GPS (total distance, high-speed running, mean speed). When a player sustained a competition time-loss injury, weekly TL was quantified for 3 wk before and after RTP. General linear mixed models, with inference about magnitudes standardized by between-players SDs, were used to quantify the effects of lower- and upper-body injury on TL compared with the team.
Results: While total RPE-TL was similar to the team's 2 wk before RTP, training distribution was different: skills RPE-TL was likely and most likely lower for upper- and lower-body injury, respectively, and most likely replaced with small to very large increases in running and other conditioning load. Weekly total distance and high-speed running were most likely moderately to largely reduced for lower- and upper-body injury until after RTP, at which point total RPE-TL, training distribution, total distance, and high-speed running were similar to the team's. Mean speed of field-based training was similar before and after RTP compared with the team.
Conclusions: Although injured athletes obtain TLs comparable to those of uninjured players, their training distribution differs until after RTP, indicating the importance of monitoring all types of training that athletes complete.
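RPE-based training load is commonly computed as session RPE multiplied by session duration (the session-RPE convention). The abstract does not spell out its derivation of RPE-TL, so the sketch below assumes that convention; the session values are invented.

```python
from dataclasses import dataclass

@dataclass
class Session:
    minutes: float
    rpe: float  # rating of perceived exertion, CR-10 scale

def weekly_rpe_tl(sessions: list[Session]) -> float:
    """Weekly training load as the sum of session RPE x duration
    (the common session-RPE convention; the paper's exact derivation
    of RPE-TL is not given in the abstract)."""
    return sum(s.minutes * s.rpe for s in sessions)

week = [Session(60, 7), Session(45, 4), Session(90, 6)]
print(weekly_rpe_tl(week))  # 60*7 + 45*4 + 90*6 = 1140
```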


1990
Vol 112 (1)
pp. 27-34
Author(s):
A. Haraldsdottir
P. T. Kabamba
A. G. Ulsoy

This paper presents a design procedure for linear time-invariant systems using output proportional plus derivative feedback. The derivative feedback is shown to improve controller performance in the presence of parameter variations and disturbance inputs, at the cost of increased noise response. A quadratic performance index, with additional terms to penalize disturbance and noise response and eigenvalue and response sensitivity, is used as the basis for a design method. The proposed method is a generalization of one presented previously for the case of perfect state and state derivative feedback. The method is illustrated on a simple first-order example and on the design of a controller for the lateral dynamics of an L-1011 aircraft. The results indicate that improved performance is obtained through the addition of perfect output derivative feedback; however, much of that improvement is lost when approximate differentiation is used.
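The central trade-off, derivative feedback desensitizing the closed-loop dynamics to parameter variations at the price of noise amplification, can be checked numerically on a first-order plant. The gains and numbers below are illustrative assumptions, not the paper's design procedure.

```python
import numpy as np

def closed_loop_pole(a: float, b: float, kp: float, kd: float) -> float:
    """Pole of x' = a*x + b*u with u = -(kp*y + kd*y'), y = x.

    Substituting gives (1 + b*kd) x' = (a - b*kp) x, so the
    closed-loop eigenvalue is (a - b*kp) / (1 + b*kd).
    """
    return (a - b * kp) / (1 + b * kd)

a_nom, b, kp = 1.0, 1.0, 3.0
for kd in (0.0, 1.0, 4.0):
    poles = [closed_loop_pole(a, b, kp, kd)
             for a in (0.8 * a_nom, a_nom, 1.2 * a_nom)]
    # Larger kd shrinks the pole's spread under a +/-20% variation in a,
    # mirroring the sensitivity argument; in practice an approximate
    # differentiator also amplifies measurement noise, the stated cost.
    print(f"kd={kd}: poles under +/-20% a-variation -> {np.round(poles, 3)}")
```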


2020
Vol 4 (4)
pp. 786-791
Author(s):
Hasani W. Swindell
Kyle L. McCormick
Liana J. Tedesco
Carl L. Herndon
Christopher S. Ahmad
...

2021
Vol 3 (1)
pp. 95-122
Author(s):
Kilho Shin
Taichi Ishikawa
Yu-Lu Liu
David Lawrence Shepard

The subpath kernel is a class of positive definite kernels defined over trees, which has the following advantages for the purposes of classification, regression, and clustering: it can be incorporated into a variety of powerful kernel machines, including SVMs; it is invariant to whether input trees are ordered or unordered; it can be computed by fast linear-time algorithms; and, finally, its excellent learning performance has been proven through intensive experiments in the literature. In this paper, we leverage recent advances in tree kernels to solve real problems. As an example, we apply our method to the problem of detecting fake e-commerce sites. Although the problem is similar to phishing-site detection, the fact that mimicking existing authentic sites is harmful for fake e-commerce sites marks a clear difference between the two problems. We focus on fake e-commerce site detection for three reasons: e-commerce fraud is a real problem that companies and law enforcement have been cooperating to solve; existing approaches are hampered by inefficiency because datasets tend to be large, while subpath kernel learning overcomes these performance challenges; and our method offers increased resiliency against attempts to subvert existing detection methods by incorporating robust features that adversaries cannot change: the DOM trees of websites. Our real-world results are remarkable: our method exhibited accuracy as high as 0.998 when training an SVM with 1000 instances and evaluating accuracy on almost 7000 independent instances. Its generalization efficiency is also excellent: with only 100 training instances, the accuracy score reached 0.996.
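To make the kernel-machine pipeline concrete, here is a naive subpath-count kernel plugged into an SVM via a precomputed Gram matrix. The toy trees, the max_len cap, and the quadratic-time enumeration are simplifications for illustration; the paper relies on much faster linear-time algorithms and real DOM trees.

```python
from collections import Counter
import numpy as np
from sklearn.svm import SVC

# A tree node is a (label, [children]) tuple.
def subpaths(tree, max_len=3):
    """Count label sequences along downward (vertical) segments of
    root-to-leaf paths, up to max_len; every subpath ends at exactly
    one node, so walking the tree once enumerates them all."""
    counts = Counter()
    def walk(node, ancestors):
        label, children = node
        path = ancestors + [label]
        for i in range(max(0, len(path) - max_len), len(path)):
            counts[tuple(path[i:])] += 1
        for child in children:
            walk(child, path)
    walk(tree, [])
    return counts

def subpath_kernel(t1, t2):
    # Inner product of subpath-count vectors -> positive semidefinite.
    c1, c2 = subpaths(t1), subpaths(t2)
    return sum(v * c2[k] for k, v in c1.items())

# Toy "DOM trees" (hypothetical, not the paper's data)
trees = [
    ("html", [("body", [("div", [("a", [])]), ("div", [])])]),
    ("html", [("body", [("div", [("a", [])]), ("div", [("a", [])])])]),
    ("html", [("body", [("table", [("tr", [])])])]),
    ("html", [("body", [("table", [("tr", []), ("tr", [])])])]),
]
labels = np.array([1, 1, 0, 0])

# Gram matrix for an SVM with a precomputed kernel.
G = np.array([[subpath_kernel(a, b) for b in trees] for a in trees])
clf = SVC(kernel="precomputed").fit(G, labels)
print(clf.predict(G))  # sanity check on the training set
```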


Author(s):
Zhenqiu Liu
Kelin Xu
Yanfeng Jiang
Ning Cai
Jiahui Fan
...

Background: Predictions of primary liver cancer (PLC) incidence rates and case numbers are critical for understanding and planning for the PLC disease burden.
Methods: Data on PLC incidence rates and case numbers from 1990 to 2017 were retrieved from the Global Burden of Disease database. The estimated average percentage change (EAPC) was calculated to quantify the trends of PLC age-standardized incidence rates (ASRs). Bayesian age-period-cohort models were constructed to project PLC incidence rates and case numbers through 2030.
Results: Globally, the PLC case number doubled from 472 300 in 1990 to 953 100 in 2017. The case number is projected to increase further, to 1 571 200 in 2030, with the ASR increasing from 11.80 per 100 000 in 2018 to 14.08 per 100 000 in 2030. The most pronounced increases are observed in people afflicted by non-alcoholic steatohepatitis (NASH) and in older people. The trends of PLC incidence rates between 1990 and 2030 are heterogeneous among countries and can be summarized in five scenarios: (i) 46 countries that have experienced, and will continue to experience, a persistent increase (e.g. Australia); (ii) 21 countries that experienced an initial decrease (or remained stable) but are predicted to increase (e.g. China); (iii) 7 countries that experienced an initial increase but are predicted to remain stable (e.g. USA); (iv) 29 countries that experienced an initial increase but are predicted to decrease (e.g. Egypt); and (v) 82 countries that have experienced, and will continue to experience, a persistent decrease (e.g. Japan).
Conclusion: PLC incidence rates and case numbers are anticipated to increase at the global level through 2030. The increases among people afflicted by NASH and among older people suggest a dearth of attention to these populations in current prevention strategies and highlight their priority in future agendas for global control of PLC.
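EAPC is conventionally estimated by regressing the natural log of the ASR on calendar year and transforming the slope: EAPC = 100 × (exp(β) − 1). The sketch below assumes that convention and uses a made-up rate series, not Global Burden of Disease data.

```python
import numpy as np

def eapc(years, asr):
    """EAPC from a log-linear fit ln(ASR) = alpha + beta*year,
    with EAPC = 100 * (exp(beta) - 1)."""
    beta = np.polyfit(years, np.log(asr), 1)[0]
    return 100.0 * (np.exp(beta) - 1.0)

years = np.arange(1990, 2018)
asr = 9.0 * 1.01 ** (years - 1990)  # toy series growing 1% per year
print(f"EAPC = {eapc(years, asr):.2f}%")  # ~1.00
```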

