The Effect of Concussive Injury on Individual Game Performance in Professional Collision-Sport Athletes

2019 ◽  
Vol 28 (7) ◽  
pp. 769-773
Author(s):  
Corey P. Ochs ◽  
Melissa C. Kay ◽  
Johna K. Register-Mihalik

Clinical Scenario: Athletes in collision sports are at higher risk of concussion due to the physical nature and style of play. Typically, initial clinical recovery occurs within 7 to 10 days; however, even this time frame may result in significant time lost from play. Little previous research has analyzed how individual game performance may be affected upon return to play postconcussion. Focused Clinical Question: Upon return-to-play clearance, how does sport-related concussion affect game performance of professional athletes in collision sports? Summary of Key Findings: All 3 included studies found no significant change in individual performance of professional collision-sport athletes upon returning to play from concussive injury. One study found no difference in performance between NFL athletes who did not miss a single game (returned within 7 d) and those who missed at least 1 game. Another study found that although NFL players showed no change in performance upon returning to play from a sustained concussion, their performance declined in the 2 weeks before the injury was diagnosed and they appeared on the injury report. The final study found no difference in performance or style of play between NHL athletes who missed time due to concussive injury and athletes who missed games for noninjury reasons. Clinical Bottom Line: There was no change in performance upon return from concussive injury, suggesting that players are acutely recovered from the respective concussion before returning to play. This suggests that current policies and management properly evaluate and treat concussed athletes in these professional sports. Strength of Recommendation: Grade C evidence exists that there is no change in individual game performance in professional collision-sport athletes before and after suffering a concussion.

2019 ◽  
Vol 7 (7_suppl5) ◽  
pp. 2325967119S0040
Author(s):  
Kelechi Okoroha ◽  
Bhavik H. Patel ◽  
Yining Lu ◽  
Alexander J. Idarraga ◽  
Brian Forsythe

Objectives: Several studies have examined the incidence and effects of concussions in professional football, baseball, and hockey, but there has been limited evaluation of the effects of concussions in National Basketball Association (NBA) players. This study aims to evaluate the epidemiologic trends of concussions, as well as the effects of concussions on in-game performance, in NBA players. Methods: Publicly available records were searched to include all players who sustained an in-game concussion while playing in the NBA from the beginning of the 1999 NBA season to the conclusion of the 2018 season. For each player, the following variables were collected: date of injury; number of days and games missed before returning to game play; player efficiency rating (PER) in the season of injury, the season preceding the injury, and the season following the injury; position of the injured player; and the incidence of multiple concussions for a single player. Concussion trends before and after the institution of the NBA Concussion Protocol were calculated, as well as the effects on PER after return to play. Results: From the start of the 1999 season to the end of the 2018 season, 185 basketball-related concussions were incurred across 149 NBA players. All players were able to return to play following a first-time concussion after missing an average of 7.7 days and 3.5 games. The NBA Concussion Protocol was instituted ahead of the 2011-2012 season, prior to which there were 5.7 concussions recorded per season, with an average of 6.7 days and 3.0 games missed per first-time concussion. Following the institution of the concussion protocol, there were approximately 11 more concussions recorded per season (16.7 vs. 5.7, P = 0.007), with 1.7 more days missed (8.4 vs. 6.7, P = 0.27) and 0.9 more games missed (3.9 vs. 3.0, P = 0.24) per concussion, compared with prior seasons. Of the 149 players who suffered concussions, 27 (18.1%) were concussed multiple times. There was no difference in the incidence of recurrent concussions within the same season before vs. after the institution of the concussion protocol (4 vs. 5, P > 0.05). PER was almost identical for concussed players in the season prior to the injury, the season in which the injury occurred, and the season following the injury (13.93 vs. 13.94 vs. 13.91, P = 0.998). Conclusion: There has been a significant increase in the incidence of concussions among NBA players following the institution of a league-wide concussion protocol. This likely reflects more accurate reporting secondary to advances in player education, medical knowledge, national media coverage, and standardized testing protocols. Despite this increase in reported concussions, the amount of time missed following injury has remained relatively constant. Player performance as reported by PER was not significantly affected by sustaining a concussion.
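As an informal illustration of the incidence comparison described above, the sketch below runs Welch's t-test on per-season concussion counts before and after the 2011-2012 protocol. The season-by-season counts are invented (the abstract reports only the means of 5.7 and 16.7 per season), so the output will not reproduce the published P value.

```python
# Hypothetical illustration: comparing per-season concussion counts before vs.
# after the 2011-12 NBA Concussion Protocol. The per-season counts below are
# made up for demonstration; only the approximate means come from the abstract.
from scipy import stats

pre_protocol_counts = [4, 6, 7, 5, 6, 5, 7, 6, 5, 6, 5, 6]    # hypothetical, mean ~5.7
post_protocol_counts = [15, 18, 16, 17, 15, 18, 17]           # hypothetical, mean ~16.6

# Welch's t-test (does not assume equal variances across the two eras)
t_stat, p_value = stats.ttest_ind(pre_protocol_counts, post_protocol_counts,
                                  equal_var=False)
print(f"mean before: {sum(pre_protocol_counts) / len(pre_protocol_counts):.1f}")
print(f"mean after:  {sum(post_protocol_counts) / len(post_protocol_counts):.1f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
```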


2019 ◽  
Vol 47 (11) ◽  
pp. 2717-2722 ◽  
Author(s):  
Toufic R. Jildeh ◽  
Kelechi R. Okoroha ◽  
Kevin A. Taylor ◽  
Patrick Buckley ◽  
Samir Mehta ◽  
...  

Background: Concussion injuries are common in professional football players; however, their effect on player performance remains unclear. Purpose: To quantify the effect of concussions on the performance of running backs and wide receivers in professional football. Study Design: Cohort study; Level of evidence, 3. Methods: Concussion data from the National Football League were collected for a period of 4 seasons (2012-2015) for running backs and wide receivers. Age, experience, position, time to return to play, yearly total yards, and touchdowns were recorded. A power rating (total yards divided by 10, plus touchdowns multiplied by 6) was calculated for each player's injury season as well as for the 3 seasons before and after the respective injury. A control group of running backs and wide receivers without an identified concussion injury who competed in the 2012 season was assembled for comparison. Player performance up to 3 seasons before and after the injury season was examined to assess acute and longitudinal changes in player performance. Results: A total of 38 eligible running backs and wide receivers sustained a concussion during the study period. Thirty-four (89%) players were able to return to competition in the same season, missing an average of 1.5 ± 0.9 games; the remaining 4 players returned in the subsequent season. Power ratings for concussed players were similar to those of controls throughout the study period. Concussed players did not suffer an individual performance decline upon returning within the same season. Furthermore, no significant difference in change of power rating was observed in concussed players in the acute (±1 year from injury; −1.2 ± 4.8 vs −1.1 ± 3.9, P = .199) or chronic (±3 years from injury; −3.6 ± 8.0 vs −3.0 ± 4.5, P = .219) setting compared with controls. All concussed players successfully returned to competition in either the index or the following season. Conclusion: A high proportion of National Football League running backs and wide receivers are able to return to play after a concussion injury. These players performed at a similar level in both the acute and long-term period after concussion.
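The power rating used in this study is a simple arithmetic composite. A minimal sketch of that formula is shown below, applied to a hypothetical stat line.

```python
# Minimal sketch of the power rating described in the abstract:
# total yards divided by 10, plus touchdowns multiplied by 6.
def power_rating(total_yards: float, touchdowns: int) -> float:
    """Season power rating for a running back or wide receiver."""
    return total_yards / 10 + touchdowns * 6

# Example: a hypothetical 1,000-yard, 8-touchdown season
print(power_rating(total_yards=1000, touchdowns=8))  # 100.0 + 48 = 148.0
```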


2021 ◽  
Vol 9 (3) ◽  
pp. 232596712110045
Author(s):  
Nicholas J. Vaudreuil ◽  
Amy J. Kennedy ◽  
Stephen J. Lombardo ◽  
F. Daniel Kharrazi

Background: The global pandemic caused by COVID-19 has had far-reaching implications for the world of professional sports. The National Basketball Association (NBA) suspended active regular season play in 2020 after a player tested positive for SARS-CoV-2. No previous studies have examined the impact of COVID-19 on return to play in the NBA. Purpose/Hypothesis: The purpose of this study was to examine performance measures for NBA players who had recovered from COVID-19 and returned to play in the NBA bubble. We hypothesized that these athletes would play fewer minutes and have decreased performance statistics compared with their performance during the 2019-2020 regular season prior to the lockdown and with their career averages. Study Design: Case series; Level of evidence, 4. Methods: NBA players positive for SARS-CoV-2 who played in both the 2019-2020 regular season and the NBA bubble were identified. Data collected included player demographics and player performance statistics. Results: A total of 20 players were included in the study. Players who had recovered from COVID-19 played significantly fewer minutes per game in the NBA bubble (25.8 vs 28.7; P = .04) and made fewer field goals per game (4.6 vs 5.4; P = .02) compared with the season prior to shutdown. While NBA bubble players demonstrated slight decreases in averages for points (P = .06), rebounds (P = .13), assists (P = .23), steals (P = .30), and blocks (P = .71) per game, these were not statistically significant. Aside from an increase in made free throws per game during the bubble (3.3 vs 2.8; P = .04), player performance was not significantly different from career averages. Conclusion: For players who tested positive for SARS-CoV-2 prior to playing in the NBA bubble, this study demonstrated that, although they played significantly fewer minutes per game, their performance was not statistically different from either their pre-COVID 2019-2020 level of play or their career averages.
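Because each player is compared with his own pre-shutdown numbers, the natural analysis is a paired test. The sketch below illustrates that design with invented minutes-per-game values; the abstract reports only the group means (28.7 vs 25.8) and P values, so these numbers are placeholders.

```python
# Hypothetical illustration of the paired design in this study: each player's
# minutes per game before the shutdown vs. in the bubble. All values invented.
from scipy import stats

minutes_regular_season = [30.1, 27.5, 33.2, 25.0, 29.8, 26.4, 31.0, 24.9]  # hypothetical
minutes_in_bubble      = [27.3, 25.0, 30.1, 23.8, 26.5, 24.0, 28.2, 23.0]  # hypothetical

# Paired t-test: each player serves as his own control
t_stat, p_value = stats.ttest_rel(minutes_regular_season, minutes_in_bubble)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```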


2017 ◽  
Vol 33 (5) ◽  
pp. 309-316
Author(s):  
Daniella Schweizer ◽  
Débora Cristina Rother ◽  
Ana Elena Muler ◽  
Ricardo Ribeiro Rodrigues ◽  
Marco Aurélio Pizo ◽  
...  

Abstract: A comprehensive assessment of the effect of disturbances on tropical and subtropical forests is needed to better understand their impacts on forest structure and diversity. Although taxonomic and functional diversity measures have been successfully adopted in this context, phylogenetic diversity metrics are still poorly explored. We compared the phylogenetic structure of the seed rain and the regenerating seedling community in patches of an old-growth Atlantic Forest remnant dominated or not by a ruderal bamboo species, Guadua tagoara. We sampled those patches before and after illegal harvesting of the palm Euterpe edulis, thus assessing whether the harvesting led to changes in the phylogenetic structure of the seed rain and the regenerating community in both patch types. Bamboo-dominated patches showed a significantly higher presence of seed-rain species that were more distantly related to each other in the phylogeny than expected by chance compared with patches without bamboos, but this difference disappeared after palm-heart harvesting. Contrary to what we expected, we did not find significant changes in the phylogenetic structure of seedlings before or after palm-heart harvesting. The phylogenetic structure at the tips of the phylogeny was random overall. The maintenance of a higher presence of distant relatives in the phylogeny of the seedling community suggests, assuming trait conservatism, that despite bamboo dominance and palm-heart harvesting, functional diversity is being preserved at least in the early regenerating stages and within the time frame of the study. However, a higher presence of pioneer taxa after palm-heart harvest indicates that this disturbance may push old-growth areas toward earlier successional stages.
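The "more distantly related than expected by chance" result is the kind of pattern usually quantified with a null-model phylogenetic metric such as the standardized effect size of mean pairwise distance (SES-MPD). The sketch below illustrates that general approach with an invented distance matrix and community; it is not the authors' exact analysis.

```python
# Illustrative sketch of a null-model phylogenetic metric (SES-MPD), not the
# authors' analysis. Pairwise distances and the sampled community are invented.
import itertools
import random
import statistics

# Hypothetical pairwise phylogenetic distances among 5 taxa (arbitrary units)
dist = {
    ("A", "B"): 20, ("A", "C"): 80, ("A", "D"): 90, ("A", "E"): 95,
    ("B", "C"): 80, ("B", "D"): 90, ("B", "E"): 95,
    ("C", "D"): 60, ("C", "E"): 70, ("D", "E"): 40,
}

def mpd(taxa):
    """Mean pairwise phylogenetic distance among the sampled taxa."""
    pairs = itertools.combinations(sorted(taxa), 2)
    return statistics.mean(dist[p] for p in pairs)

observed_community = ["A", "C", "E"]          # e.g., taxa caught in seed-rain traps
species_pool = ["A", "B", "C", "D", "E"]

obs = mpd(observed_community)
# Null model: draw communities of the same richness at random from the pool
null = [mpd(random.sample(species_pool, len(observed_community))) for _ in range(999)]
ses_mpd = (obs - statistics.mean(null)) / statistics.stdev(null)
print(f"observed MPD = {obs:.1f}, SES-MPD = {ses_mpd:.2f}")  # SES-MPD > 0 suggests overdispersion
```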


2019 ◽  
Vol 28 (8) ◽  
pp. 902-905
Author(s):  
Matt Hausmann ◽  
Jacob Ober ◽  
Adam S. Lepley

Clinical Scenario: Ankle sprains are the most prevalent athletic-related musculoskeletal injury treated by athletic trainers, often affecting activities of daily living and delaying return to play. Most of these cases present with pain and swelling in the ankle, resulting in decreased range of motion and strength deficits. Due to these impairments, proper treatment is necessary to avoid additional loss of play and prevent future injuries. Recently, there has been increased use of deep oscillation therapy by clinicians to manage pain and swelling following a variety of injuries, including ankle sprains. However, very little evidence has been produced regarding the clinical effectiveness of deep oscillation therapy, limiting its application in the therapeutic rehabilitation of acute lateral ankle sprains. Clinical Question: Is deep oscillation therapy effective in reducing pain and swelling in patients with acute lateral ankle sprains compared with the current standard of care (protection, rest, ice, compression, and elevation)? Summary of Key Findings: The literature was searched for studies of level 2 evidence or higher that investigated deep oscillation therapy for pain and inflammation in patients with lateral ankle sprains. Three randomized controlled trials were located and appraised. One of the 3 studies demonstrated a reduction in pain following 6 weeks of deep oscillation therapy compared with the standard of care or placebo interventions. The 2 other studies, 1 using a 5-day treatment and the other a one-time immediate application, found no differences between deep oscillation therapy and the standard of care. Clinical Bottom Line: There is inconclusive evidence to support the therapeutic use of deep oscillation therapy for reducing pain and swelling in patients with acute lateral ankle sprains above and beyond the current standard of care. In addition, the method of treatment application and the parameters used may influence the effectiveness of deep oscillation therapy. Strength of Recommendation: Level B.


Author(s):  
Aoibhinn Ni Shuilleabhain ◽  
Anthony Cronin ◽  
Mark Prendergast

Abstract: In this paper we explore the attitudes of under-privileged secondary school pupils in Ireland towards mathematics and investigate the impact of attending a 4-week engagement programme on these attitudes. The pupils involved in this research attended schools recognized by the Department of Education & Skills as socio-economically deprived. Pupils attending these schools, known as Delivering Equality of Opportunity in Schools (DEIS) schools, are 40% less likely than their counterparts in non-DEIS schools to pursue mathematics at a higher level in state examinations (Smyth, E., McCoy, S. & Kingston, G., 2015, Learning From the Evaluation of DEIS. Dublin: Economic and Social Research Institute). However, little research has reported on these pupils' experiences of and attitudes towards mathematics at senior secondary level. An engagement programme entitled 'Maths Sparks' was purposefully designed for secondary pupils from DEIS schools, with the aim of positively influencing their attitudes towards and confidence in mathematics. The programme consisted of weekly out-of-school workshops exploring extra-curricular mathematics topics, designed and delivered by undergraduate mathematics students. Questionnaires were used to evaluate pupils' attitudes towards mathematics before and after their participation in the programme. Despite the programme's relatively short time frame, qualitative and quantitative analysis suggests an improvement in participating pupils' attitudes towards, enjoyment of, and self-confidence in mathematics due to their participation. Findings also suggest that while these pupils liked the subject of mathematics, their experience of learning it in school was not always positive and was sometimes hindered by the absence of higher-level mathematics as an option in school. The high-stakes examination content and teachers' beliefs about the ability of their students also sometimes negatively affected learners' intentions to pursue mathematics at a higher level. Findings suggest that longitudinal mathematics engagement programmes, which focus on problem solving, involve extra-curricular mathematical concepts, and are presented by undergraduate mathematics students, may provide a valuable way of positively impacting pupils' intentions to pursue the subject.


2018 ◽  
Vol 6 (7_suppl4) ◽  
pp. 2325967118S0008 ◽  
Author(s):  
Jeffrey R. Dugas ◽  
Christopher A. Looze ◽  
Christopher Michael Jones ◽  
Brian L. Walters ◽  
Marcus A. Rothermich ◽  
...  

Objectives: There has been a renewed interest in UCL repair in overhead athletes. This is largely due to greater understanding of UCL pathology, improvements in fixation technology, and the extensive rehabilitation required to return from UCL reconstruction. Initial data regarding UCL repair in overhead athletes were poor, and therefore UCL repair was largely abandoned in favor of reconstruction. However, recent literature examining UCL repair with anchor-only fixation demonstrated an excellent rate of return to play, reduced time to return to play, and a low complication rate. Based on these promising data, we developed a novel technique of UCL repair with internal brace augmentation that we have used in overhead throwing athletes. We performed a prospective study evaluating the outcomes of this procedure with respect to return to play, time to return to play, functional outcome score, and complications. Methods: Overhead athletes undergoing UCL repair with internal brace augmentation were prospectively followed for a minimum of one year. Patients were carefully selected from those who would traditionally be considered for UCL reconstruction. Initially, patients were considered if they had an avulsion of the UCL with otherwise healthy UCL tissue and had a vested interest in a shortened rehabilitation. As the study progressed, interest in a shortened rehabilitation became a less stringent criterion. Demographic and operative data were collected at the time of surgery. These data were compiled for both description and comparison between subgroups. Patients were then contacted 1 year postoperatively and assessed for return to play, time to return to play, and KJOC scores. Complications were documented and patients having complications were detailed. Results: 66 overhead athletes underwent UCL repair with internal brace augmentation during the study period. 8 were lost to follow-up, leaving 58 athletes included in the study. Average age at the time of surgery was 17.9 years. There were 43 baseball pitchers, 8 baseball position players, 4 softball players, 2 football quarterbacks, and 1 javelin thrower. 96% (54/56) of those who desired to return to the same or higher level of competition were able to do so, at an average time of 6.1 months (range 3.2-12 months); 65% of these were able to return in less than 6 months. Many of those who took longer than 6 months did so because of timing within the season. Average KJOC score was 90.2 at 1-year follow-up. 3 patients required return to the operating room, 2 of whom were eventually able to return to their previous level of play. There was 1 late failure over 3 years from the index procedure. Comparative subgroup data are presented in Table 1. Conclusion: UCL repair with internal brace augmentation is a viable option for overhead throwers with selected UCL pathology who wish to return to sport in a shorter time frame than allowed by traditional UCL reconstruction.


Neurology ◽  
2019 ◽  
Vol 93 (14 Supplement 1) ◽  
pp. S25.1-S25
Author(s):  
Taylor Susa ◽  
Marguerite Moore ◽  
Joshua Carlson

Objective: This study analyzed MRI and serum samples from 30 participants across two groups to explore the relationship between protein levels and MRI findings in collegiate athletes after return to play following concussion. Background: Recently, there has been an increase in research on the effects of concussion on protein levels in serum (a derived portion of blood) in concussed versus control groups. Recent research examining serum biomarkers of concussion has found elevated levels of many proteins but, overall, mixed results regarding correlation with MRI. However, these studies have not focused on the lingering effects that persist after return to play. Design/Methods: The first group (n = 15) consisted of collegiate athletes recently cleared to return to play after experiencing a sports-related concussion. The second group (n = 15) consisted of collegiate athlete controls matched on age, sex, and sport. Serum samples were collected to assess protein levels after return to play. These proteins were evaluated using enzyme-linked immunosorbent assay (ELISA) kits. Results: An overall BDNF effect was observed between groups (p < 0.05); the concussed group exhibited significantly higher levels of serum BDNF compared with the control group. A positive association between BDNF and gray matter volume (GMV) was observed at a 250-voxel cluster level in both the right (pFDR = 0.015) and left (pFDR = 0.045) cerebellum across groups. A negative association between BDNF and GMV in both groups was observed in the brainstem (p = 0.029) and the precuneus (p = 0.017). A differential relationship between group and BDNF on GMV was observed in the prefrontal cortex (p = 0.022). Conclusions: Previous research has not examined post-return-to-play effects on neuroplasticity-specific proteins, nor the time frame of injury in comparison with controls on MRI. Serum-based biomarkers and MRI together give a better depiction of what is occurring after return to play.
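The reported BDNF-GMV associations come from a voxel-wise, FDR-corrected analysis; as a rough conceptual illustration only, the sketch below correlates hypothetical serum BDNF values with a single summary GMV value per athlete.

```python
# Illustrative sketch only (not the authors' voxel-wise, FDR-corrected pipeline):
# correlating serum BDNF with one summary gray matter volume (GMV) value per
# athlete. All numbers below are invented for demonstration.
from scipy import stats

bdnf_ng_ml     = [28.1, 31.4, 26.9, 33.0, 29.5, 22.4, 24.0, 21.8, 25.1, 23.3]  # hypothetical serum BDNF
cerebellar_gmv = [0.52, 0.55, 0.50, 0.57, 0.53, 0.47, 0.49, 0.46, 0.50, 0.48]  # hypothetical, arbitrary units

# Pearson correlation: a positive r mirrors the reported positive BDNF-GMV association
r, p_value = stats.pearsonr(bdnf_ng_ml, cerebellar_gmv)
print(f"r = {r:.2f}, p = {p_value:.4f}")
```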


2019 ◽  
Vol 47 (3) ◽  
pp. 713-720 ◽  
Author(s):  
Nathan E. Marshall ◽  
Robert Keller ◽  
Orr Limpisvasti ◽  
Brian Schulz ◽  
Neal ElAttrache

Background: Return to play and player satisfaction have been quite high after ulnar collateral ligament reconstruction (UCLR); however, little has been reported on how outcomes are affected by surgical technique, graft type, and tear characteristics. Purpose: To evaluate the effects of surgical technique, graft type, and tear characteristics on Major League Baseball (MLB) performance after UCLR. Study Design: Cohort study; Level of evidence, 2. Methods: MLB pitchers who underwent primary UCLR at a single institution were included. Tear characteristics included tear location, tear grade, and acuity. Surgical technique and graft type were also collected. Pitching performance statistics, including earned run average (ERA), walks and hits per inning pitched (WHIP), innings pitched, and fastball velocity, were evaluated 3 years before and after UCLR. Results: Forty-six MLB pitchers were identified as having primary UCLR. Return to play was 96%, with 82% returning to MLB play. Surgical technique showed no difference in performance. As compared with pitchers with gracilis grafts, pitchers with palmaris grafts were younger (P = .043), played longer after surgery (P = .012), and returned to play at 100% (35 of 35) versus 82% (9 of 11; P = .010). When compared with pitchers with proximal tears, pitchers with distal tears pitched at higher velocity (93.0 vs 90.6 mph, P = .023) and had better performance before surgery (ERA, P = .003; WHIP, P = .021); however, those with proximal tears improved to match this performance and velocity after reconstruction. As compared with those having partial tears, pitchers with complete tears played longer after surgery (5.9 vs 4.0 years, P = .033), had a better ERA before injury (P = .041), and had better WHIP (P = .037) and strikeouts per 9 innings (P = .025) after reconstruction. Pitchers with chronic tears had a significant improvement in postoperative ERA, from 4.49 to 3.80 (P = .040). Conclusion: Surgical technique and graft type did not affect performance; however, pitchers with palmaris grafts returned at a higher rate than those with gracilis grafts. Distal tears occurred in pitchers with greater velocity and better performance before injury, yet pitchers with proximal tears matched this performance after reconstruction. Pitchers with complete tears played longer after reconstruction. Pitchers who had partial tears had worse performance before injury and after reconstruction, and those with chronic tears saw a significant improvement in ERA with reconstruction.
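ERA and WHIP, the two main performance metrics in this study, have standard definitions; the sketch below computes both for a hypothetical stat line.

```python
# Minimal sketch of the two pitching metrics referenced in the abstract,
# using their standard definitions (the numbers in the example are hypothetical).
def era(earned_runs: int, innings_pitched: float) -> float:
    """Earned run average: earned runs allowed per 9 innings pitched."""
    return 9 * earned_runs / innings_pitched

def whip(walks: int, hits: int, innings_pitched: float) -> float:
    """Walks and hits per inning pitched."""
    return (walks + hits) / innings_pitched

# Example: 150 innings pitched, 68 earned runs, 45 walks, 140 hits allowed
print(f"ERA  = {era(68, 150):.2f}")        # 4.08
print(f"WHIP = {whip(45, 140, 150):.2f}")  # 1.23
```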


2014 ◽  
Vol 13 (1) ◽  
pp. 82-89 ◽  
Author(s):  
Sara Anne Wilkins ◽  
Chevis N. Shannon ◽  
Steven T. Brown ◽  
E. Haley Vance ◽  
Drew Ferguson ◽  
...  

Object: Recent legislation and media coverage have heightened awareness of concussion in youth sports. Previous work by the authors' group identified significant variation in the management of children with concussion. To address this variation, a multidisciplinary concussion program was established based on a uniform management protocol, with emphasis on community outreach via traditional media sources and the Internet. This retrospective study evaluates concussion care and institutional resource utilization before and after standardization at a large regional pediatric hospital center. Methods: This retrospective study included all patients younger than 18 years of age evaluated for sports-related concussion between January 1, 2007, and December 31, 2011. Emergency department, sports medicine, and neurosurgery records were reviewed. Data collected included demographics, injury details, clinical course, Sport Concussion Assessment Tool-2 (SCAT2) scores, imaging, discharge instructions, and referral for specialty care. The cohort was analyzed comparing patients evaluated before and after standardization of care. Results: Five hundred eighty-nine patients were identified, including 270 before standardization (2007–2011) and 319 after standardization (2011–2012). Statistically significant differences (p < 0.0001) were observed between the 2 groups for multiple variables: after adoption of the protocol there were more girls, more first-time concussions, fewer initial presentations to the emergency department, more consistent administration of the SCAT2, and more consistent supervision of return to play and return to think. Conclusions: A combination of increased public awareness and legislation has led to a 5-fold increase in the number of youth athletes presenting for concussion evaluation at the authors' center. Establishment of a multidisciplinary clinic with a standardized protocol resulted in significantly decreased institutional resource utilization and more consistent concussion care for this growing patient population.
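The before/after differences reported here are comparisons of categorical variables between two cohorts. The sketch below illustrates one such comparison (first-time vs. repeat concussions) with a chi-square test; the counts are invented, with only the cohort sizes (270 and 319) taken from the abstract.

```python
# Hypothetical illustration of a pre- vs. post-standardization comparison of a
# categorical variable. The first-time/repeat split is invented for demonstration.
from scipy.stats import chi2_contingency

#                       first-time  repeat
pre_standardization  = [180,        90]    # hypothetical split of the 270 patients
post_standardization = [265,        54]    # hypothetical split of the 319 patients

chi2, p_value, dof, expected = chi2_contingency([pre_standardization,
                                                 post_standardization])
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p_value:.4f}")
```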

