Injury Risk Associated With Sports Specialization and Activity Volume in Youth

2019 ◽  
Vol 7 (9) ◽  
pp. 232596711987012 ◽  
Author(s):  
Alison E. Field ◽  
Frances A. Tepolt ◽  
Daniel S. Yang ◽  
Mininder S. Kocher

Background: Sports specialization has become increasingly common among youth. Purpose/Hypothesis: To investigate the relative importance of specialization vs volume of activity in increasing risk of injury. Hypotheses were that specialization increases the risk of injury and that risk varies by sport. Study Design: Cohort study; Level of evidence, 2. Methods: A prospective analysis was conducted with data collected from 10,138 youth in the Growing Up Today Study—a prospective cohort study of youth throughout the United States—and their mothers. Activity was assessed via questionnaires in 1997, 1998, 1999, and 2001. Sports specialization was defined as engaging in a single sport in the fall, winter, and spring. Injury history was provided by participants’ mothers via questionnaire in 2004. The outcome was incident stress fracture, tendinitis, chondromalacia patella, anterior cruciate ligament tear, or osteochondritis dissecans or osteochondral defect. Results: Females who engaged in sports specialization were at increased risk of injury (hazard ratio [HR], 1.31; 95% CI, 1.07-1.61), but risk varied by sport. Sports specialization was associated with greater volume of physical activity in both sexes (P < .0001). Total hours per week of vigorous activity was predictive of developing injury, regardless of what other variables were included in the statistical model (males: HR, 1.04; 95% CI, 1.02-1.06; females: HR, 1.06; 95% CI, 1.05-1.08). Among females, even those engaging in 3 to 3.9 hours per week less than their age were at a significantly increased risk of injury (HR, 1.93; 95% CI, 1.34-2.77). In males, there was no clear pattern of risk. Conclusion: Sports specialization is associated with a greater volume of vigorous sports activity and increased risk of injury. Parents, coaches, and medical providers need to be made aware of the volume threshold above which physical activity is excessive.
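
The hazard ratios reported here come from time-to-event modeling of the cohort. As a rough illustration of that kind of analysis, the sketch below fits a Cox proportional hazards model with the lifelines library on purely synthetic data; the column names are hypothetical and are not the Growing Up Today Study variables.

```python
# Sketch: Cox proportional hazards model relating weekly vigorous-activity hours
# and sport specialization to time until a first incident injury.
# Synthetic data and hypothetical column names, for illustration only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "vigorous_hours_per_week": rng.uniform(0, 20, n),
    "specialized": rng.integers(0, 2, n),          # 1 = single sport in fall, winter, and spring
    "years_followed": rng.uniform(0.5, 7.0, n),    # follow-up time in years
    "injured": rng.integers(0, 2, n),              # 1 = incident injury observed
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="injured")
cph.print_summary()  # the exp(coef) column is the hazard ratio per unit of each covariate
```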

2019 ◽  
Vol 7 (7_suppl5) ◽  
pp. 2325967119S0039
Author(s):  
Mininder S. Kocher ◽  
Alison E. Field ◽  
Frances Tepolt

Objectives: Sports specialization has become increasingly common among youth. Our goal was to examine the independent prospective associations of sports specialization and volume of activity with injury risk in youth. Methods: A prospective analysis was conducted in 2018 using data collected from 10,138 youth in the Growing Up Today Study, a prospective cohort study of youth throughout the United States, and their mothers. Activity was assessed via questionnaires in 1997, 1998, 1999, and 2001. Sports specialization was defined as engaging in one sport in the fall, winter, and spring. Injury history was provided by participants’ mothers via questionnaire in 2004. The outcome was incident stress fracture, tendinitis, chondromalacia patella, or anterior cruciate ligament tear. Results: Females who engaged in sports specialization were at increased risk of injury (hazard ratio [HR] = 1.31, 95% confidence interval [CI] 1.07-1.61), but risk varied by sport. In both genders, sports specialization was associated with greater volume of physical activity (p < 0.0001). Total hours/week of vigorous activity was predictive of developing injury (males: HR = 1.04, 95% CI 1.02-1.06; females: HR = 1.06, 95% CI 1.05-1.08). Among females, even those engaging in 3-3.9 hours per week less than their age were at a significantly increased risk of injury (HR = 1.93, 95% CI 1.34-2.77). In males, there was no clear pattern of risk. Conclusion: Sports specialization is associated with a higher volume of vigorous sports activity and increased risk of injury. Parents, coaches, and medical providers need to be made aware of the volume threshold above which physical activity is excessive.
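
Because the hazard ratios above are expressed per additional hour of weekly vigorous activity, the implied risk multiplier for a larger weekly volume follows by exponentiating the per-hour ratio, assuming proportional hazards hold across that range. A small worked sketch:

```python
# Sketch: interpreting a per-hour hazard ratio as a risk multiplier for a
# larger weekly volume, assuming proportional hazards apply across the range.
hr_per_hour_female = 1.06   # HR per extra hour/week of vigorous activity (females)
hr_per_hour_male = 1.04     # HR per extra hour/week of vigorous activity (males)

extra_hours = 10            # hypothetical 10 additional hours per week
print(f"females: HR ~ {hr_per_hour_female ** extra_hours:.2f}")  # ~1.79
print(f"males:   HR ~ {hr_per_hour_male ** extra_hours:.2f}")    # ~1.48
```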


2020 ◽  
Vol 8 (8) ◽  
pp. 232596712094632
Author(s):  
Ahmed Khalil Attia ◽  
Hazem Nasef ◽  
Kareem Hussein ElSweify ◽  
Mohammed A. Adam ◽  
Faris AbuShaaban ◽  
...  

Background: Anterior cruciate ligament reconstruction (ACLR) with hamstring autograft has gained popularity. However, an unpredictably small graft diameter has been a drawback of this technique. Smaller graft diameter has been associated with an increased risk of revision, and increasing the number of strands has been reported as a successful technique to increase the graft diameter. Purpose: To compare the failure rates of 5-strand (5HS) and 6-strand (6HS) hamstring autografts with that of the conventional 4-strand (4HS) hamstring autograft. We describe the technique in detail, supplemented by photographs and illustrations, to provide a reproducible technique that avoids the variable and often insufficient 4HS graft diameter reported in the literature. Study Design: Cohort study; Level of evidence, 3. Methods: We retrospectively reviewed prospectively collected data on all primary hamstring autograft ACLRs performed at our institution with a minimum 2-year follow-up and 8.0-mm graft diameter. A total of 413 consecutive knees met the study inclusion and exclusion criteria. The study population was divided into 5HS and 6HS groups as well as a 4HS control group. The primary outcome was failure of ACLR, defined as persistent or recurrent instability and/or revision ACLR. Results: The analysis included 224, 156, and 33 knees in the 5HS, 6HS, and 4HS groups, respectively. The overall ACLR failure rate in this study was 11 cases (8%): 5 cases for 5HS, 3 cases for 6HS, and 3 cases for 4HS. No statistically significant differences were found among groups (P = .06). The mean graft diameter was 9 mm, and the mean follow-up was 44.27 months. Conclusion: The 5HS and 6HS constructs have failure rates similar to that of the conventional 4HS construct of 8.0-mm diameter and are therefore a safe and reliable means of increasing the diameter of relatively small hamstring autografts. We strongly recommend using this technique, when the length of the tendons permits, to avoid the failures reportedly associated with inadequate graft size.
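
The abstract does not name the test behind P = .06, so the sketch below simply illustrates one conventional way to compare failure proportions across the three graft groups using the counts given above; with cells this small, an exact test would be a reasonable alternative.

```python
# Sketch: chi-square test of independence on the 3x2 table of ACLR failures
# by graft construct. Counts are taken from the abstract; the specific test
# used in the paper is not stated there, so this is illustrative only.
from scipy.stats import chi2_contingency

#         failures, non-failures
table = [[5, 224 - 5],    # 5-strand (5HS)
         [3, 156 - 3],    # 6-strand (6HS)
         [3, 33 - 3]]     # 4-strand (4HS)
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```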


2017 ◽  
Vol 5 (2) ◽  
pp. 232596711769194 ◽  
Author(s):  
Matthew J. Kraeutler ◽  
John W. Belk ◽  
Eric C. McCarty

Background: In recent years, several studies have correlated pitch count with an increased risk for injury among baseball pitchers. However, no studies have attempted to draw a similar conclusion based on number of carries by running backs (RBs) in football. Purpose: To determine whether there is a correlation between number of carries by RBs in the National Football League (NFL) and risk of injury or worsened performance in the subsequent season. Study Design: Cohort study; Level of evidence, 3. Methods: The ESPN NFL statistics archives were searched from the 2004 through 2014 regular seasons. During each season, data were collected on RBs with 150 to 250 carries (group A) and 300+ carries (group B). The following data were collected for each player and compared between groups: number of carries and mean yards per carry during the regular season of interest and the subsequent season, number of games missed due to injury during the season of interest and the subsequent season, and the specific injuries resulting in missed playing time during the subsequent season. Matched-pair t tests were used to compare changes within each group from one season to the next in terms of number of carries, mean yards per carry, and games missed due to injury. Results: During the seasons studied, a total of 275 RBs were included (group A, 212; group B, 63). In group A, 140 RBs (66%) missed at least 1 game the subsequent season due to injury, compared with 31 RBs (49%) in group B (P = .016). In fact, players in group B missed significantly fewer games due to injury during the season of interest (P < .0001) as well as the subsequent season (P < .01). Mean yards per carry was not significantly different between groups in the preceding season (P = .073) or the subsequent season (P = .24). Conclusion: NFL RBs with a high number of carries are not placed at greater risk of injury or worsened performance during the subsequent season. These RBs may be generally less injury prone compared with other NFL RBs.
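
A minimal sketch of the matched-pair t test used to compare season-to-season changes, run here on synthetic yards-per-carry values with hypothetical variable names rather than the ESPN archive data:

```python
# Sketch: matched-pair t test comparing each running back's yards per carry
# in the index season vs. the subsequent season. Synthetic values only.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
ypc_index_season = rng.normal(4.2, 0.6, 60)                       # season of interest
ypc_next_season = ypc_index_season + rng.normal(-0.1, 0.5, 60)    # subsequent season

t_stat, p_value = ttest_rel(ypc_index_season, ypc_next_season)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```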


2017 ◽  
Vol 45 (9) ◽  
pp. 2085-2091 ◽  
Author(s):  
Kristian Samuelsson ◽  
Robert A. Magnussen ◽  
Eduard Alentorn-Geli ◽  
Ferid Krupic ◽  
Kurt P. Spindler ◽  
...  

Background: It is not clear whether Knee injury and Osteoarthritis Outcome Score (KOOS) results will be different 1 or 2 years after anterior cruciate ligament (ACL) reconstruction. Purpose: To investigate, within individual patients enrolled in the Swedish National Knee Ligament Register, whether there is equivalence between KOOS at 1 and 2 years after primary ACL reconstruction. Study Design: Cohort study; Level of evidence, 2. Methods: This cohort study was based on data from the Swedish National Knee Ligament Register during the period January 1, 2005, through December 31, 2013. The longitudinal KOOS values for each individual at the 1- and 2-year follow-up evaluations were assessed through the two one-sided test (TOST) procedure with an acceptance criterion of 4. Subset analysis was performed with patients classified by sex, age, graft type, and type of injury (meniscal and/or cartilage injury). Results: A total of 23,952 patients were eligible for analysis after exclusion criteria were applied (10,116 women, 42.2%; 13,836 men, 57.8%). The largest age group was between 16 and 20 years of age (n = 6599; 27.6%). The most common ACL graft was hamstring tendon (n = 22,504; 94.0%), of which the combination of semitendinosus and gracilis was the most common. A total of 7119 patients reported on the KOOS Pain domain at both 1- and 2-year follow-ups, with a mean difference of 0.21 (13.1 SD, 0.16 SE [90% CI, −0.05 to 0.46], P < .001). The same results were found for the other KOOS subscales: symptoms (mean difference −0.54, 14.1 SD, 0.17 SE [90% CI, −0.81 to −0.26], P < .001), activities of daily living (mean difference 0.45, 10.8 SD, 0.13 SE [90% CI, 0.24 to 0.66], P < .001), sports and recreation (mean difference −0.35, 22.7 SD, 0.27 SE [90% CI, −0.79 to 0.09], P < .001), quality of life (mean difference −0.92, 20.0 SD, 0.24 SE [90% CI, −1.31 to −0.53], P < .001), and the combined KOOS-4 score (mean difference −0.41, 14.5 SD, 0.17 SE [90% CI, −0.70 to −0.13], P < .001). Analyses within specific subsets of patients showed equivalent results between the 2 follow-up evaluations. Conclusion: Equivalent results within patients were found in KOOS values at 1- and 2-year follow-ups after ACL reconstruction. The finding was consistent across all KOOS subscales and for all evaluated subsets of patients. This result implies that there is no additional value in capturing both 1- and 2-year KOOS outcomes after ACL reconstruction. However, these findings of equivalence at the 1- and 2-year endpoints do not obviate the need for longer follow-up periods.
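
A minimal sketch of the two one-sided test (TOST) procedure for paired KOOS scores with the ±4-point acceptance criterion described above; the scores are synthetic, and the 90% CI convention corresponds to a 5% significance level for each one-sided test:

```python
# Sketch: TOST equivalence test on paired 1-year vs. 2-year KOOS scores,
# with an equivalence margin of ±4 points. Synthetic scores only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
koos_1yr = rng.normal(78, 14, 500)
koos_2yr = koos_1yr + rng.normal(0.2, 13, 500)
diff = koos_2yr - koos_1yr
margin = 4.0

n = diff.size
se = diff.std(ddof=1) / np.sqrt(n)
# H0a: mean diff <= -margin  vs  H1a: mean diff > -margin
t_lower = (diff.mean() + margin) / se
p_lower = stats.t.sf(t_lower, df=n - 1)
# H0b: mean diff >= +margin  vs  H1b: mean diff < +margin
t_upper = (diff.mean() - margin) / se
p_upper = stats.t.cdf(t_upper, df=n - 1)

p_tost = max(p_lower, p_upper)   # equivalence declared if p_tost < alpha
print(f"mean diff = {diff.mean():.2f}, TOST p = {p_tost:.4f}")
```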


2017 ◽  
Vol 45 (6) ◽  
pp. 1333-1340 ◽  
Author(s):  
Gregory B. Maletis ◽  
Jason Chen ◽  
Maria C.S. Inacio ◽  
Rebecca M. Love ◽  
Tadashi T. Funahashi

Background: The use of allograft tissue for anterior cruciate ligament reconstruction (ACLR) remains controversial. Purpose: To compare the risk of aseptic revision between bone–patellar tendon–bone (BPTB) autografts and BPTB allografts. Study Design: Cohort study; Level of evidence, 2. Methods: A retrospective cohort study of prospectively collected data was conducted using the Kaiser Permanente ACLR Registry. A cohort of patients who underwent primary unilateral ACLR with BPTB autografts and BPTB allografts was identified. Aseptic revision was the endpoint. The exposures of interest were the type of graft and the allograft processing method (nonprocessed, <1.8-Mrad, and ≥1.8-Mrad irradiation). Age (≤21 and ≥22 years) was evaluated as an effect modifier. Analyses were adjusted for age, sex, and race. Kaplan-Meier curves and Cox proportional hazards models were employed. Hazard ratios (HRs) and 95% CIs are provided. Results: The BPTB cohort consisted of 5586 patients: 3783 (67.7%) were male, 2359 (42.2%) were white, 1029 (18.4%) had allografts (nonprocessed: 155; <1.8 Mrad: 525; ≥1.8 Mrad: 288), and 4557 (81.6%) had autografts. The median age was 34.9 years (interquartile range [IQR], 25.4-44.0) for allograft cases and 22.0 years (IQR, 17.6-30.0) for autograft cases. The estimated cumulative revision rate at 2 years was 4.1% (95% CI, 2.9%-5.9%) for allografts and 1.7% (95% CI, 1.3%-2.2%) for autografts. BPTB allografts had a significantly higher adjusted risk of revision than BPTB autografts (HR, 4.54; 95% CI, 3.03-6.79; P < .001). This higher risk of revision was consistent across all allograft processing methods when compared with autografts and was observed in allograft patients regardless of age. Conclusion: When BPTB allograft tissue was used for ACLR, an overall 4.54-fold higher adjusted risk of revision was observed compared with surgery performed with a BPTB autograft. Whether the tissue was irradiated with high- or low-dose radiation, chemically processed, or not processed at all made little difference in the risk of revision. The differences in revision risk were also consistent in younger and older patients. Surgeons and patients should be aware of the increased risk of revision when a BPTB allograft is used for ACLR.
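
A minimal sketch of estimating a cumulative revision rate at 2 years from censored follow-up data with a Kaplan-Meier estimator (lifelines); the follow-up times are synthetic and the variable names hypothetical, not registry data:

```python
# Sketch: Kaplan-Meier estimate of the cumulative revision rate at 2 years
# after ACLR, as reported separately for allograft and autograft cohorts.
# Synthetic follow-up times; one group shown for brevity.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 2000
followup_years = rng.uniform(0.2, 8.0, n)          # time to revision or censoring
revised = (rng.random(n) < 0.03).astype(int)        # 1 = aseptic revision observed

kmf = KaplanMeierFitter()
kmf.fit(durations=followup_years, event_observed=revised)
surv_2yr = kmf.survival_function_at_times(2.0).iloc[0]
print(f"cumulative revision at 2 years ~ {1 - surv_2yr:.2%}")
```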


2021 ◽  
Vol 9 (2) ◽  
pp. 232596712097637
Author(s):  
Ning Tang ◽  
Wenchao Zhang ◽  
Daniel M. George ◽  
Yang Su ◽  
Tianlong Huang

Background: Anterior cruciate ligament (ACL) reconstruction (ACLR) has become widely accepted, has gained increasing attention in recent years, and has generated a large body of research. Purpose: To determine which original articles on ACLR have been most influential in this field by identifying and analyzing the characteristics of the 100 most cited articles. Study Design: Cross-sectional study. Methods: Articles on ACLR were identified via the Thomson ISI Web of Science database on November 30, 2019. The 100 most cited articles were identified based on inclusion and exclusion criteria. The data extracted from each article for the subsequent analysis included title, date of publication, total citations, average citations per year (ACY), journal name, first author, institutions, themes, level of evidence, and keywords. Results: The total number of citations was 29,629. The dates of publication ranged from 1975 to 2015. A majority of the articles originated from the United States (58%) and were published in the 1990s (32%) and 2000s (48%). The mean ACY was 18.43 ± 9.51. Nearly one-half of the selected articles were published in the American Journal of Sports Medicine (42%). The most prolific co-author and first author were Freddie H. Fu (n = 13) and K. Donald Shelbourne (n = 5), respectively. The most productive institution was the University of Pittsburgh (14%). Material comparison (19%) and technique comparison (16%) were the 2 most popular themes. More than one-third of the articles were level 4 evidence (37%). Moreover, the keywords ACL, ACL reconstruction, ACL rupture, knee joint, knee injuries, and human showed the highest degree of centrality. Conclusion: By analyzing the characteristics of these articles, this study demonstrated that ACLR is a growing and popular area of research, with the research focus shifting over time. Studies on anatomic reconstruction and biomechanics may represent future directions.
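
Average citations per year (ACY) is simply total citations divided by the article's age at the time of the search. A short sketch with hypothetical records, assuming age is counted as the census year minus the publication year:

```python
# Sketch: ranking articles by average citations per year (ACY).
# Hypothetical records; the denominator convention is an assumption.
import pandas as pd

articles = pd.DataFrame({
    "title": ["Article A", "Article B", "Article C"],
    "year": [1991, 2002, 2010],
    "total_citations": [1200, 640, 450],
})
census_year = 2019  # year the Web of Science search was run
articles["acy"] = articles["total_citations"] / (census_year - articles["year"])
print(articles.sort_values("acy", ascending=False))
```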


2019 ◽  
Vol 47 (4) ◽  
pp. 807-814 ◽  
Author(s):  
Louise M. Thoma ◽  
Hege Grindem ◽  
David Logerstedt ◽  
Michael Axe ◽  
Lars Engebretsen ◽  
...  

Background: Some athletes demonstrate excellent dynamic stability after anterior cruciate ligament (ACL) rupture and return to sport without ACL reconstruction (ACLR) (copers). Others demonstrate persistent instability despite rehabilitation (noncopers) and require surgical stabilization. Testing to determine coper classification can identify potential copers early after rupture. It is unclear how coper classification changes after a brief intervention and how early classification relates to long-term outcomes. Purpose: (1) To evaluate the consistency of early coper classification (potential coper vs noncoper) before and after progressive neuromuscular and strength training (NMST) among athletes early after acute ACL rupture and (2) to evaluate the association of early coper classification with 2-year success after ACL rupture. Study Design: Cohort study; Level of evidence, 2. Methods: This was a prospective analysis from the Delaware-Oslo ACL Cohort Study, composed of athletes consecutively enrolled early after ACL rupture. Participants (n = 271) were tested and classified as potential copers or noncopers according to established criteria before and after a 10-session NMST program. Success 2 years after ACLR or nonoperative rehabilitation was defined as meeting or exceeding sex- and age-matched norms for knee function, no ACL graft rupture, and ≤1 episode of giving way within the previous year. The McNemar test evaluated changes in coper classification pre- to posttraining. Logistic regression adjusted for baseline characteristics was used to evaluate the association of early coper classification and surgical status with 2-year success. Results: Of 300 athletes enrolled, 271 (90%) completed the posttraining data collection, and 219 (73%) returned for the 2-year follow-up. The coper classifications were different between time points: nearly half of those classified initially as noncopers became potential copers (P < .001). At the 2-year follow-up, 66% of the ACLR group and 74% of the nonoperative group were successful. Athletes who were potential copers posttraining and chose ACLR or nonoperative rehabilitation had 2.7 (95% CI, 1.3-5.6) and 2.9 (95% CI, 1.2-7.2) times the odds of success, respectively, as compared with noncopers who chose ACLR. Conclusion: Coper classification improved after NMST; more athletes became potential copers. Athletes who were potential copers after NMST were more likely to succeed 2 years later regardless of whether they had surgery, strongly supporting the addition of NMST before ACLR. Persistent noncopers fared poorly, indicating that more intensive rehabilitation may be needed.
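
A minimal sketch of the McNemar test on a paired pre/post classification table; the cell counts are hypothetical and are not the Delaware-Oslo data:

```python
# Sketch: McNemar test for a change in coper classification from pre- to
# post-training, using a 2x2 table of paired classifications.
from statsmodels.stats.contingency_tables import mcnemar

#                        post: noncoper  post: potential coper
table = [[80,  70],    # pre: noncoper
         [10, 111]]    # pre: potential coper
result = mcnemar(table, exact=False, correction=True)
print(f"statistic = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```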


2007 ◽  
Vol 36 (2) ◽  
pp. 276-284 ◽  
Author(s):  
Franck Le Gall ◽  
Christopher Carling ◽  
Thomas Reilly

Background: Epidemiologic data on injuries in young female soccer players at the elite level are scarce. Purpose: To investigate the incidence of soccer-related injuries in young elite female French players. Study Design: Cohort study; Level of evidence, 2. Methods: Injuries sustained by players 15 to 19 years of age over 8 seasons were diagnosed and documented by a sports physician according to type, location, severity, date of occurrence, and playing position. Results: Altogether, 619 injuries were documented in 110 players (92.4%). Of these injuries, 64.6% (4.6/1000 training hours; 95% confidence interval [CI], 4.2–5.0) were sustained during training and 35.4% (22.4/1000 match hours; 95% CI, 19.4–25.4) during matches. The risk of injury was greater in the youngest (under-15) group than in the oldest (under-19) group (relative risk, 1.7; 95% CI, 1.3–2.3). Traumatic injuries accounted for 536 cases (86.4%), and 83 (13.4%) were overuse injuries. Minor injuries made up 51.9% of the total, moderate injuries 35.7%, and major injuries 12.4%. Most injuries were located in the lower extremities (83.4%), with the ankle most often affected (n = 157). The most commonly diagnosed injury was ankle sprain (16.8%). Twelve anterior cruciate ligament ruptures were sustained, the majority during matches (n = 10; 1.0/1000 match hours; 95% CI, 0.4–1.6). Reinjuries accounted for 4.4% of total injuries, and September was the predominant month for injury (14.2%). Conclusions: Compared with other investigations of female soccer players, the results revealed high rates of both traumatic injury and match injury, whereas injury recurrence was low. Injuries to the ankle, notably sprains, were common, suggesting a need for the implementation of specific injury prevention strategies for this joint.
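
Injury incidence here is expressed per 1000 exposure hours with a 95% CI. A short sketch of that calculation using a Poisson normal approximation on hypothetical counts; the paper's exact CI method is not stated in the abstract:

```python
# Sketch: injury incidence per 1000 exposure hours with a 95% CI based on a
# Poisson standard error for the injury count. Hypothetical counts and hours.
import math

def incidence_per_1000h(injuries: int, exposure_hours: float):
    rate = injuries / exposure_hours * 1000
    se = math.sqrt(injuries) / exposure_hours * 1000   # Poisson SE on the count
    return rate, rate - 1.96 * se, rate + 1.96 * se

rate, lo, hi = incidence_per_1000h(injuries=400, exposure_hours=87_000)
print(f"{rate:.1f} per 1000 h (95% CI, {lo:.1f}-{hi:.1f})")
```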


Open Medicine ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. 833-842
Author(s):  
Mikołaj Wypych ◽  
Robert Lundqvist ◽  
Dariusz Witoński ◽  
Rafał Kęska ◽  
Anna Szmigielska ◽  
...  

Objective: This retrospective investigation was carried out to assess whether subjects who fulfilled our proposed recruitment criteria responded more favorably to anterior cruciate ligament reconstruction (ACLR) than those who did not. Methods: We retrospectively analyzed 109 skeletally mature subjects (78 men and 31 women) according to the following proposed recruitment criteria: (1) pre-injury Tegner activity score ≥7 and a wish to return to professional sports activity, (2) residual knee instability following injury, and/or (3) age <20 years at the time of the operation. The primary outcome was improvement between assessment A (before the operation) and assessment B (mean follow-up, 1.6 years) in the average score for four of the five Knee injury and Osteoarthritis Outcome Score (KOOS) subscales, covering pain, symptoms, difficulty in sports and recreational activities, and quality of life (KOOS4). Results: The proposed recruitment criteria for ACLR were met by 58 subjects (53%). Forty-nine subjects (45%) improved between assessments A and B. Subjects who met the proposed recruitment criteria were more likely to improve clinically after ACLR (OR, 5.7; 95% CI, 2.5–13.3). Conclusions: Fulfillment of the proposed recruitment criteria was a strong predictor of outcome improvement at short- to medium-term follow-up after ACLR. Level of Evidence: Case-control study; level 3.
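
A minimal sketch of computing an odds ratio with a Wald 95% CI from a 2x2 table of criteria fulfillment versus improvement. The cell counts below are illustrative values chosen only to be consistent with the marginal totals in the abstract (58 met the criteria, 49 improved, 109 total); the true cell split is not reported there.

```python
# Sketch: odds ratio and Wald 95% CI for clinical improvement after ACLR by
# whether the proposed recruitment criteria were met. Illustrative counts.
import math

a, b = 37, 21   # met criteria:     improved, not improved
c, d = 12, 39   # did not meet:     improved, not improved

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.1f} (95% CI, {lo:.1f}-{hi:.1f})")
```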

