Lumbopelvic Control and the Development of Upper Extremity Injury in Professional Baseball Pitchers

2021, pp. 036354652098812
Author(s): Kevin Laudner, Regan Wong, Daniel Evans, Keith Meister

Background: The baseball-throwing motion requires a sequential order of motions and forces initiating in the lower limbs and transferring through the trunk and ultimately to the upper extremity. Any disruption in this sequence can increase the forces placed on subsequent segments. No research has examined whether baseball pitchers with less lumbopelvic control are more likely to develop upper extremity injury than pitchers with more control. Purpose: To determine whether baseball pitchers who sustain a chronic upper extremity injury have less lumbopelvic control before their injury compared with a group of pitchers who do not sustain an injury. Study Design: Cohort study; Level of evidence, 2. Methods: A total of 49 asymptomatic, professional baseball pitchers from a single Major League Baseball organization participated. Lumbopelvic control was measured using an iPod-based digital level secured to a Velcro belt around each player's waist to measure anteroposterior (AP) and mediolateral (ML) deviations (degrees) during single-leg balance with movement and static bridge maneuvers. During a competitive season, 22 of these pitchers developed upper extremity injuries, while the remaining 27 sustained no injuries. Separate 2-tailed t tests were run to determine whether there were significant differences in lumbopelvic control between groups (P < .05). Results: There were no significant between-group differences for the stride leg (nondominant) during the bridge test in either the AP (P = .79) or the ML (P = .42) direction, or in either direction during the drive leg bridge test (P > .68). However, the injured group had significantly less lumbopelvic control than the noninjured group during stride leg balance in both the AP (P = .03) and the ML (P = .001) directions and during drive leg balance in both the AP (P = .01) and the ML (P = .04) directions. Conclusion: These results demonstrate that baseball pitchers with diminished lumbopelvic control, particularly during stride leg and drive leg single-leg balance with movement, sustained more upper extremity injuries than those with more control. Clinicians should consider evaluating lumbopelvic control in injury prevention protocols and should prescribe appropriate exercises for restoring lumbopelvic control before returning athletes to competition after injury. Specific attention should be given to testing and exercises that mimic a single-limb balance task.
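The between-group comparison described here reduces to independent two-tailed t tests on the deviation angles. A minimal sketch in Python; the group means, spreads, and seed below are illustrative assumptions, not the study's data:

```python
# Independent two-tailed t test on lumbopelvic deviation (degrees),
# mirroring the injured (n = 22) vs noninjured (n = 27) comparison.
# All values are synthetic stand-ins for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
injured = rng.normal(6.5, 1.5, size=22)      # AP deviation, injured group (assumed)
noninjured = rng.normal(5.2, 1.4, size=27)   # AP deviation, noninjured group (assumed)

t_stat, p_val = stats.ttest_ind(injured, noninjured)  # two-tailed by default
print(f"t = {t_stat:.2f}, p = {p_val:.3f}")           # compare against alpha = .05
```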

2018, Vol. 53(5), pp. 510-513
Author(s): Elizabeth E. Hibberd, Sakiko Oyama, Joseph B. Myers

Context: Many high school pitchers play another position after they have finished pitching for the day or on their rest days from pitching. Because of the cumulative demands on the arm, pitchers who also play catcher may have a greater risk of developing a throwing-related shoulder or elbow injury. Objective: To compare the rate of throwing-related upper extremity injuries between high school baseball pitchers who also played catcher as a secondary position and those who did not play catcher. Design: Prospective cohort study. Setting: Field laboratory. Patients or Other Participants: A total of 384 male high school baseball pitchers were recruited from 51 high school teams. Pitchers who reported their secondary position as catcher were classified into the pitcher/catcher group, and those who did not report playing catcher as a secondary position were classified into the other group. Main Outcome Measure(s): Participants completed a demographic questionnaire during the preseason, and athlete participation and injury status were then tracked during the subsequent season. Athlete-exposures were monitored, and shoulder and elbow injury proportion rates were calculated. Results: Athlete-exposures did not differ between groups (P = .488). The pitcher/catcher group's risk of shoulder or elbow injury was 2.9 times greater than that of the other pitchers (15% versus 5%; injury proportion ratio = 2.9; 95% confidence interval = 1.03, 8.12). Conclusions: Pitchers who reported also playing catcher were at greater risk of sustaining a throwing-related shoulder or elbow injury than the other pitchers. These findings suggest that pitchers should consider not playing catcher as their secondary position, to allow adequate time for recovery and to decrease their overall throwing load. Serial physical examinations of pitchers/catchers during the season may be useful in determining whether their physical characteristics change over the season because of the cumulative throwing load.
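The 2.9 injury proportion ratio and its 95% CI follow the standard risk-ratio calculation on the two injury proportions. A hedged sketch; the cell counts below are assumptions chosen only to approximate the reported 15% vs 5% rates, since the abstract does not publish the underlying counts:

```python
# Risk (injury proportion) ratio with a Wald-type 95% CI on the log scale.
# Counts are hypothetical; only the 15% vs 5% proportions come from the abstract.
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Ratio of injury proportions a/n1 vs b/n2 with a 95% CI."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # SE of log(rr)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

rr, lo, hi = risk_ratio_ci(a=6, n1=40, b=17, n2=344)  # assumed group sizes
print(f"ratio = {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```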


2021, pp. 194173812098655
Author(s): Jason Croci, Jim Nicknair, John Goetschius

Background: Evidence suggests that shoulder and elbow injuries account for 31% to 37% of all National Collegiate Athletic Association (NCAA) baseball injuries, and up to 69% of NCAA baseball injuries result from noncontact and overuse mechanisms. Early sport specialization may contribute to the high rates of upper extremity injuries in college baseball players. Hypothesis: Higher specialization by age 13 years would be associated with worse subjective throwing arm function and a greater history of shoulder and elbow injury. Study Design: Cohort study. Level of Evidence: Level 2. Methods: Survey data were collected from college baseball players (N = 129) during the midseason of the spring 2019 baseball season. Participants were stratified into low, moderate, and high specialization groups based on a 3-criteria sports specialization questionnaire. Participants' throwing arm function was measured using the Functional Arm Scale for Throwers and the Kerlan-Jobe Orthopaedic Clinic shoulder and elbow questionnaires. Participants' history of a shoulder or elbow injury that resulted in missing ≥2 weeks of baseball activity at any point in their baseball career was also collected. Results: The high specialization group reported worse subjective throwing arm function on the Functional Arm Scale for Throwers questionnaire than the low (P = 0.03) and moderate (P = 0.01) specialization groups. The high specialization group was over 5 times more likely to report a history of shoulder injury than the moderate (odds ratio [OR] = 5.42; 95% CI [1.71, 17.2]; P = 0.004) and low (OR = 5.20; 95% CI [1.87, 14.5]; P = 0.002) specialization groups, and over 3 times more likely to report a history of elbow injury than the moderate specialization group (OR = 3.77; 95% CI [1.05, 13.6]; P = 0.04). Conclusion: College baseball players who were highly specialized by age 13 years reported worse subjective throwing arm function and were more likely to have a history of upper extremity injury than players with moderate or low specialization. Clinical Relevance: Early specialization in baseball may be detrimental to the long-term upper extremity health of college baseball players.
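Odds ratios of this kind come from 2 × 2 contingency tables. A minimal sketch using the Woolf (log) confidence interval; the cell counts are hypothetical placeholders, not the study's data:

```python
# Odds ratio for a 2x2 table [[a, b], [c, d]] with a Woolf (log) 95% CI.
# Rows: high vs low specialization; columns: shoulder-injury history yes / no.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, or_ * math.exp(-z * se), or_ * math.exp(z * se)

# Hypothetical counts; the study reports OR = 5.20 (95% CI 1.87-14.5) for this contrast.
or_, lo, hi = odds_ratio_ci(a=13, b=17, c=8, d=54)
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```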


2016, Vol. 45(2), pp. 317-324
Author(s): Christopher S. Ahmad, Ajay S. Padaki, Manish S. Noticewala, Eric C. Makhni, Charles A. Popkin

Background: Epidemic levels of shoulder and elbow injuries have been reported recently in youth and adolescent baseball players. Despite the concerning frequency of these injuries, no instrument has been validated to assess upper extremity injury in this patient population. Purpose/Hypothesis: The purpose of this study was to validate an upper extremity assessment tool specifically designed for young baseball players. We hypothesized that this tool would be both reliable and valid. Study Design: Cohort study (diagnosis); Level of evidence, 2. Methods: The Youth Throwing Score (YTS) was constructed by an interdisciplinary team of providers and coaches as a tool to assess upper extremity injury in youth and adolescent baseball players (age range, 10-18 years). The psychometric properties of the test were then determined. Results: A total of 223 players completed the final survey. The players' mean age was 14.3 ± 2.7 years. Pilot analysis showed that none of the 14 questions received a mean athlete importance rating of less than 3 of 5, and the final survey read at a Flesch-Kincaid level of 4.1, which is appropriate for patients aged 9 years and older. The players self-assigned their injury status, resulting in a mean instrument score of 59.7 ± 8.4 for the 148 players "playing without pain," 42.0 ± 11.5 for the 60 players "playing with pain," and 40.4 ± 10.5 for the 15 players "not playing due to pain." Players playing without pain scored significantly higher than those playing with pain and those not playing due to pain (P < .001). Psychometric analysis showed a test-retest intraclass correlation coefficient of 0.90 and a Cronbach alpha inter-item reliability coefficient of 0.93, indicating excellent reliability and internal consistency. Pearson correlation coefficients of 0.65, 0.62, and 0.31 were calculated between the YTS and the Pediatric Outcomes Data Collection Instrument sports/physical functioning module, the Kerlan-Jobe Orthopaedic Clinic Shoulder and Elbow score, and the Quick Disabilities of the Arm, Shoulder, and Hand (QuickDASH) score, respectively. Injured players scored a mean of 9.4 points higher after treatment (P < .001), and players who improved in their self-assigned pain categorization scored 16.5 points higher (P < .001). Conclusion: The YTS is the first valid and reliable instrument for assessing young baseball players' upper extremity health.
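The Cronbach alpha reported here can be computed directly from an items-by-respondents score matrix. A short sketch; the simulated 223 × 14 response matrix is a synthetic stand-in (deliberately correlated so that alpha comes out high), not YTS data:

```python
# Cronbach alpha: internal consistency of a multi-item instrument.
import numpy as np

def cronbach_alpha(scores):
    """scores: 2D array, rows = respondents, cols = items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of the total score
    return (k / (k - 1)) * (1 - item_var / total_var)

rng = np.random.default_rng(1)
base = rng.normal(0, 1, size=(223, 1))            # shared "true score" per player
demo = base + rng.normal(0, 0.5, size=(223, 14))  # 14 correlated items (synthetic)
print(f"alpha = {cronbach_alpha(demo):.2f}")
```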


2021, Vol. 9(1), pp. 232596712097709
Author(s): Kaysha Heck, Giorgio Zeppieri, Michelle Bruner, Michael Moser, Kevin W. Farmer, ...

Background: Gymnastics is a demanding sport that places unique forces on the upper extremity. The repetitive nature of the sport and the high-impact forces involved may predispose the gymnast to overuse injuries. Risk factors for injuries in gymnastics are not well understood. Purpose/Hypothesis: The purpose of this study was to ascertain whether preseason upper extremity range of motion (ROM) and strength differ between National Collegiate Athletic Association (NCAA) Division I collegiate gymnasts who sustain an in-season upper extremity injury and those who do not. We hypothesized that gymnasts who sustained an upper extremity injury would demonstrate reduced ROM and strength compared with noninjured gymnasts. Study Design: Cohort study; Level of evidence, 3. Methods: Over 4 seasons, from 2014 to 2018, a total of 15 female NCAA Division I collegiate gymnasts underwent preseason upper extremity ROM (shoulder: flexion, internal and external rotation; elbow: extension; wrist: extension) and strength (shoulder: internal and external rotation, and middle and lower trapezius) testing. Overuse upper extremity injuries were tracked in each subsequent season. Gymnasts were dichotomized into injured and noninjured groups, and a 2 × 2 analysis of variance was used to test for differences in preseason measures between groups as well as within arms (injured vs noninjured arm for the injured group; dominant vs nondominant arm for the noninjured group). Results: A total of 12 overuse upper extremity injuries (10 shoulders; 2 wrist/forearm) occurred during 31 gymnast-seasons. There were no significant interactions for preseason ROM and strength measurements between groups (injured vs noninjured) or within arms (injured and noninjured arm for the injured group; dominant and nondominant arm for the noninjured group; P = .07). Conclusion: Preseason upper extremity ROM and strength did not differ between gymnasts who sustained an in-season upper extremity overuse injury and those who did not. It is possible that the ROM and strength measures used to screen other overhead athletes do not capture the unique features and requirements of gymnastics. Further, it may be challenging to discern differences in clinical measures of ROM and strength in gymnastics populations owing to the bilateral nature of the sport.
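The 2 × 2 analysis (group × arm) can be sketched with an ordinary two-way ANOVA; note that the study's arm factor is a repeated measure, which this simple between-subjects sketch does not model. All values, group sizes, and column names below are assumptions for illustration:

```python
# Two-way ANOVA on synthetic shoulder-flexion ROM: group (injured vs
# noninjured) x arm. The interaction row corresponds to the group-by-arm
# test described in the abstract. Data are made up for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "rom": rng.normal(160, 8, size=60),               # degrees (synthetic)
    "group": np.repeat(["injured", "noninjured"], 30),
    "arm": np.tile(np.repeat(["arm1", "arm2"], 15), 2),
})
model = smf.ols("rom ~ group * arm", data=df).fit()
print(anova_lm(model, typ=2))                         # inspect the group:arm row
```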


2011, Vol. 3(4), pp. 383-389
Author(s): Joseph B. Myers, Sakiko Oyama, Terri Jo Rucinski, R. Alexander Creighton

Hand, 2012, Vol. 7(2), pp. 127-133
Author(s): Jo M. Weis, Brad K. Grunert, Heidi Fowell Christianson

Background: The consequences of work-related injuries are far reaching, in part because posttraumatic stress disorder (PTSD) often goes unrecognized and untreated. Imaginal exposure is a frequently used cognitive behavioral approach for the treatment of PTSD. This study examined the impact of early versus delayed treatment with imaginal exposure on the amelioration of PTSD symptomatology in individuals who suffered upper extremity injuries. Methods: Sixty individuals who had sustained severe work-related upper extremity injuries received standard, nonrandomly assigned psychological treatment for PTSD (e.g., prolonged imaginal exposure) either early (30-60 days) or delayed (greater than 120 days) after injury. Nine measures of various components of PTSD symptomatology were administered at the onset of treatment, at the end of treatment, and at 6-month follow-up evaluations. Results: Patients in both the early and delayed groups showed significant treatment gains at all three measurement intervals, demonstrating that prolonged imaginal exposure is an appropriate treatment for persons diagnosed with PTSD. In addition, there was no difference in return-to-work status between the early and delayed treatment groups. However, the early treatment group required significantly fewer treatment sessions than the delayed treatment group. Conclusions: The results support the utility of imaginal exposure and the need for early assessment and referral for those diagnosed with PTSD following upper extremity injuries.


2020 ◽  
pp. 1-6
Author(s):  
Olivia Bartlett ◽  
James L. Farnsworth

Clinical Scenario: Kinesiophobia, a fear of movement, is a common psychological phenomenon following injury. Such psychological factors contribute to the variability in patients' perceived disability scores following injury. In addition, the psychophysiological, behavioral, and cognitive components of kinesiophobia have been shown to be predictive of a patient's self-reported disability and pain. Previous kinesiophobia research has mostly focused on lower-extremity injuries. Fewer studies have investigated upper-extremity injuries, despite the influence that upper-extremity injuries can have on an individual's activities of daily living and, therefore, on disability scores. This lack of research calls for a critical evaluation and appraisal of the available evidence regarding kinesiophobia and its contribution to perceived disability of the upper extremity. Focused Clinical Question: How does kinesiophobia in patients with upper-extremity injuries influence perceptions of disability and quality-of-life measurements? Summary of Key Findings: Two cross-sectional studies and one cohort study were included. The first study found a positive relationship between kinesiophobia and a high degree of perceived disability. Another study found that kinesiophobia and catastrophic thinking scores were the most important predictors of perceived upper-extremity disability. The third study found that kinesiophobia contributes to self-reported disability in the shoulder. Clinical Bottom Line: There is moderate evidence supporting the relationship between kinesiophobia and perceived disability, and between elevated perceptions of disability and increased kinesiophobia scores, in patients with an upper-extremity injury. Clinicians should evaluate and monitor kinesiophobia in patients following injury, because it can heighten perceptions of disability. An elevated perception of disability can create a cycle of fear that leads to hypervigilance and fear-avoidance behavior. Strength of Recommendation: Consistent findings from the reviewed studies provide grade B evidence that kinesiophobia is related to increased perceived disability following upper-extremity injuries.

