Anthropometric Estimations of Percent Body Fat in NCAA Division I Female Athletes: A 4-Compartment Model Validation

2009 ◽  
Vol 23 (4) ◽  
pp. 1068-1076 ◽  
Author(s):  
Jordan R. Moon ◽  
Sarah E. Tobkin ◽  
Abbie E. Smith ◽  
Chris M. Lockwood ◽  
Ashley A. Walter ◽  
...  

2008 ◽  
Vol 105 (1) ◽  
pp. 119-130 ◽  
Author(s):  
Jordan R. Moon ◽  
Joan M. Eckerson ◽  
Sarah E. Tobkin ◽  
Abbie E. Smith ◽  
Christopher M. Lockwood ◽  
...  

2021 ◽  
Vol 9 (6) ◽  
pp. 232596712110152
Author(s):  
Rafael Sanchez ◽  
Blake H. Hodgens ◽  
Joseph S. Geller ◽  
Samuel Huntley ◽  
Jonathan Kaplan ◽  
...  

Background: Achilles tendon (AT) ruptures are devastating injuries that are highly prevalent among athletes. Despite our understanding of the effect of AT rupture, and in particular its relationship to basketball, no study has examined the effects of AT rupture and repair on performance metrics in collegiate basketball players. Purpose: To evaluate the effect of AT rupture and subsequent surgical repair on performance metrics in National Collegiate Athletic Association (NCAA) Division I basketball players who return to play after injury. Study Design: Descriptive epidemiology study. Methods: NCAA Division I basketball players who sustained an AT rupture and underwent subsequent surgical repair between 2000 and 2019 were identified by systematically evaluating individual injury reports from databases comprising NCAA career statistics and individual school statistics; 65 male and 41 female players were identified. Athletes were included if they participated in at least one-half of the games of 1 collegiate season before tearing the AT and at least 1 season after operative repair. A total of 50 male and 30 female athletes were included. Each injured athlete was matched to a healthy control by conference, position, starter status at time of injury, class year, and number of games played. Matched controls were healthy players who experienced no significant injuries during their NCAA careers. Results: After AT repair, male athletes recorded significantly more minutes per game and points per game compared with before injury. Total blocks significantly decreased after injury. Female athletes scored significantly more points per game but demonstrated a significantly lower 3-point shooting percentage after return to play. Despite undergoing AT rupture and repair, 14% of male players played in the National Basketball Association, and 20% of injured female athletes played in the Women’s National Basketball Association. Conclusion: After returning to play, men demonstrated a significant drop-off in performance only in regard to total blocks. Female athletes after AT repair demonstrated a significant improvement in points per game but had a significant drop-off in 3-point shooting percentage.
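As a rough illustration of the matching procedure described in the Methods, the sketch below pairs each injured athlete with a healthy control who shares conference, position, starter status, and class year and has the closest games-played total. Field names and data are hypothetical, not the authors' actual pipeline.

    # Hypothetical sketch of matched-control selection; not the study's code.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Player:
        name: str
        conference: str
        position: str
        starter: bool
        class_year: str
        games_played: int

    def find_match(injured: Player, healthy_pool: List[Player]) -> Optional[Player]:
        # Candidates must share conference, position, starter status, and class year.
        candidates = [
            p for p in healthy_pool
            if (p.conference, p.position, p.starter, p.class_year)
            == (injured.conference, injured.position, injured.starter, injured.class_year)
        ]
        if not candidates:
            return None
        # Prefer the control whose games-played total is closest to the injured athlete's.
        return min(candidates, key=lambda p: abs(p.games_played - injured.games_played))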


Author(s):  
A. R. Russell ◽  
M. R. Esco ◽  
S. N. Lizana ◽  
H. N. Williford ◽  
M. S. Olson ◽  
...  

2020 ◽  
Vol 42 (6) ◽  
pp. 490-499
Author(s):  
Stephanie L. Barrett ◽  
Trent A. Petrie

Although researchers have examined eating disorders in female athletes, few such studies have been done with athletes who are retired, and even fewer have been quantitative. Thus, the authors empirically tested an established eating disorder theoretical model with 218 former NCAA Division-I female collegiate athletes who had been retired for 2–6 years. In retirement, participants completed measures of general sociocultural pressures related to body and appearance, thin-ideal internalization, body dissatisfaction, dietary restraint, negative affect, and bulimic symptomatology. Through structural equation modeling, the authors examined the direct and indirect relationships among the latent variables while controlling for body mass index and years since retirement. The model fit the data well, supporting the hypothesized direct and indirect relationships among the variables and explaining 54% of the variance in bulimic symptomatology. Despite no longer being exposed to the sport pressures that contribute to eating disorders, female athletes may continue to experience such symptoms long into retirement.
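A structural model of this kind could be specified along the following lines. This is a minimal sketch using the third-party semopy package, with hypothetical column names and observed composite scores standing in for the latent variables; it is not the authors' actual measurement model or data.

    # Hedged sketch of the hypothesized pressure -> internalization ->
    # body dissatisfaction -> restraint / negative affect -> bulimia chain,
    # with BMI and years since retirement entered as covariates.
    import pandas as pd
    from semopy import Model

    DESC = """
    internalization ~ pressures
    body_dissat ~ internalization + pressures
    restraint ~ body_dissat
    neg_affect ~ body_dissat
    bulimia ~ restraint + neg_affect + bmi + years_retired
    """

    df = pd.read_csv("retired_athletes.csv")  # hypothetical data file
    model = Model(DESC)
    model.fit(df)
    print(model.inspect())  # path estimates; indirect effects are products of paths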


2018 ◽  
Vol 50 (5S) ◽  
pp. 146
Author(s):  
Takudzwa A. Madzima ◽  
Svetlana Nepocatych ◽  
Daniel A. Baur ◽  
Kirtida Patel ◽  
Walter R. Bixby

2003 ◽  
Vol 13 (3) ◽  
pp. 277-285 ◽  
Author(s):  
Brandy S. Cowell ◽  
Christine A. Rosenbloom ◽  
Robert Skinner ◽  
Stephanie H. Summers

Iron deficiency is the most prevalent nutritional deficiency in the United States and has been reported to affect up to 60% of female athletes. The Centers for Disease Control and Prevention emphasize screening for anemia in women of childbearing age. The purpose of this study was to determine the number of National Collegiate Athletic Association (NCAA) Division I-A schools that screen female athletes for iron deficiency, as well as the screening policies of those that do. A link to an online survey was sent to 94 NCAA Division I-A schools to determine current practices concerning screening and treating female athletes for iron deficiency. There was a 58% response rate. Frequencies for each response were computed. Forty-three percent of responding institutions reported screening female athletes for iron deficiency. This study suggests that screening for iron deficiency in female athletes at NCAA Division I-A schools is not a routine procedure and that, among those who do screen, variability exists in the criteria for diagnosis as well as in treatment protocols. Standard protocols for assessment and treatment of iron deficiency in female athletes need to be developed and implemented.
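For orientation, the counts implied by the reported percentages work out roughly as follows; the abstract does not state exact respondent numbers, so these are rounded estimates only.

    # Rounded counts implied by the percentages reported above (not exact figures).
    surveyed = 94
    responded = round(surveyed * 0.58)    # ~55 schools responded
    screening = round(responded * 0.43)   # ~24 schools report screening
    print(f"~{responded} responses, of which ~{screening} report screening")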


2017 ◽  
Vol 9 (5) ◽  
pp. 462-468 ◽  
Author(s):  
Janet E. Simon ◽  
Carrie L. Docherty

Background: Physical activity performed at moderate intensity is associated with reduced risk of mortality, cardiovascular disease, hypertension, and some types of cancers. However, vigorous physical activity during participation in college athletics may increase the risk of injury, which might limit future physical activity levels. Purpose: To evaluate differences in current physical fitness levels between former Division I athletes and noncollegiate athletes. Study Design: Cross-sectional study. Level of Evidence: Level 3. Methods: The sample was recruited from a large midwestern university alumni database and consisted of 2 cohorts: (1) former Division I athletes (n = 100; mean age, 53.1 ± 7.4 years) and (2) nonathletes who were active in college (n = 100; age, 51.4 ± 7.3 years). Individuals answered a demographics questionnaire and completed a physical fitness assessment consisting of 7 measures: percent body fat, 1-mile walk, sit-to-stand test, push-up, half sit-up test, sit and reach test, and back scratch test. Results: Performance was significantly worse for former Division I athletes compared with nonathletes for percent body fat (mean difference, 7.58%; F(1, 198) = 59.91; P < 0.01), mile time (mean difference, 2.42 minutes; F(1, 198) = 1.74; P = 0.03), sit-to-stand test (mean difference, 4.3 repetitions; F(1, 198) = 6.59; P = 0.01), and push-up test (mean difference, 8.9 repetitions; F(1, 198) = 7.35; P = 0.01). Conclusion: Former Division I athletes may be limited because of previous injury, inhibiting their ability to stay active later in life. Clinical Relevance: It is imperative that clinicians, coaches, and strength and conditioning specialists understand the possible future repercussions from competing at the Division I level.
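With two independent cohorts of 100, each group comparison reduces to a one-way test with (1, 198) degrees of freedom. The sketch below uses synthetic, illustrative data (not the study's measurements) to show how such an F statistic could be computed.

    # Illustrative only: synthetic data, not the study's measurements.
    # Two groups of 100 give a one-way ANOVA with (1, 198) degrees of freedom,
    # matching the form of the statistics reported above.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    former_athletes = rng.normal(loc=30.0, scale=6.0, size=100)  # e.g., percent body fat
    nonathletes = rng.normal(loc=22.4, scale=6.0, size=100)

    f_stat, p_value = stats.f_oneway(former_athletes, nonathletes)
    print(f"F(1, 198) = {f_stat:.2f}, p = {p_value:.4f}")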


2008 ◽  
Vol 7 (1) ◽  
pp. 7 ◽  
Author(s):  
Jordan R. Moon ◽  
Sarah E. Tobkin ◽  
Abbie E. Smith ◽  
Michael D. Roberts ◽  
Eric D. Ryan ◽  
...  
