Weekly Fluctuations in Salivary Hormone Responses and Their Relationships With Load and Well-Being in Semiprofessional, Male Basketball Players During a Congested In-Season Phase

Author(s):  
Paulius Kamarauskas
Inga Lukonaitienė
Aaron T. Scanlan
Davide Ferioli
Henrikas Paulauskas
...  

Purpose: To assess weekly fluctuations in hormonal responses and their relationships with load and well-being during a congested in-season phase in basketball players. Methods: Ten semiprofessional, male basketball players were monitored during 4 congested in-season weeks, each consisting of 3 matches. Salivary hormone variables (testosterone [T], cortisol [C], and T:C ratio) were measured weekly, and external load (PlayerLoad™ and PlayerLoad per minute), internal load (session rating of perceived exertion, percentage of maximum heart rate [HR], and summated HR zones), and well-being were assessed for each training session and match. Results: Significant (P < .05) moderate to large decreases in T were found in the third and fourth weeks compared with the first week. Nonsignificant moderate to large decreases in C were apparent in the last 2 weeks compared with previous weeks. Summated HR zones and perceived sleep significantly (P < .05) decreased in the fourth week compared with the first week, whereas percentage of maximum HR significantly (P < .05) decreased in the fourth week compared with the second week. No significant relationships were found between weekly changes in hormonal responses and weekly changes in load and overall wellness. Conclusions: A congested in-season schedule negatively impacted the hormonal responses of players, suggesting that T and C measurements may be useful for detecting fluctuations in hormone balance in such scenarios. The nonsignificant relationships between weekly changes in hormonal responses and changes in load and well-being indicate that other factors might induce hormonal changes across congested periods in basketball players.
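For readers unfamiliar with how this kind of weekly hormone monitoring is tabulated, the sketch below shows one way to compute the T:C ratio and the week-to-week percentage change against a baseline week. The values and units are hypothetical, not the study's data.

```python
# Hypothetical weekly group-mean salivary values; not the study's data.
testosterone = {1: 512.0, 2: 498.0, 3: 441.0, 4: 430.0}  # e.g., pmol/L
cortisol = {1: 9.8, 2: 9.5, 3: 8.1, 4: 7.9}               # e.g., nmol/L

def tc_ratio(t: float, c: float) -> float:
    """Testosterone:cortisol ratio for one week (unit-dependent)."""
    return t / c

def pct_change(current: float, baseline: float) -> float:
    """Percentage change relative to the baseline week."""
    return 100.0 * (current - baseline) / baseline

BASELINE_WEEK = 1
for week in sorted(testosterone):
    ratio = tc_ratio(testosterone[week], cortisol[week])
    delta_t = pct_change(testosterone[week], testosterone[BASELINE_WEEK])
    delta_c = pct_change(cortisol[week], cortisol[BASELINE_WEEK])
    print(f"Week {week}: T:C = {ratio:.1f}, "
          f"T change = {delta_t:+.1f}%, C change = {delta_c:+.1f}%")
```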

2021
Vol 16 (1)
pp. 45-50
Author(s):
Steven H. Doeven
Michel S. Brink
Barbara C.H. Huijgen
Johan de Jong
Koen A.P.M. Lemmink

In elite basketball, players are exposed to intensified competition periods when participating in both national and international competitions. How coaches manage training between matches and in reference to match scheduling for a full season is not yet known. Purpose: First, to compare load during short-term match congestion (ie, ≥2-match weeks) with regular competition (ie, 1-match weeks) in elite male professional basketball players. Second, to determine changes in well-being, recovery, neuromuscular performance, and injuries and illnesses between short-term match congestion and regular competition. Methods: Sixteen basketball players (age 24.8 [2.0] y, height 195.8 [7.5] cm, weight 94.8 [14.0] kg, body fat 11.9% [5.0%], VO2max 51.9 [5.3] mL·kg−1·min−1) were monitored during a full season. Session rating of perceived exertion (s-RPE) was obtained, and load was calculated (s-RPE × duration) for each training session or match. Perceived well-being (fatigue, sleep quality, general muscle soreness, stress levels, and mood) and total quality of recovery were assessed each training day. Countermovement jump height was measured, and a list of injuries and illnesses was collected weekly using the adapted Oslo Sports Trauma Research Center Questionnaire on Health Problems. Results: Total load (training sessions and matches; P < .001) and training load (P < .001) were significantly lower for ≥2-match weeks. Significantly higher well-being (P = .01) and less fatigue (P = .001) were found during ≥2-match weeks compared with 1-match weeks. Conclusion: Total load and training load were lower during short-term match congestion compared with regular competition. Furthermore, better well-being and less fatigue were demonstrated within short-term match congestion. This might indicate that coaches tend to overcompensate training load in intensified competition.
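As a rough illustration of the load arithmetic described above (session load = s-RPE × duration, aggregated into weekly total load and training load), here is a minimal sketch with made-up session records; the field names and values are assumptions for the example only.

```python
# Hypothetical session records; load is computed as s-RPE x duration (AU).
sessions = [
    {"week": 1, "type": "training", "srpe": 6, "duration_min": 90},
    {"week": 1, "type": "match", "srpe": 8, "duration_min": 40},
    {"week": 2, "type": "training", "srpe": 5, "duration_min": 75},
    {"week": 2, "type": "match", "srpe": 7, "duration_min": 38},
    {"week": 2, "type": "match", "srpe": 8, "duration_min": 41},
]

def weekly_loads(records):
    """Return {week: (total_load, training_load)} in arbitrary units (AU)."""
    out = {}
    for s in records:
        load = s["srpe"] * s["duration_min"]  # session load (AU)
        total, training = out.get(s["week"], (0, 0))
        total += load
        if s["type"] == "training":
            training += load
        out[s["week"]] = (total, training)
    return out

for week, (total, training) in sorted(weekly_loads(sessions).items()):
    n_matches = sum(1 for s in sessions if s["week"] == week and s["type"] == "match")
    label = ">=2-match week" if n_matches >= 2 else "1-match week"
    print(f"Week {week} ({label}): total = {total} AU, training = {training} AU")
```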


2019
Vol 14 (7)
pp. 941-948
Author(s):
Henrikas Paulauskas
Rasa Kreivyte
Aaron T. Scanlan
Alexandre Moreira
Laimonas Siupsinskas
...  

Purpose: To assess the weekly fluctuations in workload and differences in workload according to playing time in elite female basketball players. Methods: A total of 29 female basketball players (mean [SD] age 21 [5] y, stature 181 [7] cm, body mass 71 [7] kg, playing experience 12 [5] y) belonging to the 7 women’s basketball teams competing in the first-division Lithuanian Women’s Basketball League were recruited. Individualized training loads (TLs) and game loads (GLs) were assessed using the session rating of perceived exertion after each training session and game during the entire in-season phase (24 wk). Percentage changes in total weekly TL (weekly TL + GL), weekly TL, weekly GL, chronic workload, acute:chronic workload ratio, training monotony, and training strain were calculated. Mixed linear models were used to assess differences for each dependent variable, with playing time (low vs high) as a fixed factor and subject, week, and team as random factors. Results: The highest changes in total weekly TL, weekly TL, and acute:chronic workload ratio were evident in week 13 (47%, 120%, and 49%, respectively). Chronic workload showed weekly changes ≤10%, whereas monotony and training strain registered their highest fluctuations in weeks 17 (34%) and 15 (59%), respectively. A statistically significant difference in GL was evident between players completing low and high playing times (P = .026, moderate), whereas no significant differences (P > .05) were found for all other dependent variables. Conclusions: Coaches of elite women’s basketball teams should monitor weekly changes in workload during the in-season phase to identify weeks that may predispose players to unwanted spikes and to adjust player workload according to playing time.
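The derived workload metrics named in this abstract (chronic workload, acute:chronic workload ratio, training monotony, and training strain) are commonly computed from daily session-RPE loads. The sketch below applies one conventional set of formulas to hypothetical numbers; the exact conventions used by the authors may differ.

```python
from statistics import mean, pstdev

# Hypothetical daily session-RPE loads (AU) for the current week,
# plus weekly totals for the three preceding weeks.
daily_loads = [420, 0, 510, 380, 0, 600, 250]
previous_weekly_totals = [2300, 2450, 2100]

weekly_total = sum(daily_loads)                           # acute (1-week) load
monotony = mean(daily_loads) / pstdev(daily_loads)        # training monotony
strain = weekly_total * monotony                          # training strain
chronic = mean(previous_weekly_totals + [weekly_total])   # rolling 4-week mean
acwr = weekly_total / chronic                             # acute:chronic workload ratio

print(f"weekly total = {weekly_total} AU, monotony = {monotony:.2f}, "
      f"strain = {strain:.0f} AU, chronic = {chronic:.0f} AU, ACWR = {acwr:.2f}")
```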


2018
Vol 13 (8)
pp. 1067-1074
Author(s):
Daniele Conte
Nicholas Kolb
Aaron T. Scanlan
Fabrizio Santolamazza

Purpose: To characterize the weekly training load (TL) and well-being of college basketball players during the in-season phase. Methods: Ten (6 guards and 4 forwards) male basketball players (age 20.9 [0.9] y, stature 195.0 [8.2] cm, and body mass 91.3 [11.3] kg) from the same Division I National Collegiate Athletic Association team were recruited to participate in this study. Individualized training and game loads were assessed using the session rating of perceived exertion at the end of each training and game session, and well-being status was collected before each session. Weekly changes (%) in TL, acute-to-chronic workload ratio, and well-being were determined. Differences in TL and well-being between starting and bench players and between 1-game and 2-game weeks were calculated using magnitude-based statistics. Results: Total weekly TL and acute-to-chronic workload ratio demonstrated high week-to-week variation, with spikes up to 226% and 220%, respectively. Starting players experienced a higher (most likely negative) total weekly TL and similar (unclear) well-being status compared with bench players. Game scheduling influenced TL, with 1-game weeks demonstrating a higher (likely negative) total weekly TL and similar (most likely trivial) well-being status compared with 2-game weeks. Conclusions: These findings provide college basketball coaches information to optimize training strategies during the in-season phase. Basketball coaches should concurrently consider the number of weekly games and player status (starting vs bench player) when creating individualized periodization plans, with increases in TL potentially needed in bench players, especially in 2-game weeks.
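A minimal sketch of the week-to-week bookkeeping implied here: percentage change in total weekly TL, a coupled 4-week acute-to-chronic workload ratio, and a flag for large spikes. The weekly totals are invented for illustration, and the spike threshold is an arbitrary assumption rather than the authors' criterion.

```python
# Hypothetical total weekly TL values (AU).
weekly_tl = [2800, 3000, 1400, 3100, 2900]
SPIKE_THRESHOLD_PCT = 100  # arbitrary flag level for this example

for i in range(1, len(weekly_tl)):
    change = 100.0 * (weekly_tl[i] - weekly_tl[i - 1]) / weekly_tl[i - 1]
    window = weekly_tl[max(0, i - 3):i + 1]          # up to 4 weeks, incl. current
    acwr = weekly_tl[i] / (sum(window) / len(window))
    flag = " <- spike" if change > SPIKE_THRESHOLD_PCT else ""
    print(f"Week {i + 1}: change = {change:+.0f}%, ACWR = {acwr:.2f}{flag}")
```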


PeerJ
2018
Vol 6
pp. e5225
Author(s):
Igor de Freitas Cruz
Lucas Adriano Pereira
Ronaldo Kobal
Katia Kitamura
Cristiano Cedra
...  

The aims of this study were to describe the session rating of perceived exertion (sRPE), total quality recovery (TQR), and variations in countermovement jump (CMJ) height throughout nine weeks of a competitive period in young female basketball players. In total, 10 young female basketball players (17.2 ± 0.4 years; 71.8 ± 15.0 kg; 177.2 ± 9.5 cm) participated in this study. The sRPE and TQR were assessed in each training session, whereas the CMJ height was assessed prior to the first weekly training session. The magnitude-based inferences method was used to compare the sRPE, TQR, and CMJ height across the nine weeks of training. The training loads accumulated in weeks 1, 2, and 3 were likely to almost certainly higher than in the following weeks (ES varying from 0.67 to 2.55). The CMJ height in week 1 was very likely lower than in weeks 2, 5, 7, and 8 (ES varying from 0.24 to 0.34), while the CMJ height in week 9 was likely to almost certainly higher than in all previous weeks of training (ES varying from 0.70 to 1.10). Accordingly, when higher training loads were accumulated, both CMJ height and TQR were lower than during periods with lower internal training loads. These results highlight the importance of using a comprehensive and multivariate approach to effectively monitor the physical performance of young athletes.
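The week-to-week comparisons above rest on standardized effect sizes. As a simplified stand-in for the magnitude-based inference workflow, the sketch below computes a pooled-SD Cohen's d between two weeks of CMJ heights; the jump heights are hypothetical.

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical CMJ heights (cm) for 10 players in two different weeks.
cmj_week1 = [28.4, 30.1, 26.9, 31.5, 29.0, 27.8, 30.6, 28.2, 29.9, 27.5]
cmj_week9 = [30.9, 32.0, 29.4, 33.8, 31.2, 30.1, 32.5, 30.0, 32.1, 29.6]

def cohens_d(a, b):
    """Standardized difference (b - a) using a pooled standard deviation."""
    pooled_sd = sqrt((stdev(a) ** 2 + stdev(b) ** 2) / 2)
    return (mean(b) - mean(a)) / pooled_sd

print(f"CMJ week 9 vs week 1: ES = {cohens_d(cmj_week1, cmj_week9):.2f}")
```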


2017
Vol 12 (9)
pp. 1151-1156
Author(s):
Steven H. Doeven
Michel S. Brink
Wouter G.P. Frencken
Koen A.P.M. Lemmink

During intensified phases of competition, attunement of exertion and recovery is crucial to maintain performance. Although a mismatch between coach and player perceptions of training load has been demonstrated, it is unknown whether these discrepancies also exist for match exertion and recovery. Purpose: To determine match exertion and subsequent recovery and to investigate the extent to which the coach is able to estimate players’ match exertion and recovery. Methods: Rating of perceived exertion (RPE) and total quality of recovery (TQR) of 14 professional basketball players (age 26.7 ± 3.8 y, height 197.2 ± 9.1 cm, weight 100.3 ± 15.2 kg, body fat 10.3% ± 3.6%) were compared with observations of the coach. During an in-season phase of 15 matches within 6 wk, players gave RPEs after each match. TQR scores were filled out before the first training session after the match. The coach rated observed exertion (ROE) and recovery (TQ-OR) of the players. Results: RPE was lower than ROE (15.6 ± 2.3 and 16.1 ± 1.4; P = .029). Furthermore, TQR was lower than TQ-OR (12.7 ± 3.0 and 15.3 ± 1.3; P < .001). Correlations between coach- and player-perceived exertion and recovery were r = .25 and r = .21, respectively. For recovery within 1 d the correlation was r = .68, but for recovery after 1–2 d no association existed. Conclusion: Players perceive match exertion as hard to very hard and subsequent recovery as reasonable. The coach overestimates match exertion and underestimates the degree of recovery. Correspondence between coach and players is thus not optimal. This mismatch potentially leads to inadequate planning of training sessions and decreases in performance during fixture congestion in basketball.
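The r values reported here are correlations between paired player and coach ratings. A minimal sketch, assuming a Pearson correlation and hypothetical paired scores:

```python
from math import sqrt
from statistics import mean

# Hypothetical paired post-match ratings on the Borg 6-20 scale.
player_rpe = [16, 15, 17, 14, 16, 18, 15, 13, 17, 16]
coach_roe = [16, 16, 17, 15, 17, 17, 16, 15, 16, 16]

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

print(f"r(player RPE, coach ROE) = {pearson_r(player_rpe, coach_roe):.2f}")
```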


Author(s):  
Sullivan Coppalle
Guillaume Ravé
Jason Moran
Iyed Salhi
Abderraouf Ben Abderrahman
...  

This study aimed to compare the training load of a professional under-19 soccer team (U-19) to that of an elite adult team (EAT), from the same club, during the in-season period. Thirty-nine healthy soccer players were involved (EAT [n = 20]; U-19 [n = 19]) in the study, which spanned four weeks. Training load (TL) was monitored as external TL, using a global positioning system (GPS), and internal TL, using a rating of perceived exertion (RPE). TL data were recorded after each training session. During soccer matches, players’ RPEs were recorded. The internal TL was quantified daily by means of the session rating of perceived exertion (session-RPE) using Borg’s 0–10 scale. For GPS data, the selected running speed intensities (over 0.5 s time intervals) were 12–15.9 km/h; 16–19.9 km/h; 20–24.9 km/h; >25 km/h (sprint). Distances covered at 16–19.9 km/h, >20 km/h, and >25 km/h were significantly higher in U-19 compared to EAT over the course of the study (p = 0.023, d = 0.243, small; p = 0.016, d = 0.298, small; and p = 0.001, d = 0.564, small, respectively). EAT players performed significantly fewer sprints per week compared to U-19 players (p = 0.002, d = 0.526, small). RPE was significantly higher in U-19 compared to EAT (p = 0.001, d = 0.188, trivial). The external and internal measures of TL were significantly higher in the U-19 group compared to the EAT soccer players. In conclusion, the results show that training load was greater in U-19 than in EAT.
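To make the GPS zone analysis concrete, here is a small sketch that accumulates distance per running-speed zone from speed samples taken every 0.5 s, using the cut-offs listed above; the sample trace is hypothetical.

```python
SAMPLE_INTERVAL_S = 0.5
ZONES = [(12.0, 16.0), (16.0, 20.0), (20.0, 25.0), (25.0, float("inf"))]  # km/h

# Hypothetical instantaneous speeds (km/h), one sample every 0.5 s.
speeds_kmh = [10.2, 13.5, 17.8, 21.4, 26.1, 24.9, 15.0, 27.3]

distance_per_zone = [0.0] * len(ZONES)
for v in speeds_kmh:
    step_m = (v / 3.6) * SAMPLE_INTERVAL_S  # metres covered during this sample
    for i, (lo, hi) in enumerate(ZONES):
        if lo <= v < hi:
            distance_per_zone[i] += step_m
            break  # stop once the matching zone is found; speeds <12 km/h are unassigned

for (lo, hi), d in zip(ZONES, distance_per_zone):
    label = f">{lo:.0f} km/h" if hi == float("inf") else f"{lo:.0f}-{hi - 0.1:.1f} km/h"
    print(f"{label}: {d:.1f} m")
```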


Author(s):  
Markus N.C. Williams
Vincent J. Dalbo
Jordan L. Fox
Cody J. O’Grady
Aaron T. Scanlan

Purpose: To compare weekly training and game demands according to playing position in basketball players. Methods: A longitudinal, observational study was adopted. Semiprofessional, male basketball players categorized as backcourt (guards; n = 4) and frontcourt players (forwards/centers; n = 4) had their weekly workloads monitored across an entire season. External workload was determined using microsensors and included PlayerLoad™ (PL) and inertial movement analysis variables. Internal workload was determined using heart rate to calculate absolute and relative summated-heart-rate-zones workload and rating of perceived exertion (RPE) to calculate session-RPE workload. Comparisons between weekly training and game demands were made using linear mixed models and effect sizes in each positional group. Results: In backcourt players, higher relative PL (P = .04, very large) and relative summated-heart-rate-zones workload (P = .007, very large) were evident during training, while greater session-RPE workload (P = .001, very large) was apparent during games. In frontcourt players, greater PL (P < .001, very large), relative PL (P = .019, very large), peak PL intensities (P < .001, moderate), high-intensity inertial movement analysis events (P = .002, very large), total inertial movement analysis events (P < .001, very large), summated-heart-rate-zones workload (P < .001, very large), RPE (P < .001, very large), and session-RPE workload (P < .001, very large) were evident during games. Conclusions: Backcourt players experienced similar demands between training and games across several variables, with higher average workload intensities during training. Frontcourt players experienced greater demands across all variables during games than training. These findings emphasize the need for position-specific preparation strategies leading into games in basketball teams.
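Summated-heart-rate-zones (SHRZ) workload, used here as an internal load measure, weights time spent in each heart-rate zone by the zone number. A minimal sketch with assumed zone boundaries and hypothetical minutes in zone (conventions vary between studies):

```python
# Zone weights 1-5 for (for example) 50-60, 60-70, 70-80, 80-90, 90-100 %HRmax.
ZONE_WEIGHTS = [1, 2, 3, 4, 5]
minutes_in_zone = [10, 18, 25, 20, 7]  # hypothetical values for one session

shrz = sum(w * m for w, m in zip(ZONE_WEIGHTS, minutes_in_zone))  # absolute SHRZ (AU)
session_minutes = sum(minutes_in_zone)
relative_shrz = shrz / session_minutes                            # AU per minute

print(f"SHRZ = {shrz} AU, relative SHRZ = {relative_shrz:.2f} AU/min")
```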


Author(s):  
Alexandru Nicolae Ungureanu
Corrado Lupo
Gennaro Boccia
Paolo Riccardo Brustio

Purpose: The primary aim of this study was to evaluate whether the internal (session rating of perceived exertion [sRPE] and Edwards heart-rate-based method) and external training load (jumps) affect the presession well-being perception on the day after (ie, +22 h), according to age and tactical position, in elite (ie, Serie A2) female volleyball training. Methods: Ten female elite volleyball players (age = 23 [4] y, height = 1.82 [0.04] m, body mass = 73.2 [4.9] kg) had their heart rate monitored during 13 team (115 individual) training sessions (duration: 101 [8] min). Mixed-effect models were applied to evaluate whether sRPE, the Edwards method, and jumps were correlated (P ≤ .05) with Hooper index factors (ie, perceived sleep quality/disorders, stress level, fatigue, and delayed-onset muscle soreness) in relation to age and tactical position (ie, hitters, central blockers, opposites, and setters). Results: The results showed a direct relationship between sRPE (P < .001) and presession well-being perception 22 hours apart, whereas the relationship was inverse for the Edwards method internal training load. Age, as well as the number of jumps performed, did not affect well-being perception on the day after. Finally, central blockers experienced higher delayed-onset muscle soreness than hitters (P = .003). Conclusions: Findings indicated that female volleyball players’ internal training load influences the pretraining well-being status on the day after (+22 h). Therefore, coaches can benefit from this information to accurately implement periodization in a short-term perspective and to properly adopt recovery strategies in relation to the players’ well-being status.
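The Hooper index referenced here is simply the sum of four self-reported items. A minimal sketch, assuming the commonly used 1-7 rating scale for each item:

```python
def hooper_index(sleep: int, stress: int, fatigue: int, doms: int) -> int:
    """Sum of four ratings; higher scores indicate worse perceived well-being."""
    for item in (sleep, stress, fatigue, doms):
        if not 1 <= item <= 7:
            raise ValueError("each Hooper item is rated on a 1-7 scale")
    return sleep + stress + fatigue + doms

# Hypothetical pre-session ratings for one player.
print(hooper_index(sleep=3, stress=2, fatigue=4, doms=5))  # -> 14
```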


2018
Vol 13 (5)
pp. 804-809
Author(s):
Luciana S Decimoni
Victor M Curty
Livia Almeida
Alexander J Koch
Jeffrey M Willardson
...  

We investigated the effect of carbohydrate mouth rinsing on resistance exercise performance. Fifteen recreationally trained women (age 26 ± 4 y; height 161.9 ± 5.1 cm; weight 59.5 ± 8.2 kg) completed two resistance exercise bouts consisting of three sets of five exercises (half-squat, leg press, bench press, military press, and seated row) to volitional fatigue with a 10-repetition-maximum load. Immediately prior to and during the middle of each exercise bout, subjects mouth rinsed for 10 s with 100 mL of either a 6% maltodextrin solution (CHO) or an artificially flavored solution (PLA) in a randomized, double-blind, counterbalanced fashion. Heart rate and perceived exertion were compared between conditions using a 2 (condition) × 15 (time point) repeated-measures ANOVA. Significant main effects were further analyzed using pairwise comparisons with Bonferroni post hoc tests. Total volume (exercises × sets × repetitions × load) between sessions was compared with a Student’s t-test. Statistical significance was set at p ≤ 0.05. The CHO rinse resulted in more repetitions performed during the half-squat, bench press, military press, and seated row, for a significantly greater (∼12%) total volume load lifted versus PLA (p = 0.039, ES: 0.49). Rating of perceived exertion was also significantly lower in CHO versus PLA (p = 0.020, ES: 0.28). These data indicate that CHO mouth rinsing can enhance high-volume resistance exercise performance and lower ratings of perceived exertion.
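One common way to operationalize the volume formula quoted above is to sum repetitions × load over every set of every exercise. A minimal sketch with hypothetical set records (not the study's data):

```python
def total_volume(sets):
    """Total volume load: sum of repetitions x load (kg) over all recorded sets."""
    return sum(reps * load for reps, load in sets)

# Hypothetical (repetitions, load_kg) records across a session's exercises.
cho_session = [(10, 60), (9, 60), (8, 60), (10, 80), (9, 80), (8, 80)]
pla_session = [(9, 60), (8, 60), (7, 60), (9, 80), (8, 80), (7, 80)]

cho_vol, pla_vol = total_volume(cho_session), total_volume(pla_session)
print(f"CHO = {cho_vol} kg, PLA = {pla_vol} kg, "
      f"difference = {100 * (cho_vol - pla_vol) / pla_vol:.1f}%")
```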


2019
Vol 14 (6)
pp. 829-840
Author(s):
Timothy J.H. Lathlean
Paul B. Gastin
Stuart V. Newstead
Caroline F. Finch

Purpose: To investigate associations between load (training and competition) and wellness in elite junior Australian Football players across 1 competitive season. Methods: A prospective cohort study was conducted during the 2014 playing season in 562 players from 9 teams. Players recorded their training and match intensities according to the session-rating-of-perceived-exertion (sRPE) method. Based on sRPE player loads, a number of load variables were quantified, including cumulative load and the change in load across different periods of time (including the acute-to-chronic load ratio). Wellness was quantified using a wellness index comprising sleep, fatigue, soreness, stress, and mood, each on a Likert scale from 1 to 5. Results: Players spent an average of 85 (21) min in each match and 65 (31) min per training session. Average match loads were 637 (232) arbitrary units, and average training loads were 352 (233) arbitrary units. Over the 24 wk of the 2014 season, overall wellness had a significant linear negative association with 1-wk load (B = −0.152; 95% confidence interval, −0.261 to −0.043; P = .006) and an inverse U-curve relationship with session load (B = −0.078; 95% confidence interval, −0.143 to −0.014; P = .018). Mood, stress, and soreness were all found to have associations with load. Conclusions: This study demonstrates that load (within a session and across the week) is important in managing the wellness of elite junior Australian Football players. Quantifying loads and wellness at this level will help optimize player management and has the potential to reduce the risk of adverse events such as injury.
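The wellness index described here is the sum of five Likert items. A minimal sketch assuming each item is scored 1-5, with the direction of scoring left to the questionnaire's convention (the example ratings are hypothetical):

```python
def wellness_index(sleep: int, fatigue: int, soreness: int, stress: int, mood: int) -> int:
    """Sum of five Likert items, each scored 1-5."""
    items = (sleep, fatigue, soreness, stress, mood)
    if any(not 1 <= i <= 5 for i in items):
        raise ValueError("each wellness item is scored on a 1-5 Likert scale")
    return sum(items)

# Hypothetical daily ratings for one player.
print(wellness_index(sleep=4, fatigue=3, soreness=3, stress=4, mood=5))  # -> 19
```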

