Workloads of forward and backline adolescent rugby players: a pilot study

2020 ◽  
Vol 32 (1) ◽  
pp. 1-5
Author(s):  
Devon Vernon Barnard ◽  
Lee Pote ◽  
Candice Christie

Background: There is minimal research on the workloads of adolescent rugby players. Therefore, the main aim of this study was to determine the workloads placed on a cohort of South African adolescent rugby players (n = 17) during an in-season period. Methods: Session RPE ratings were collected daily, 30 minutes after the training session concluded, during an 11-week in-season period. Training load was calculated as the session rating of perceived exertion multiplied by the session’s duration (min). Results: The main finding was that the adolescents in this investigation had workloads similar to those of elite players and higher than those reported in other studies of adolescent rugby players. The forwards (3311±939 arbitrary units; AU) had a higher workload than the backline players (2851±1080 AU). There was no difference between forwards and backline players with regard to the acute:chronic workload ratio. Conclusion: Workloads are high in these adolescent players, particularly in the forwards, and are similar to the workloads of elite-level rugby players.
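As a minimal sketch of the session-RPE calculation described above (training load in AU = session RPE × session duration in minutes); the function name and example values are illustrative, not taken from the study:

```python
# Minimal sketch of the session-RPE (sRPE) training-load calculation:
# training load (AU) = session RPE x session duration (min).
# Function and example values are illustrative, not from the study.

def session_load(rpe: float, duration_min: float) -> float:
    """Return session training load in arbitrary units (AU)."""
    return rpe * duration_min

# Example: three sessions in a week, given as (RPE, duration in minutes)
weekly_sessions = [(7, 90), (6, 75), (8, 60)]
weekly_load = sum(session_load(rpe, dur) for rpe, dur in weekly_sessions)
print(f"Weekly load: {weekly_load:.0f} AU")
```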

2017 ◽  
Vol 12 (5) ◽  
pp. 665-676 ◽  
Author(s):  
Adam L Owen ◽  
Carlos Lago-Peñas ◽  
Miguel-Ángel Gómez ◽  
Bruno Mendes ◽  
Alexandre Dellal

Ensuring adequate levels of training and recovery at the elite level of professional soccer to maximise player performance has continued to drive the need to monitor the training load and physical training output of soccer players. The aim of this investigation was to analyse a training mesocycle whilst quantifying the positional demands imposed on elite European soccer players. Sixteen players were assessed using global positioning systems and ratings of perceived exertion over a six-week competitive training mesocycle. The positional demands and training loads were analysed in addition to match conditions (match location, match score) and players’ age. Results revealed that typical daily training loads (i.e. total distance, high-intensity distance, sprint distance, average speed, ratings of perceived exertion) did not differ between weeks of the in-season mesocycle. Further analysis revealed training loads were significantly lower on match day-1 than on match day-2, match day-3 and match day-4 preceding a match (p < 0.05). Significant differences in physical outputs were also found between match day-2, match day-3 and match day-4, highlighting a structured, periodised tapering approach (p < 0.05). Lower average speeds were reported in training following successful matches compared with defeats (p < 0.05), and more specifically after away matches compared with home fixtures (p < 0.05). To conclude, practitioners can maintain a uniform and structured training-load mesocycle whilst inducing variation in physical outputs during the microcycle phase. Additionally, the investigation provides a tapering approach that may induce significant variation in the positional demands.
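A brief sketch of how daily external load might be summarised by proximity to match day, as in the tapering comparison above; the column names and values are hypothetical, not the study's data:

```python
# Sketch: summarising daily external load by days to the next match
# (MD-4, MD-3, MD-2, MD-1). DataFrame columns and values are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "days_to_match": [4, 3, 2, 1, 4, 3, 2, 1],
    "total_distance_m": [6200, 5800, 5100, 3900, 6350, 5700, 5050, 4000],
})
# Mean total distance per training day relative to the match
print(df.groupby("days_to_match")["total_distance_m"].mean())
```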


2019 ◽  
Vol 14 (1) ◽  
pp. 107-113 ◽  
Author(s):  
Lee Pote ◽  
Candice J Christie

Cricket players are now faced with increased physical demands, and as a result it is important to manage their workload, particularly to control and predict injury risk. While this has been investigated at the elite level, few studies have examined the workloads placed on adolescent cricket players. The purpose of this study was therefore to determine the workloads placed on schoolboy cricketers, specifically within a South African context. Twelve male schoolboy cricketers between the ages of 16 and 19 years participated in the study. Match and practice data were collected over a period of 74 days and included the number of shuttles run (batsmen), the number of deliveries bowled (bowlers) and central ratings of perceived exertion (RPE). Injury data were also collected. These data were then used to determine the acute:chronic (a:c) workload ratio (two-week rolling average) as well as session RPE (sRPE). Fast bowlers delivered more balls during matches than at practices, whereas batsmen ran more shuttles at practices than in matches. Session RPE was higher for matches than for practices. There did not appear to be a relationship between workload and injury risk; however, this may have been due to the small sample size. It was concluded that it is important to monitor the individual workloads of players and that the intensity of practices needs to be increased to match game demands. Lastly, the study design was effective and the methods used were found to be appropriate for a larger population.
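As an illustrative sketch of the acute:chronic workload ratio mentioned above, computed from daily load values with rolling averages; the 7-day acute and 14-day chronic windows are assumptions chosen to match the two-week rolling average described, and may differ from the study's exact implementation:

```python
# Illustrative acute:chronic workload ratio (ACWR) from daily sRPE loads.
# Window lengths (7-day acute, 14-day chronic rolling average) are assumptions.
import pandas as pd

def acwr(daily_load: pd.Series, acute_days: int = 7, chronic_days: int = 14) -> pd.Series:
    acute = daily_load.rolling(acute_days, min_periods=acute_days).mean()
    chronic = daily_load.rolling(chronic_days, min_periods=chronic_days).mean()
    return acute / chronic

# Daily load in arbitrary units (AU), indexed by date; values are illustrative
daily_load = pd.Series(
    [300, 0, 450, 500, 0, 250, 400] * 4,
    index=pd.date_range("2024-01-01", periods=28, freq="D"),
)
print(acwr(daily_load).dropna().round(2))
```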


Author(s):  
Sullivan Coppalle ◽  
Guillaume Ravé ◽  
Jason Moran ◽  
Iyed Salhi ◽  
Abderraouf Ben Abderrahman ◽  
...  

This study aimed to compare the training load of a professional under-19 soccer team (U-19) to that of an elite adult team (EAT) from the same club during the in-season period. Thirty-nine healthy soccer players were involved (EAT [n = 20]; U-19 [n = 19]) in the study, which spanned four weeks. Training load (TL) was monitored as external TL, using a global positioning system (GPS), and internal TL, using a rating of perceived exertion (RPE). TL data were recorded after each training session. During soccer matches, players’ RPEs were recorded. The internal TL was quantified daily by means of the session rating of perceived exertion (session-RPE) using Borg’s 0–10 scale. For GPS data, the selected running speed intensities (over 0.5 s time intervals) were 12–15.9 km/h; 16–19.9 km/h; 20–24.9 km/h; >25 km/h (sprint). Distances covered at 16–19.9 km/h, >20 km/h and >25 km/h were significantly higher in U-19 compared to EAT over the course of the study (p = 0.023, d = 0.243, small; p = 0.016, d = 0.298, small; and p = 0.001, d = 0.564, small, respectively). EAT players performed significantly fewer sprints per week compared to U-19 players (p = 0.002, d = 0.526, small). RPE was significantly higher in U-19 compared to EAT (p = 0.001, d = 0.188, trivial). The external and internal measures of TL were significantly higher in the U-19 group compared to the EAT soccer players. In conclusion, the results obtained show that the training load is greater in U-19 than in EAT.
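A sketch of how GPS-derived distance can be bucketed into the speed zones listed above; the zone thresholds come from the abstract, while the velocity trace and 0.5-s sampling assumption are illustrative:

```python
# Sketch: bucketing GPS-derived distance into the speed zones above
# (12-15.9, 16-19.9, 20-24.9, >25 km/h). The speed trace is hypothetical.
ZONES = {"12-15.9 km/h": (12.0, 16.0), "16-19.9 km/h": (16.0, 20.0),
         "20-24.9 km/h": (20.0, 25.0), ">25 km/h (sprint)": (25.0, float("inf"))}

def distance_per_zone(speeds_kmh, dt_s=0.5):
    """Sum distance (m) covered in each zone from a speed trace sampled every dt_s seconds."""
    totals = {name: 0.0 for name in ZONES}
    for v in speeds_kmh:
        for name, (lo, hi) in ZONES.items():
            if lo <= v < hi:
                totals[name] += (v / 3.6) * dt_s  # km/h -> m/s, times sample interval
                break
    return totals

print(distance_per_zone([10.2, 14.5, 18.3, 22.9, 27.1, 26.4]))
```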


2005 ◽  
Vol 100 (2) ◽  
pp. 357-361 ◽  
Author(s):  
Meir Magal ◽  
Robert F. Zoeller

Ratings of perceived exertion (RPE) are used for exercise programming in cardiac rehabilitation patients whenever it is difficult to use heart rate to set intensity because of medication or other factors. This investigation examined the physiological responses to two stepping exercise modes (upright and recumbent) at the same RPE. Analysis indicated significant physiological differences between the modes of exercise, which may be mediated by postural differences. Specifically, the physiological responses to the recumbent exercise, but not the upright exercise, showed the expected relationship with RPE, with recumbent stepping requiring less physiological effort than upright stepping at the same RPE. As such, we cannot recommend with confidence that the prescription for upright exercise be based on data from recumbent exercise or vice versa.


2019 ◽  
Vol 34 (1) ◽  
pp. 1-5 ◽  
Author(s):  
Brenton Surgenor ◽  
Matthew Wyon

OBJECTIVE: The session rating of perceived exertion (session-RPE) is a practical and non-invasive method that allows quantification of internal training load (ITL) in individual and team sports. As yet, no study has investigated its construct validity in dance. This study examines the convergent validity between the session-RPE method and an objective heart rate (HR)-based method of quantifying ITL in vocational dance students during professional dance training. METHODS: Ten dance students (4 male, 20±1.16 yrs; 6 female, 20±0.52 yrs) participated in this study. During a normal week of training, session-RPE and HR data were recorded in 96 individual sessions. HR data were analysed using the Edwards-TL method. Correlation analysis was used to evaluate the convergent validity between the session-RPE and Edwards-TL methods for assessing ITL across a variety of training modes (contemporary, ballet, and rehearsal). RESULTS: The overall correlation between individual session-RPE and Edwards-TL was r=0.72, p<0.0001, indicating a statistically significant, strong positive relationship between session-RPE and Edwards-TL. This trend was observed across all training modes: rehearsal (r=0.74, p=0.001), contemporary (r=0.60, p=0.001), and ballet (r=0.46, p=0.018) sessions. CONCLUSIONS: This study shows that session-RPE can be considered a valid method to assess ITL for vocational dance students, although there is some variation between session-RPE and HR-based TL across different dance activities.
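A sketch of the Edwards-TL calculation referenced above, following the commonly described form of the method (minutes spent in five %HRmax zones, 50–60% through 90–100%, weighted 1–5 and summed); the sampling rate and example values are assumptions:

```python
# Sketch of the Edwards heart-rate-based training load (Edwards-TL):
# time in five %HRmax zones weighted 1-5 and summed. Zone boundaries follow
# the common description of the method; sampling at 1 Hz is assumed.
def edwards_tl(hr_series_bpm, hr_max_bpm, dt_min=1/60):
    """Return Edwards-TL (AU) from HR samples taken every dt_min minutes."""
    load = 0.0
    for hr in hr_series_bpm:
        pct = hr / hr_max_bpm * 100
        for weight, lower in enumerate((50, 60, 70, 80, 90), start=1):
            if lower <= pct < lower + 10 or (weight == 5 and pct >= 90):
                load += weight * dt_min
                break
    return load

# Example: 10 minutes at ~85% HRmax for a dancer with an HRmax of 200 bpm
print(round(edwards_tl([170] * 600, 200), 1))  # ~40 AU (10 min x weight 4)
```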


2018 ◽  
Vol 13 (1) ◽  
pp. 95-101 ◽  
Author(s):  
Andrew D. Govus ◽  
Aaron Coutts ◽  
Rob Duffield ◽  
Andrew Murray ◽  
Hugh Fullagar

Context: The relationship between pretraining subjective wellness and external and internal training load in American college football is unclear. Purpose: To examine the relationship of pretraining subjective wellness (sleep quality, muscle soreness, energy, wellness Z score) with player load and session rating of perceived exertion training load (s-RPE-TL) in American college football players. Methods: Subjective wellness (measured using 5-point Likert-scale questionnaires), external load (derived from GPS and accelerometry), and s-RPE-TL were collected during 3 typical training sessions per week for the second half of an American college football season (8 wk). The relationship of pretraining subjective wellness with player load and s-RPE training load was analyzed using linear mixed models with a random intercept for athlete and a random slope for training session. Standardized mean differences (SMDs) denote the effect magnitude. Results: A 1-unit increase in wellness Z score and energy was associated with trivial 2.3% (90% confidence interval [CI] 0.5, 4.2; SMD 0.12) and 2.6% (90% CI 0.1, 5.2; SMD 0.13) increases in player load, respectively. A 1-unit increase in muscle soreness (players felt less sore) corresponded to a trivial 4.4% (90% CI −8.4, −0.3; SMD −0.05) decrease in s-RPE training load. Conclusion: Measuring pretraining subjective wellness may provide information about players’ capacity to perform in a training session and could be a key determinant of their response to the imposed training demands in American college football. Hence, monitoring subjective wellness may aid in the individualization of training prescription in American college football players.
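A sketch of the mixed-model structure described above (random intercept for athlete, random slope for training session), shown here with statsmodels; the dataframe and its column names (athlete, session, wellness_z, player_load) are hypothetical placeholders, not the study's dataset:

```python
# Sketch: linear mixed model with a random intercept per athlete and a
# random slope for training session, as described above. Column names are
# hypothetical placeholders.
import statsmodels.formula.api as smf

def fit_wellness_model(df):
    model = smf.mixedlm(
        "player_load ~ wellness_z",   # fixed effect of pretraining wellness
        data=df,
        groups=df["athlete"],         # random intercept per athlete
        re_formula="~session",        # random slope for training session
    )
    return model.fit()

# result = fit_wellness_model(df)
# print(result.summary())
```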


2015 ◽  
Vol 10 (6) ◽  
pp. 767-773 ◽  
Author(s):  
Alexandre Moreira ◽  
Tom Kempton ◽  
Marcelo Saldanha Aoki ◽  
Anita C. Sirotic ◽  
Aaron J. Coutts

Purpose: To examine the impact of varying between-matches microcycles on training characteristics (ie, intensity, duration, and load) in professional rugby league players and to report on match load related to these between-matches microcycles. Methods: Training-load data were collected during a 26-wk competition period of an entire season. Training load was measured using the session rating of perceived exertion (session-RPE) method for every training session and match from 44 professional rugby league players from the same National Rugby League team. Using the category-ratio 10 RPE scale, training intensity was divided into 3 zones (low <4 AU, moderate ≥4–≤7 AU, and high >7 AU). Three between-matches recovery microcycles of different lengths were used for analysis: 5−6 d, 7−8 d, and 9−10 d. Results: A total of 3848 individual sessions were recorded. Significantly lower training load was observed during the shorter between-matches microcycles (5−6 d). No significant differences in subsequent match load or intensity were identified between the various match recovery periods. Overall, 16% of training sessions were completed in the low-intensity zone, 61% in the moderate-intensity zone, and 23% in the high-intensity zone. Conclusions: The findings demonstrate that rugby league players undertake higher training load as the length of the between-matches microcycle increases. The majority of in-season training of professional rugby league players was at moderate intensity, and the polarized approach to training reported in elite endurance athletes does not occur in professional rugby league.
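A sketch of the CR-10 intensity-zone classification described above (low <4 AU, moderate 4–7 AU, high >7 AU) and the resulting distribution of sessions; the example RPE values are illustrative:

```python
# Sketch: classifying session RPE into the CR-10 intensity zones above
# and summarising the distribution of sessions. Example values are illustrative.
from collections import Counter

def intensity_zone(rpe: float) -> str:
    if rpe < 4:
        return "low"
    if rpe <= 7:
        return "moderate"
    return "high"

session_rpes = [3, 5, 6, 8, 4, 7, 9, 5, 2, 6]
counts = Counter(intensity_zone(r) for r in session_rpes)
total = len(session_rpes)
print({zone: f"{100 * n / total:.0f}%" for zone, n in counts.items()})
```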


2018 ◽  
Vol 10 (1) ◽  
pp. 157-162 ◽  
Author(s):  
Andrew Scott Perrotta ◽  
Darren E. R. Warburton

Abstract Study aim: Recent evidence has revealed a reduction in the strength of the correlation between ratings of perceived exertion and a heart rate (HR)-derived training load in elite field hockey players during competition. These competitive periods involve sustained levels of cardiovascular performance coupled with considerable time performing above the anaerobic threshold. As such, the purpose of this investigation was to examine the magnitude of correlation between ratings of perceived exertion and time spent above threshold and two HR-derived training loads. Material and methods: Seventeen (n = 17) international-caliber female field hockey players competing as a national team were monitored over four matches during a seven-day competition period within the 2016 Olympic Cycle. Cardiovascular indices of exercise intensity were derived from HR dynamics and were quantified through estimating time spent above the anaerobic threshold (LT2), the Edwards training load model (TLED) and the Polar Training Load (TLPOL). Sessional ratings of perceived exertion (sRPE) were recorded after each match. Results: Sixty-four samples were recorded for analysis. HR-derived (TLED and TLPOL) and sRPE training loads remained comparable between matches. A large correlation (p = 0.01) was observed between sRPE and each HR-derived training load (TLED and TLPOL). No significant relationship (p = 0.06) was found between time spent above LT2 and sRPE. Conclusions: Our results demonstrate that HR-derived training loads (TLPOL and TLED) exhibit a stronger correlation with sRPE than time spent above LT2 in elite field hockey players during competition.
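A minimal sketch of the correlation analysis described above, computing a Pearson correlation between match sRPE and an HR-derived training load; the sample values are illustrative, not the study's data:

```python
# Sketch: Pearson correlation between match sRPE and an HR-derived training
# load across recorded samples. Values are illustrative only.
from scipy.stats import pearsonr

srpe  = [420, 515, 480, 390, 545, 460, 500, 430]   # sRPE load (AU)
tl_ed = [210, 255, 240, 190, 270, 225, 250, 205]   # Edwards TL (AU)
r, p = pearsonr(srpe, tl_ed)
print(f"r = {r:.2f}, p = {p:.3f}")
```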


2020 ◽  
Vol 55 (9) ◽  
pp. 984-993
Author(s):  
Brett Pexa ◽  
Eric D. Ryan ◽  
J. Troy Blackburn ◽  
Darin A. Padua ◽  
J. Craig Garrison ◽  
...  

Context: A baseball-specific training load may influence strength or glenohumeral range of motion, which are related to baseball injuries. Glenohumeral reach tests and grip strength are clinical assessments of shoulder range of motion and upper extremity strength, respectively. Objective: To examine changes in glenohumeral reach test performance and grip strength between dominant and nondominant limbs and high, moderate, and low baseball-specific training-load groups. Design: Repeated-measures study. Setting: University laboratory and satellite clinic. Patients or Other Participants: Collegiate baseball athletes (n = 18, age = 20.1 ± 1.3 years, height = 185.0 ± 6.5 cm, mass = 90.9 ± 10.2 kg). Main Outcome Measure(s): Participants performed overhead reach tests (OHRTs), behind-the-back reach tests (BBRTs), and grip strength assessments using the dominant and nondominant limbs every 4 weeks for 16 weeks. Percentage change scores were calculated between testing times. After each training session, participants provided their duration of baseball activity, throw count, and body-specific and arm-specific ratings of perceived exertion. We classified them into the high, moderate, or low training-load group based on each training-load variable: body-specific acute:chronic workload ratio (ACWR), arm-specific ACWR, body-specific cumulative load, and arm-specific cumulative load. Mixed models were used to compare training-load groups and limbs. Results: The arm-specific ACWR grouping demonstrated a main effect for OHRT (F = 7.70, P = .001), BBRT (F = 4.01, P = .029), and grip strength (F = 8.89, P < .001). For the OHRT, the moderate training-load group demonstrated a 10.8% greater increase than the high group (P = .004) and a 13.2% greater increase than the low group (P < .001). For the BBRT, the low training-load group had a 10.1% greater increase than the moderate group (P = .011). For grip strength, the low training-load group demonstrated a 12.1% greater increase than the high group (P = .006) and a 17.7% greater increase than the moderate group (P < .001). Conclusions: Arm-specific ACWR was related to changes in clinical assessments of range of motion and strength. Clinicians may use arm-specific ACWR to indicate when a baseball athlete's physical health is changing.
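A brief sketch of the percentage-change scores computed between successive 4-week testing sessions (e.g., for grip strength or reach-test distance); the measurement values are illustrative, not the study's data:

```python
# Sketch: percentage-change scores between successive testing sessions,
# as computed above for reach tests and grip strength. Values are illustrative.
def percent_change(previous: float, current: float) -> float:
    """Percentage change from the previous test to the current test."""
    return (current - previous) / previous * 100

grip_strength_kg = [48.0, 50.5, 49.2, 52.0, 53.1]  # five tests, 4 wk apart
changes = [percent_change(a, b) for a, b in zip(grip_strength_kg, grip_strength_kg[1:])]
print([f"{c:+.1f}%" for c in changes])
```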


Author(s):  
Bruno Ribeiro ◽  
Ana Pereira ◽  
Pedro P. Neves ◽  
António C. Sousa ◽  
Ricardo Ferraz ◽  
...  

The current study aims to verify the effects of three specific warm-ups on squat and bench press resistance training. Forty resistance-trained males (19–30 years) performed 3 × 6 repetitions with 80% of maximal dynamic strength (designated as the training load) after one of the following warm-ups (48 h between sessions): (i) 2 × 6 repetitions with 40% and 80% of the training load (WU), (ii) 6 repetitions with 80% of the training load (WU80), or (iii) 6 repetitions with 40% of the training load (WU40). Mean propulsive velocity (MPV), velocity loss (VL), peak velocity (PV), time to achieve PV, power, work, heart rate, and ratings of perceived exertion were analyzed. In squat exercises, higher MPV was found in WU80 compared with WU40 (2nd set: 0.69 ± 0.09 vs. 0.67 ± 0.06 m.s−1, p = 0.02, ES = 0.80; 3rd set: 0.68 ± 0.09 vs. 0.66 ± 0.07 m.s−1, p = 0.05, ES = 0.51). In bench press exercises, time to PV was lower in WU compared with WU40 (1st set: 574.77 ± 233.46 vs. 694.50 ± 211.71 ms, p < 0.01, ES = 0.69; 2nd set: 533.19 ± 272.22 vs. 662.31 ± 257.51 ms, p = 0.04, ES = 0.43) and total work was higher (4749.90 ± 1312.99 vs. 4631.80 ± 1355.01 J, p = 0.01, ES = 0.54). The results showed that force outputs were mainly optimized by WU80 in squat training and by WU in bench press training. Moreover, warming up with few repetitions and low loads is not enough to optimize squat and bench press performance.
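A sketch of the velocity-loss (VL) metric analysed above, using the common definition of the percentage drop from the fastest repetition's MPV to the last repetition's MPV within a set; the exact definition used in the study is assumed and the repetition velocities are illustrative:

```python
# Sketch: within-set velocity loss (VL) as the percentage drop from the
# fastest repetition's mean propulsive velocity (MPV) to the last rep's MPV.
# The definition is a common convention (assumed); rep velocities are illustrative.
def velocity_loss(mpv_per_rep_ms):
    best = max(mpv_per_rep_ms)
    last = mpv_per_rep_ms[-1]
    return (best - last) / best * 100

set_mpv = [0.72, 0.70, 0.69, 0.68, 0.66, 0.64]  # MPV in m/s across 6 squat reps
print(f"Velocity loss: {velocity_loss(set_mpv):.1f}%")
```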

