Internal and External Training Workload Quantification in 4 Experienced Paracanoeing Athletes

2021 ◽  
pp. 1-7
Author(s):  
Frederico Ribeiro Neto ◽  
Ramires Alsamir Tibana ◽  
Jefferson Rodrigues Dorneles ◽  
Rodrigo Rodrigues Gomes Costa

Context: Paracanoeing is one of the adapted sports eligible for different motor impairments. The acute:chronic workload ratio (ACWR) is an index between acute and chronic training workload. However, no studies have analyzed this variable in paracanoeing, relating it with training recovery markers. Objective: This study aimed to quantify the internal (session rating of perceived exertion) and external (distance traveled and total training time) training workloads in 4 experienced paracanoe athletes over 9 months and 5 canoeing events. Design: Cross-sectional study. Setting: Rehabilitation Hospital Network, Paralympic Program. Participants: Four experienced paracanoe athletes participated in 36 weeks of training for 5 events. Main Outcome Measures: The daily and weekly training workload, monotony, ACWR, distance, and total training time were described for all the training phases. The perceived recovery status scale (PRS) and medicine ball throw (MBT) were used to quantify recovery. Results: The average daily and weekly training workloads varied from 213.1 to 239.3 and 767.3 to 1036.8 arbitrary units, respectively. Average ACWR results ranged from 0.96 to 1.10 in the 4 athletes, findings that were outside the safety zone in 38% of the training weeks. All the correlations between MBT and PRS were classified as weak (ρ between .20 and .39, P > .05). ACWR showed a very weak correlation with MBT, and moderately and highly significant correlations with PRS in 2 athletes, respectively. Conclusions: The training workloads of 4 paracanoe athletes may serve as a comparison with other periodization models. Pretraining recovery assessments (MBT and PRS) exhibited a low, nonsignificant correlation. However, ACWR correlated significantly with PRS in 2 athletes and might be a suitable tool for daily training adjustments.
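The load metrics named in this abstract (sRPE load in arbitrary units, weekly monotony, and the ACWR) follow widely used definitions. Below is a minimal sketch, assuming the common sRPE × duration load and a rolling 4-week chronic window; the study's exact windowing is not stated in the abstract, and all function names are illustrative.

```python
from statistics import mean, pstdev

def session_load(rpe: float, minutes: float) -> float:
    """Session internal load in arbitrary units (sRPE method: RPE x duration)."""
    return rpe * minutes

def acwr(weekly_loads: list[float]) -> float:
    """Acute:chronic workload ratio: the most recent week's load divided by
    the rolling 4-week average (requires at least 4 weekly totals)."""
    acute = weekly_loads[-1]
    chronic = mean(weekly_loads[-4:])
    return acute / chronic

def monotony(daily_loads: list[float]) -> float:
    """Training monotony: mean daily load divided by its standard deviation."""
    return mean(daily_loads) / pstdev(daily_loads)

# Example: four weekly totals in arbitrary units
weeks = [900, 1000, 950, 1045]
print(round(acwr(weeks), 2))
```

An ACWR near 1.0, as reported for these athletes, indicates the acute week roughly matches the 4-week chronic average; the commonly cited "safety zone" is about 0.8 to 1.3.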

Folia Medica ◽  
2021 ◽  
Vol 63 (4) ◽  
pp. 502-510
Author(s):  
Oyéné Kossi ◽  
Justine Lacroix ◽  
Maxence Compagnat ◽  
Jean Christophe Daviet ◽  
Stéphane Mandigout

Aim: To test the validity of Borg’s 6–20 rating of perceived exertion scale in assessing the exertion intensity over a multi-activity session in young and older adults. Materials and methods: This cross-sectional study included 56 healthy participants. All participants underwent a single session of activities including working on a computer, treadmill walking, biking, and treadmill running. Results: Results showed a non-significant correlation between the overall perceived exertion and energy expenditure in young people (Rho=−0.05, p=0.75) and in older adults (Rho=−0.05, p=0.78) for the whole session. However, results showed that older adults perceived significantly higher exertion compared to young people while working on a computer, walking, and running, whereas they presented lower energy expenditure while resting and working on a computer. Conclusions: Combining the perceived exertion method with other commonly used methods to estimate exercise intensity would be recommended for older adults.


2013 ◽  
Vol 21 (3) ◽  
pp. 260-271
Author(s):  
Karin M. Volkers ◽  
Tim C.W. van Dijk ◽  
Laura H. Eggermont ◽  
A. Peter Hollander ◽  
Erik J.A. Scherder

Introduction: The American College of Sports Medicine prescribes regular performance of at least moderate-intensity physical activity for healthy aging. This study examined whether 1 session of 30 min of chair-assisted exercises for the elderly meets this intensity criterion. Method: This cross-sectional study included 47 cognitively healthy volunteers (mean age 84 years). During the performance of 30 min of chair-assisted exercises the authors determined oxygen uptake (VO2), carbon dioxide production, heart rate (HR), and rating of perceived exertion (RPE). These measures were expressed as a percentage of the estimated maximal VO2 (VO2max) and the estimated maximal HR (HRmax) and estimated as metabolic equivalent units (METs). Results: Participants performed chair-assisted exercises at 61.0% ± 14.7% of VO2max, 67.6% ± 11.3% HRmax, 3.9 ± 0.9 METs, and 13.1 ± 2.1 RPE. Conclusions: The intensity of these chair-assisted exercises is at least moderate for older adults, which is necessary for healthy aging.


2020 ◽  
Vol 15 (6) ◽  
pp. 853-861
Author(s):  
Claire A. Molinari ◽  
Florent Palacin ◽  
Luc Poinsard ◽  
Véronique L. Billat

Purpose: To validate a new perceptually regulated, self-paced maximal oxygen consumption field test (the Running Advisor Billat Training [RABIT] test) that can be used by recreational runners to define personalized training zones. Design: In a cross-sectional study, male and female recreational runners (N = 12; mean [SD] age = 43 [8] y) completed 3 maximal exercise tests (2 RABIT tests and a University of Montreal Track Test), with a 48-hour interval between tests. Methods: The University of Montreal Track Test was a continuous, incremental track test with a 0.5-km·h−1 increment every minute until exhaustion. The RABIT tests were conducted at intensities of 11, 14, and 17 on the rating of perceived exertion (RPE) scale for 10, 5, and 3 minutes, respectively, with a 1-minute rest between efforts. Results: The 2 RABIT tests and the University of Montreal Track Test gave similar mean (SD) maximal oxygen consumption values (53.9 [6.4], 56.4 [9.1], and 55.4 [7.6] mL·kg−1·min−1, respectively, P = .722). The cardiorespiratory and speed responses were reliable as a function of the running intensity (RPE: 11, 14, and 17) and the relative time point for each RPE stage. Indeed, the oxygen consumption, heart rate, ventilation, and speed values did not differ significantly when the running time was expressed as a relative duration of 30%, 60%, or 90% (ie, at 3, 6, and 9 min of a 10-min effort at RPE 11; P = .997). Conclusions: The results demonstrate that the RABIT test is a valid method for defining submaximal and maximal training zones in recreational runners.


2020 ◽  
Vol 100 (3) ◽  
pp. 438-446
Author(s):  
Antonio Ignacio Cuesta-Vargas ◽  
Jena Buchan ◽  
Bella Pajares ◽  
Emilio Alba ◽  
Manuel Trinidad-Fernández ◽  
...  

Abstract Background Survivors of breast cancer commonly report functional limitations, including cancer-related fatigue (CRF) and decreased aerobic capacity. One key gap is addressing the 3 energy systems (aerobic, anaerobic lactic, and alactic), requiring assessment to establish a baseline exercise intensity and duration. Objective This study examined the feasibility of energy system–based assessment, also providing descriptive values for assessment performance in this population. Design This was a cross-sectional study. Methods Seventy-two posttreatment survivors of breast cancer were recruited. Following a baseline musculoskeletal assessment, women attempted 3 energy system assessments: submaximal aerobic (multistage treadmill), anaerobic alactic (30-second sit-to-stand [30-STS]), and anaerobic lactic (adapted burpees). Heart rate (HR) and rating of perceived exertion (RPE) were recorded. Secondary outcomes included body composition, CRF, and upper- and lower-limb functionality. Results Seventy of 72 participants performed the 30-STS and 30 completed the adapted burpees task. HR and RPE specific to each task were correlated, reflecting increased intensity. Women reported low-moderate levels of CRF scores (3% [2.1]) and moderate-high functionality levels (upper-limb: 65.8% [23.3]; lower-limb: 63.7% [34.7]). Limitations All survivors of breast cancer had relatively low levels of CRF and moderate functioning. Additionally, on average, participants were classified as “overweight” based on BMI. Conclusion This study is the first to our knowledge to demonstrate feasibility of energy system assessment in survivors of breast cancer. Using a combination of HR and RPE, as well as baseline assessment of each energy system, clinicians may improve ability to prescribe personalized exercise and give patients greater ability to self-monitor intensity and progress.


2021 ◽  
Vol 11 (11) ◽  
pp. 4871
Author(s):  
José E. Teixeira ◽  
Pedro Forte ◽  
Ricardo Ferraz ◽  
Miguel Leal ◽  
Joana Ribeiro ◽  
...  

Monitoring the training load in football is an important strategy for improving athletic performance and ensuring effective training periodization. The aim of this study was two-fold: (1) to quantify the weekly training load and recovery status variations performed by under-15, under-17 and under-19 sub-elite young football players; and (2) to analyze the influence of age, training day, weekly microcycle, training and playing position on the training load and recovery status. Twenty under-15, twenty under-17 and twenty under-19 players were monitored over a 2-week period during the first month of the 2019–2020 competitive season. Global positioning system (GPS) technology was used to collect external training loads: total distance covered (TD), average speed, maximal running speed, relative high-speed running distance, high metabolic load distance, sprinting distance, dynamic stress load, accelerations and decelerations. Internal training load was monitored using ratings of perceived exertion (RPE) and session rating of perceived exertion (sRPE). Recovery status was obtained using the total quality recovery (TQR) scale. The results show an age-related influence for external training load (p ≤ 0.001; d = 0.29–0.86; moderate to strong effect), internal training load (p ≤ 0.001, d = 0.12–0.69; minimum to strong effect) and recovery status (p ≤ 0.001, d = 0.59; strong effect). The external training load presented differences between training days (p < 0.05, d = 0.26–0.95; moderate to strong effect). The playing position had a minimum effect on the weekly training load (p < 0.05; d = 0.06–0.18). The weekly microcycle had a moderate effect on the TD (p < 0.05, d = 0.39), RPE (p < 0.05; d = 0.35) and sRPE (p < 0.05, d = 0.35). Interaction effects were found between the four factors analyzed for deceleration (F = 2.819, p = 0.017) and between inter-day, inter-week and age for total covered distance (F = 8.342, p = 0.008).
This study provided specific insights about sub-elite youth football training load and recovery status to monitor training environments and load variations. Future research should include a longer monitoring period to assess training load and recovery variations across different season phases.


Author(s):  
Sullivan Coppalle ◽  
Guillaume Ravé ◽  
Jason Moran ◽  
Iyed Salhi ◽  
Abderraouf Ben Abderrahman ◽  
...  

This study aimed to compare the training load of a professional under-19 soccer team (U-19) to that of an elite adult team (EAT), from the same club, during the in-season period. Thirty-nine healthy soccer players were involved (EAT [n = 20]; U-19 [n = 19]) in the study, which spanned four weeks. Training load (TL) was monitored as external TL, using a global positioning system (GPS), and internal TL, using a rating of perceived exertion (RPE). TL data were recorded after each training session. During soccer matches, players’ RPEs were recorded. The internal TL was quantified daily by means of the session rating of perceived exertion (session-RPE) using Borg’s 0–10 scale. For GPS data, the selected running speed intensities (over 0.5 s time intervals) were 12–15.9 km/h; 16–19.9 km/h; 20–24.9 km/h; >25 km/h (sprint). Distances covered at 16–19.9 km/h, >20 km/h, and >25 km/h were significantly higher in U-19 compared to EAT over the course of the study (p = 0.023, d = 0.243, small; p = 0.016, d = 0.298, small; and p = 0.001, d = 0.564, small, respectively). EAT players performed significantly fewer sprints per week compared to U-19 players (p = 0.002, d = 0.526, small). RPE was significantly higher in U-19 compared to EAT (p = 0.001, d = 0.188, trivial). The external and internal measures of TL were significantly higher in the U-19 group compared to the EAT soccer players. In conclusion, the results obtained show that the training load is greater in U-19 compared to EAT.
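The zone distances reported above come from binning GPS speed samples (taken over 0.5 s intervals) into the listed running-speed bands. A hypothetical sketch of that binning, with zone cut-offs taken from the abstract and the function name purely illustrative:

```python
# Speed zones (km/h) from the abstract: 12-15.9, 16-19.9, 20-24.9, >=25 (sprint)
ZONES = [(12.0, 16.0), (16.0, 20.0), (20.0, 25.0), (25.0, float("inf"))]

def distance_per_zone(speeds_kmh: list[float], dt_s: float = 0.5) -> list[float]:
    """Sum the distance (in metres) covered in each speed zone, given a
    series of instantaneous speed samples (km/h) at fixed dt_s intervals."""
    dist = [0.0] * len(ZONES)
    for v in speeds_kmh:
        d = v / 3.6 * dt_s  # metres covered during this sample interval
        for i, (lo, hi) in enumerate(ZONES):
            if lo <= v < hi:
                dist[i] += d
                break  # speeds below 12 km/h fall into no zone
    return dist
```

For example, two consecutive samples at 18 km/h contribute 2 × 2.5 m = 5 m to the 16–19.9 km/h zone and nothing elsewhere.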


BJS Open ◽  
2021 ◽  
Vol 5 (Supplement_1) ◽  
Author(s):  
Joshua Clements

Abstract Background The COVID-19 pandemic has resulted in dynamic changes to healthcare delivery. Surgery as a specialty has been significantly affected and with that the delivery of surgical training. Method This national, collaborative, cross sectional study comprising 13 surgical trainee associations distributed a pan surgical specialty survey on the COVID-19 impact on surgical training over a 4-week period (11th May - 8th June 2020). The survey was voluntary and open to medical students and surgical trainees of all specialties and training grades. All aspects of training were qualitatively assessed. This study was reported according to STROBE guidelines. Results 810 completed responses were analysed (M: 401, F: 390), with representation from all deaneries and training grades. 41% of respondents (n = 301) were redeployed, with 74% (n = 223) redeployed > 4 weeks. Complete loss of training was reported in elective operating (69.5%, n = 474), outpatient activity (67.3%, n = 457), and elective endoscopy (69.5%, n = 246), with > 50% reduction in training time reported in emergency operating (48%, n = 326) and completion of work-based assessments (WBAs) (46%, n = 309). 81% (n = 551) reported course cancellations, and departmental and regional teaching programmes were cancelled without rescheduling in 58% and 60% of cases respectively. A perceived lack of elective operative exposure and completion of WBAs were the primary reported factors affecting potential training progression. Overall, > 50% of trainees (n = 377) felt they would not meet the competencies required for that training period. Conclusion This study has demonstrated a perceived negative impact on numerous aspects of surgical training affecting all training specialties and grades.


2016 ◽  
Vol 96 (11) ◽  
pp. 1773-1781
Author(s):  
Bethany J. Wilcox ◽  
Megan M. Wilkins ◽  
Benjamin Basseches ◽  
Joel B. Schwartz ◽  
Karen Kerman ◽  
...  

Abstract Background Challenges with any therapeutic program for children include the level of the child's engagement or adherence. Capitalizing on one of the primary learning avenues of children, play, the approach described in this article is to develop therapeutic toy and game controllers that require specific and repetitive joint movements to trigger toy/game activation. Objective The goal of this study was to evaluate a specially designed wrist flexion and extension play controller in a cohort of children with upper extremity motor impairments (UEMIs). The aim was to understand the relationship among controller play activity, measures of wrist and forearm range of motion (ROM) and spasticity, and ratings of fun and difficulty. Design This was a cross-sectional study of 21 children (12 male, 9 female; 4–12 years of age) with UEMIs. Methods All children participated in a structured in-clinic play session during which measurements of spasticity and ROM were collected. The children were fitted with the controller and played with 2 toys and 2 computer games for 5 minutes each. Wrist flexion and extension motion during play was recorded and analyzed. In addition, children rated the fun and difficulty of play. Results Flexion and extension goal movements were repeatedly achieved by children during the play session at an average frequency of 0.27 Hz. At this frequency, 15 minutes of play per day would result in approximately 1,700 targeted joint motions per week. Play activity was associated with ROM measures, specifically supination, but toy perception ratings of enjoyment and difficulty were not correlated with clinical measures. Limitations The reported results may not be representative of children with more severe UEMIs. Conclusions These outcomes indicate that the therapeutic controllers elicited repetitive goal movements and were adaptable, enjoyable, and challenging for children of varying ages and UEMIs.


Author(s):  
Markus N.C. Williams ◽  
Vincent J. Dalbo ◽  
Jordan L. Fox ◽  
Cody J. O’Grady ◽  
Aaron T. Scanlan

Purpose: To compare weekly training and game demands according to playing position in basketball players. Methods: A longitudinal, observational study was adopted. Semiprofessional, male basketball players categorized as backcourt (guards; n = 4) and frontcourt players (forwards/centers; n = 4) had their weekly workloads monitored across an entire season. External workload was determined using microsensors and included PlayerLoad™ (PL) and inertial movement analysis variables. Internal workload was determined using heart rate to calculate absolute and relative summated-heart-rate-zones workload and rating of perceived exertion (RPE) to calculate session-RPE workload. Comparisons between weekly training and game demands were made using linear mixed models and effect sizes in each positional group. Results: In backcourt players, higher relative PL (P = .04, very large) and relative summated-heart-rate-zones workload (P = .007, very large) were evident during training, while greater session-RPE workload (P = .001, very large) was apparent during games. In frontcourt players, greater PL (P < .001, very large), relative PL (P = .019, very large), peak PL intensities (P < .001, moderate), high-intensity inertial movement analysis events (P = .002, very large), total inertial movement analysis events (P < .001, very large), summated-heart-rate-zones workload (P < .001, very large), RPE (P < .001, very large), and session-RPE workload (P < .001, very large) were evident during games. Conclusions: Backcourt players experienced similar demands between training and games across several variables, with higher average workload intensities during training. Frontcourt players experienced greater demands across all variables during games than training. These findings emphasize the need for position-specific preparation strategies leading into games in basketball teams.
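The summated-heart-rate-zones workload used here as an internal-load measure is commonly computed with Edwards' method: time accumulated in five %HRmax zones, each weighted by a factor of 1 to 5. A minimal sketch under that assumption (the abstract does not spell out the exact zone cut-offs):

```python
def shrz_workload(minutes_in_zone: list[float]) -> float:
    """Edwards summated-heart-rate-zones load: minutes spent in the five
    %HRmax zones (commonly 50-60, 60-70, 70-80, 80-90, 90-100) multiplied
    by zone weights 1..5 and summed."""
    if len(minutes_in_zone) != 5:
        raise ValueError("expected minutes for exactly 5 HR zones")
    return sum(m * (i + 1) for i, m in enumerate(minutes_in_zone))

# Example: a session spent mostly in zones 3 and 4
load = shrz_workload([5, 10, 30, 25, 5])
```

The "relative" variant reported in the study would simply divide this total by session duration, yielding a per-minute intensity.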


2019 ◽  
Vol 14 (6) ◽  
pp. 847-849 ◽  
Author(s):  
Pedro Figueiredo ◽  
George P. Nassis ◽  
João Brito

Purpose: To quantify the association between salivary secretory immunoglobulin A (sIgA) and training load in elite football players. Methods: Data were obtained on 4 consecutive days during the preparation camp for the Rio 2016 Olympic Games. Saliva samples of 18 elite male football players were collected prior to breakfast. The session rating of perceived exertion (s-RPE) and external training-load metrics from global positioning systems (GPS) were recorded. Within-subject correlation coefficients between training load and sIgA concentration, and magnitude of relationships, were calculated. Results: sIgA presented moderate to large negative correlations with s-RPE (r = −.39), total distance covered (r = −.55), accelerations (r = −.52), and decelerations (r = −.48). Trivial to small associations were detected between sIgA and distance covered per minute (r = .01), high-speed distance (r = −.23), and number of sprints (r = −.18). sIgA displayed a likely moderate decrease from day 1 to day 2 (d = −0.7) but increased on day 3 (d = 0.6). The training-load variables had moderate to very large rises from day 1 to day 2 (d = 0.7 to 3.2) but lowered from day 2 to day 3 (d = −5.0 to −0.4), except for distance per minute (d = 0.8) and sprints (unclear). On day 3, all training-load variables had small to large increments compared with day 1 (d = 0.4 to 1.5), except for accelerations (d = −0.8) and decelerations (unclear). Conclusions: In elite football, sIgA might be more responsive to training volume than to intensity. External load such as GPS-derived variables presented stronger association with sIgA than with s-RPE. sIgA can be used as an additional objective tool in monitoring football players.

