Application of heart rate and rating of perceived exertion scale in monitoring physical fatigue in divers during heliox saturation diving

2015 ◽  
Vol 36 (9) ◽  
pp. 978
Author(s):  
Yu-min HUANG ◽  
Ming-yue ZHANG ◽  
Rui-yong CHEN ◽  
Xian-lun HAN ◽  
Li CHEN ◽  
...  
Author(s):  
Alice Iannaccone ◽  
Daniele Conte ◽  
Cristina Cortis ◽  
Andrea Fusco

Internal load can be objectively measured by heart rate-based models, such as Edwards’ summated heart rate zones, or subjectively by session rating of perceived exertion. The relationship between internal loads assessed via heart rate-based models and session rating of perceived exertion is usually studied through simple correlations, although the Linear Mixed Model could represent a more appropriate statistical procedure to deal with intrasubject variability. This study aimed to compare conventional correlations and the Linear Mixed Model to assess the relationships between objective and subjective measures of internal load in team sports. Thirteen male youth beach handball players (15.9 ± 0.3 years) were monitored (14 training sessions; 7 official matches). Correlation coefficients were used to correlate the objective and subjective internal load. The Linear Mixed Model was used to model the relationship between objective and subjective measures of internal load by considering each player's individual response as a random effect. Random intercepts were used first, and random slopes were then added. The likelihood-ratio test was used to compare the statistical models. The correlation coefficient for the overall relationship between the objective and subjective internal load data was very large (r = 0.74; ρ = 0.78). The Linear Mixed Model using both random slopes and random intercepts better explained (p < 0.001) the relationship between the internal load measures. Researchers are encouraged to apply Linear Mixed Models rather than correlations to analyze internal load relationships in team sports, since they allow the individuality of players to be taken into account.
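The two internal-load measures compared in this abstract are simple to compute. A minimal sketch, assuming the conventional five-zone weighting for Edwards' method and the CR-10 convention for session RPE; the zone minutes, RPE, and duration below are illustrative values, not the study's data:

```python
def edwards_load(minutes_in_zones):
    """Edwards' summated heart-rate zones: minutes spent in five
    %HRmax zones (50-60% ... 90-100%), weighted 1-5 and summed (AU)."""
    weights = (1, 2, 3, 4, 5)
    if len(minutes_in_zones) != 5:
        raise ValueError("expected minutes for exactly 5 zones")
    return sum(w * m for w, m in zip(weights, minutes_in_zones))

def session_rpe_load(rpe, duration_min):
    """Session-RPE internal load: CR-10 rating x session duration (min)."""
    return rpe * duration_min

# Example session: 10 min in each of the five zones, rated 6 on CR-10
objective = edwards_load([10, 10, 10, 10, 10])   # 150 AU
subjective = session_rpe_load(6, 90)             # 540 AU
```

Collecting one such (objective, subjective) pair per player per session is what produces the repeated-measures structure that motivates a mixed model over a pooled correlation.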


2021 ◽  
Vol 18 (1) ◽  
Author(s):  
Kelvin Euton Oliveira Carmo ◽  
Diego Ignácio Valenzuela Pérez ◽  
Charles Nardelli Valido ◽  
Jymmys Lopes dos Santos ◽  
Bianca Miarka ◽  
...  

Abstract Background Nutritional ergogenic aids are foods or nutrients that can improve physical performance. Among these foods with ergogenic properties, caffeine has been shown to increase fat catabolism and strength and to improve an athlete's cognition and reaction time; it is therefore expected to improve the performance of judokas. This study, through a double-blind crossover (supplement × placebo) protocol, investigated the effects of caffeine supplementation (a single capsule containing 5 mg/kg body mass, ingested 60 min before the session) on biochemical, anthropometrical, physical, subjective, and hemodynamic variables measured before, during, and after two typical judo training sessions (120 min: 40 min of gymnastics; 40 min of specific techniques; and 40 min of judo combat). Methods Eight high-level athletes (21.4 ± 2.0 years; 83.6 ± 15.2 kg; 1.8 ± 0.1 m; 17.9 ± 7.0% body fat) were evaluated before and after each training session for body mass, hydration, upper- and lower-limb power, performance in the special judo fitness test (SJFT), free fatty acids (FFA) in plasma, uric acid, glucose, lactate, heart rate, and pain. In addition, heart rate, FFA in plasma, uric acid, glucose, lactate, rating of perceived exertion, and pain were assessed during the training. Results At 120 min, supplementation resulted in a higher concentration of plasma FFA (1.5 ± 0.5 vs. 1.0 ± 0.3 mmol/L; p = 0.047) and lactate (4.9 ± 1.8 vs. 3.0 ± 1.2 mmol/L; p = 0.047), and a lower concentration of uric acid (5.4 ± 0.9 vs. 7.0 ± 1.5 mg/dL; p = 0.04). Supplementation also resulted in performance maintenance (fatigue index) in the SJFT (Δ0.3 ± 2.0 vs. Δ1.7 ± 2.5, for caffeine and placebo respectively; p = 0.046). No adverse effects were observed. Conclusion Based on the applied dose, intake time, and sample of this study, we can conclude that caffeine produces an ergogenic biochemical effect and improves performance in judo athletes.
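Since the protocol doses caffeine per kilogram of body mass, the absolute capsule content varies per athlete. A minimal sketch of that arithmetic, using the sample's reported mean body mass as an example input:

```python
def caffeine_dose_mg(body_mass_kg, dose_mg_per_kg=5):
    """Absolute caffeine dose for the protocol's 5 mg/kg body mass,
    taken as a single capsule 60 min before the session."""
    if body_mass_kg <= 0:
        raise ValueError("body mass must be positive")
    return body_mass_kg * dose_mg_per_kg

# For the sample's mean body mass of 83.6 kg:
dose = caffeine_dose_mg(83.6)  # 418 mg
```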


2016 ◽  
Vol 11 (6) ◽  
pp. 707-714 ◽  
Author(s):  
Benoit Capostagno ◽  
Michael I. Lambert ◽  
Robert P. Lamberts

Finding the optimal balance between high training loads and recovery is a constant challenge for cyclists and their coaches. Monitoring improvements in performance and levels of fatigue is recommended to correctly adjust training to ensure optimal adaptation. However, many performance tests require a maximal or exhaustive effort, which reduces their real-world application. The purpose of this review was to investigate the development and use of submaximal cycling tests that can be used to predict and monitor cycling performance and training status. Twelve studies met the inclusion criteria, and 3 distinct submaximal cycling tests were identified among them. Submaximal variables including gross mechanical efficiency, oxygen uptake (VO2), heart rate, lactate, predicted time to exhaustion (pTE), rating of perceived exertion (RPE), power output, and heart-rate recovery (HRR) were the components of the 3 tests. pTE, submaximal power output, RPE, and HRR appear to have the most value for monitoring improvements in performance and for indicating a state of fatigue. This literature review shows that several submaximal cycle tests have been developed over the last decade with the aim of predicting, monitoring, and optimizing cycling performance. To be able to conduct a submaximal test on a regular basis, the test needs to be short in duration and as noninvasive as possible. In addition, a test should capture multiple variables and use multivariate analyses to interpret the submaximal outcomes correctly and alter training prescription if needed.
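Heart-rate recovery, one of the submaximal markers this review highlights, is commonly expressed as the drop in heart rate over the first minute after the effort ends. A minimal sketch; the 60-second window and the sample heart rates are illustrative conventions, not values from the reviewed studies:

```python
def heart_rate_recovery(hr_end_exercise, hr_after_60s):
    """HRR: beats/min drop during the first minute after a submaximal
    effort; a larger drop generally indicates better recovery status."""
    if hr_after_60s > hr_end_exercise:
        raise ValueError("recovery HR should not exceed end-exercise HR")
    return hr_end_exercise - hr_after_60s

# e.g. finishing a submaximal stage at 165 beats/min and
# reaching 130 beats/min one minute later:
hrr60 = heart_rate_recovery(165, 130)  # 35 beats/min
```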


2017 ◽  
Vol 39 (02) ◽  
pp. 115-123 ◽  
Author(s):  
Manuel Garnacho-Castaño ◽  
Raúl Domínguez ◽  
Arturo Muñoz González ◽  
Raquel Feliu-Ruano ◽  
Noemí Serra-Payá ◽  
...  

Abstract The present study aimed to compare two fitness-training methodologies, instability circuit resistance training (ICRT) versus traditional circuit resistance training (TCRT), applying an experimental model of exercise prescription that controlled and modulated exercise load using the Borg rating of perceived exertion (RPE). Forty-four healthy young adults (age: 21.6 ± 2.3 years) were randomly assigned to three groups: TCRT (n = 14), ICRT (n = 14), and a control group (n = 16). Strength and cardiorespiratory tests were chosen to evaluate cardiorespiratory and muscular fitness before and after the training program. In the cardiorespiratory data, a significant difference was observed for the time effect in VO2max, peak heart rate, peak velocity, and heart rate at anaerobic threshold intensity (p < 0.05) in the experimental groups. In the strength variables, a significant Group × Time interaction effect was detected in 1RM, mean propulsive power, and peak power (p ≤ 0.01) in the back squat exercise. In the bench press exercise, a significant time effect was detected in 1RM, mean propulsive power, and peak power, and a Group × Time interaction in peak power (all p < 0.05). We can conclude that applying an experimental model of exercise prescription using RPE improved cardiorespiratory and muscular fitness in healthy young adults in both experimental groups.


Author(s):  
Markus N.C. Williams ◽  
Vincent J. Dalbo ◽  
Jordan L. Fox ◽  
Cody J. O’Grady ◽  
Aaron T. Scanlan

Purpose: To compare weekly training and game demands according to playing position in basketball players. Methods: A longitudinal, observational study was adopted. Semiprofessional, male basketball players categorized as backcourt (guards; n = 4) and frontcourt players (forwards/centers; n = 4) had their weekly workloads monitored across an entire season. External workload was determined using microsensors and included PlayerLoad™ (PL) and inertial movement analysis variables. Internal workload was determined using heart rate to calculate absolute and relative summated-heart-rate-zones workload and rating of perceived exertion (RPE) to calculate session-RPE workload. Comparisons between weekly training and game demands were made using linear mixed models and effect sizes in each positional group. Results: In backcourt players, higher relative PL (P = .04, very large) and relative summated-heart-rate-zones workload (P = .007, very large) were evident during training, while greater session-RPE workload (P = .001, very large) was apparent during games. In frontcourt players, greater PL (P < .001, very large), relative PL (P = .019, very large), peak PL intensities (P < .001, moderate), high-intensity inertial movement analysis events (P = .002, very large), total inertial movement analysis events (P < .001, very large), summated-heart-rate-zones workload (P < .001, very large), RPE (P < .001, very large), and session-RPE workload (P < .001, very large) were evident during games. Conclusions: Backcourt players experienced similar demands between training and games across several variables, with higher average workload intensities during training. Frontcourt players experienced greater demands across all variables during games than training. These findings emphasize the need for position-specific preparation strategies leading into games in basketball teams.
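The absolute-versus-relative workload distinction used in this comparison can be sketched simply: relative workload divides an absolute load by the minutes over which it accrued, so a short game and a long training session become comparable per minute. The function name and sample values below are illustrative, not the study's data:

```python
def relative_workload(absolute_load, duration_min):
    """Relative (per-minute) workload: absolute load in arbitrary
    units divided by session duration in minutes."""
    if duration_min <= 0:
        raise ValueError("duration must be positive")
    return absolute_load / duration_min

# Equal absolute loads accrued over a 90-min training session
# and a 40-min game yield very different intensities:
training_rel = relative_workload(450, 90)  # 5.0 AU/min
game_rel = relative_workload(450, 40)      # 11.25 AU/min
```

This is why the abstract can report higher relative PL during training for backcourt players even when several absolute game demands are greater.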


2021 ◽  
Author(s):  
Étienne Chassé ◽  
Daniel Théoret ◽  
Martin P Poirier ◽  
François Lalonde

ABSTRACT Introduction Members of the Canadian Armed Forces (CAF) are required to meet the minimum standards of the Fitness for Operational Requirements of CAF Employment (FORCE) job-based simulation test (JBST) and must possess the capacity to perform other common essential tasks. One of those tasks is to perform basic fire management tasks during fire emergencies to mitigate damage and reduce the risk of injuries and/or death until professional firefighters arrive at the scene. To date, however, the physiological demands of common firefighting tasks have mostly been measured in professional firefighters, rendering the transferability of those demands to the general military population unclear. This pilot study aimed to quantify, for the first time, the physiological demands of basic fire management tasks in the military, to determine whether they are reflected in the FORCE JBST minimum standard. We hypothesized that the physiological demands of basic fire management tasks within the CAF are below those of the FORCE JBST minimum standard and, as such, lower than the demands of professional firefighting. Materials and methods To achieve this, 21 CAF members (8 females; 13 males; mean [SD] age: 33 [10] years; height: 174.5 [10.5] cm; weight: 85.4 [22.1] kg; estimated maximal oxygen uptake [V̇O2peak]: 44.4 [7.4] mL kg−1 min−1) participated in a realistic, but physically demanding, JBST developed by CAF professional firefighting subject matter experts. The actions included lifting, carrying, and manipulating a 13-kg powder fire extinguisher and connecting, coupling, and dragging a 38-mm fire hose over 30 m. The rate of oxygen uptake (V̇O2), heart rate, and percentage of heart rate reserve were measured continuously during two task simulation trials, which were interspersed by a recovery period. Rating of perceived exertion (6 = no exertion; 20 = maximal exertion) was measured upon completion of both task simulations.
Peak V̇O2 (V̇O2peak) was estimated based on the results of the FORCE JBST. Results The mean (SD) duration of both task simulation trials was 3:39 (0:19) min:s, whereas the rest period between the two trials was 62 (19) minutes. The mean V̇O2 was 21.1 (4.7) mL kg−1 min−1 across trials, which represented 52.1 (12.2) %V̇O2peak and ∼81% of the FORCE JBST. This was paralleled by a mean heart rate of 136 (18) beats min−1, a mean percentage of heart rate reserve of 61.2 (10.8)%, and a mean rating of perceived exertion of 11 (2). Other physical components of the JBST consisted of lifting, carrying, and manipulating a 13-kg load for ∼59 seconds, which represents 65% of the load of the FORCE JBST. The external resistance of the fire hose drag portion increased up to 316 N, translating to a total of 6205 N over 30 m, which represents 96% of the drag force measured during the FORCE JBST. Conclusions Our findings demonstrate that the physiological demands of basic fire management tasks in the CAF are of moderate intensity and are reflected in the CAF physical fitness standard. As such, CAF members who achieve the minimum standard on the FORCE JBST are deemed capable of physically performing basic fire management tasks during fire emergencies.
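Percentage of heart rate reserve, reported here as a mean of 61.2%, expresses where a working heart rate sits between resting and maximal values. A minimal sketch; the study reports the mean working heart rate (136 beats/min), but the resting and maximal values used below are assumptions for illustration only:

```python
def percent_hr_reserve(hr, hr_rest, hr_max):
    """%HRR: fraction of the rest-to-max heart rate range in use,
    expressed as a percentage (Karvonen-style calculation)."""
    if hr_max <= hr_rest:
        raise ValueError("hr_max must exceed hr_rest")
    return 100.0 * (hr - hr_rest) / (hr_max - hr_rest)

# With the reported mean working HR of 136 beats/min and *assumed*
# resting/maximal heart rates of 60 and 185 beats/min:
pct = percent_hr_reserve(136, 60, 185)  # ~60.8 %HRR
```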


Author(s):  
Alexandru Nicolae Ungureanu ◽  
Corrado Lupo ◽  
Gennaro Boccia ◽  
Paolo Riccardo Brustio

Purpose: The primary aim of this study was to evaluate whether the internal (session rating of perceived exertion [sRPE] and Edwards heart-rate-based method) and external training load (jumps) affect the presession well-being perception on the day after (ie, +22 h), according to age and tactical position, in elite (ie, Serie A2) female volleyball training. Methods: Ten female elite volleyball players (age = 23 [4] y, height = 1.82 [0.04] m, body mass = 73.2 [4.9] kg) had their heart rate monitored during 13 team (115 individual) training sessions (duration: 101 [8] min). Mixed-effect models were applied to evaluate whether sRPE, the Edwards method, and jumps were correlated (P ≤ .05) with Hooper index factors (ie, perceived sleep quality/disorders, stress level, fatigue, and delayed-onset muscle soreness) in relation to age and tactical position (ie, hitters, central blockers, opposites, and setters). Results: The results showed a direct relationship between sRPE (P < .001) and presession well-being perception 22 hours later, whereas the relationship was inverse for the Edwards method internal training load. Neither age nor the number of jumps performed affected the well-being perception of the day after. Finally, central blockers experienced higher delayed-onset muscle soreness than hitters (P = .003). Conclusions: Findings indicated that female volleyball players' internal training load influences the pretraining well-being status on the day after (+22 h). Therefore, coaches can benefit from this information to accurately implement periodization in a short-term perspective and to properly adopt recovery strategies in relation to the players' well-being status.
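The Hooper index used as the well-being outcome here is the sum of the four subjective ratings listed (perceived sleep quality/disorders, stress, fatigue, and delayed-onset muscle soreness). A minimal sketch; the 1-7 scale bounds are the common convention for this index and are assumed here rather than stated in the abstract:

```python
def hooper_index(sleep, stress, fatigue, doms):
    """Hooper index: sum of four subjective ratings (assumed 1-7 each);
    a higher total indicates worse perceived well-being."""
    ratings = (sleep, stress, fatigue, doms)
    if any(not 1 <= r <= 7 for r in ratings):
        raise ValueError("each rating must be between 1 and 7")
    return sum(ratings)

# e.g. moderate sleep disturbance, low stress, noticeable fatigue, mild DOMS
score = hooper_index(3, 2, 4, 3)  # 12
```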


2018 ◽  
Vol 3 (3) ◽  
pp. 2473011418S0049
Author(s):  
Alicia Unangst ◽  
Kevin Martin ◽  
Anthony Mustovich ◽  
Jaime Chisholm

Category: Ankle Introduction/Purpose: Following lower extremity surgery, patients are often required to use assistive devices in order to perform activities of daily living. As technology and assistive devices continue to improve, providers are faced with selecting a device that is safe while providing high patient satisfaction and a quick return to activities. The purpose of the current study was to compare physical exertion and subject preference between a hands-free single crutch and standard axillary crutches in foot and ankle patients. Methods: A prospective, randomized crossover study was performed using 35 orthopedic foot and ankle patients from within one treatment facility. Each participant had demographic data and heart rate recorded. The patients were then randomized to an assistive device. All participants completed a 6-minute walk test (6MWT); immediately following each 6MWT, heart rate, self-selected walking velocity (SSWV), perceived exertion using the OMNI Rating of Perceived Exertion (OMNI-RPE), and perceived dyspnea using the Modified Borg Dyspnea Scale were obtained. The patients then completed another 6MWT using the other assistive device and were asked the same questions. After completing both 6MWTs, participants were asked which assistive device they would prefer to use. Results: A total of 35 patients were included, with a median age of 32 years. The hands-free crutch was preferred by 86% of participants. Regression analysis was used to test whether factors such as gender, height, weight, and BMI predicted patient preference of iWalk vs. crutch; none of these factors were found to be significant. Student t-tests and ANOVAs performed separately for dyspnea, fatigue ratings, distance (meters), and heart rate between the iWalk and crutch were all found to be significant (p<0.05, p=1.13e-11, p=2.29e-13, p=5.21e-05, respectively). The axillary crutch group had a higher SSWV (0.80 vs. 0.77 m/s), but the difference was not significant.
Neither group had any falls; however, 58% of axillary-crutch participants complained of axillary/hand pain, while 14% of the hands-free group complained of proximal strap discomfort. Conclusion: Patients preferred the hands-free crutch while reporting lower perceived dyspnea and fatigue. The hands-free group demonstrated lower physiologic demand, which correlated with patient perception.
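Self-selected walking velocity over a 6-minute walk test is simply the distance covered divided by the 360-second test duration. A minimal sketch; the 288 m distance is an example chosen to reproduce the 0.8 m/s reported for the axillary-crutch condition, not a value from the study:

```python
def sswv_from_6mwt(distance_m, duration_s=360):
    """Self-selected walking velocity (m/s) from a 6-minute walk test:
    distance covered divided by the fixed 360 s test duration."""
    if distance_m < 0:
        raise ValueError("distance cannot be negative")
    return distance_m / duration_s

# 288 m covered in 6 minutes corresponds to 0.8 m/s
speed = sswv_from_6mwt(288)  # 0.8 m/s
```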

