The Relationship Between Intensity Indicators in Small-Sided Soccer Games

2015 ◽  
Vol 46 (1) ◽  
pp. 119-128 ◽  
Author(s):  
David Casamichana ◽  
Julen Castellano

Abstract. The aim of the present study was to examine the relationship between different kinds of intensity indicators in small-sided soccer games. This descriptive correlational study included 14 semi-professional male soccer players (21.3 ± 2.3 years, 174 ± 4.0 cm, 73.4 ± 5.1 kg) from the same team. The players were monitored by means of heart rate monitors and GPS devices during 27 small-sided games of nine different formats, yielding a total of 217 recordings. After each game the Borg scale was used to give a rating of perceived exertion (RPE). The internal load indicators were the mean heart rate relative to the individual maximum (%HRmean) and the RPE, while those for the external load were the player load, total distance covered, distance covered in two intensity ranges (>18 km·h-1 and >21 km·h-1), and frequency of effort (in the same two intensity ranges). There was a significant moderate correlation (r = 0.506) between the two internal load measurements (%HRmean and RPE). Although there were significant correlations of different degrees between various external load measurements, only the player load was significantly correlated with the internal load indicators (r = 0.331 with %HRmean and r = 0.218 with RPE). During training programmes of this kind, it is necessary to consider a range of intensity indicators so as to obtain complementary information. This will enable coaches to assess the load imposed on players more accurately and therefore optimize the training process.
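For readers who want to reproduce this kind of analysis, the sketch below (not the authors' code; the file name and column names are hypothetical) computes Pearson correlations between internal and external load indicators across recordings.

```python
# Minimal sketch of correlating internal and external load indicators across
# recordings; "ssg_recordings.csv" and the column names are assumed, with one
# row per small-sided-game recording.
import pandas as pd
from scipy import stats

df = pd.read_csv("ssg_recordings.csv")

internal = ["hr_mean_pct", "rpe"]
external = ["player_load", "total_distance_m", "dist_over_18_kmh", "dist_over_21_kmh"]

for i in internal:
    for e in external:
        r, p = stats.pearsonr(df[i], df[e])
        print(f"{i} vs {e}: r = {r:.3f}, p = {p:.3f}")
```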

Retos ◽  
2021 ◽  
Vol 44 ◽  
pp. 534-541
Author(s):  
Alberto Rodríguez Cayetano ◽  
Óscar Martín Martín ◽  
Félix Hernández Merchán ◽  
Salvador Pérez Muñoz

The main objective of this research was to quantify the external load and the internal load in the three types of training most used in competitive tennis (hand-fed buckets, racket-fed buckets and rallies) and to compare them with each other. Six tennis players participated (four male and two female) with a mean age of 16.67 (± 2.73) years. To quantify the loads, POLAR M400 heart rate monitors were used to collect the data related to heart rate, distance covered, and average and maximum speed, and ZEPP TENNIS sensors were used to collect the data related to type of stroke, number of strokes and racket speed for each stroke performed. In addition, the Borg CR-10 Scale (Borg, 1982) was used to record the rating of perceived exertion for each task performed and at the end of each training session. Nine training sessions were carried out in pairs: three for forehand training, three for backhand training and three for combined forehand and backhand training, one for each type of training. The results show that hand-fed bucket training produced the greatest internal load in terms of the number of strokes and average racket speed, whereas rally training produced the greatest external load in terms of speeds and distances covered.


Author(s):  
Alice Iannaccone ◽  
Daniele Conte ◽  
Cristina Cortis ◽  
Andrea Fusco

Internal load can be objectively measured by heart rate-based models, such as Edwards' summated heart rate zones, or subjectively by session rating of perceived exertion. The relationship between internal loads assessed via heart rate-based models and session rating of perceived exertion is usually studied through simple correlations, although the Linear Mixed Model could represent a more appropriate statistical procedure to deal with intrasubject variability. This study aimed to compare conventional correlations and the Linear Mixed Model to assess the relationships between objective and subjective measures of internal load in team sports. Thirteen male youth beach handball players (15.9 ± 0.3 years) were monitored (14 training sessions; 7 official matches). Correlation coefficients were used to correlate the objective and subjective internal load. The Linear Mixed Model was used to model the relationship between objective and subjective measures of internal load by considering each player's individual response as a random effect. Random intercepts were used and then random slopes were added. The likelihood-ratio test was used to compare statistical models. The correlation coefficient for the overall relationship between the objective and subjective internal load data was very large (r = 0.74; ρ = 0.78). The Linear Mixed Model using both random slopes and random intercepts better explained (p < 0.001) the relationship between internal load measures. Researchers are encouraged to apply Linear Mixed Models rather than simple correlations to analyze internal load relationships in team sports, since this approach accounts for the individuality of players.
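As one plausible reading of the modelling strategy described above, the sketch below fits a random-intercept model and a random-intercept-plus-slope model with statsmodels and compares them with a likelihood-ratio test; the data file and column names (srpe, edwards_trimp, player) are assumptions, not the authors' material.

```python
# Hedged sketch: random-intercept vs random-intercept-plus-slope mixed models
# relating subjective (sRPE) to objective (Edwards' TRIMP) internal load.
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

df = pd.read_csv("beach_handball_load.csv")  # assumed columns: srpe, edwards_trimp, player

# Random intercept for each player
m_int = smf.mixedlm("srpe ~ edwards_trimp", df, groups=df["player"]).fit(reml=False)

# Random intercept plus random slope for the objective load
m_slope = smf.mixedlm("srpe ~ edwards_trimp", df, groups=df["player"],
                      re_formula="~edwards_trimp").fit(reml=False)

# Likelihood-ratio test: do random slopes improve the fit?
lr = 2 * (m_slope.llf - m_int.llf)
dof = 2  # added parameters: slope variance + intercept-slope covariance
print(f"LR = {lr:.2f}, p = {stats.chi2.sf(lr, dof):.4f}")
```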


Sports ◽  
2020 ◽  
Vol 8 (12) ◽  
pp. 165
Author(s):  
Eric J. Sobolewski

The aim of this study was to explore the relationships between internal and external load measures in American football. Thirty football players wore a portable integrated monitoring unit for 10 weeks during the fall football season. Relationships between internal and external load measurements were determined. Internal load consisted of heart rate zones, heart rate-derived measures, and session Ratings of Perceived Exertion (sRPE). External load consisted of distance in different speed zones, total distance traveled, and accelerations. There were many significant positive relationships, but the meaningful relationships (r > 0.5) were between heart rate-derived measures of load (Training Impulse and heart rate reserve) and low-intensity movement and total distance. Only accelerations between 1 and 1.99 m·s−2 were moderately correlated with heart rate-derived internal load. RPE values alone did not correlate strongly enough with any of the measures, but sRPE training load (sRPE-TL) correlated with most external load values. Overall, moderate correlations were present between heart rate-derived internal load and both total distance and lower-intensity movement. sRPE-TL values had high correlations but were highly dependent on duration rather than perceived exertion. When addressing load in American football, the duration of the session is a key component in determining internal load, as HR data and RPE alone do not correlate highly with external loads.
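The distinction between RPE alone and sRPE-TL (session RPE multiplied by session duration in minutes) can be illustrated with the short sketch below; the file and column names are hypothetical, not the study's dataset.

```python
# Sketch: session-RPE training load (sRPE-TL = RPE x duration) vs RPE alone,
# each correlated with an external load measure. Data layout is assumed.
import pandas as pd
from scipy import stats

df = pd.read_csv("football_sessions.csv")  # assumed columns: rpe, duration_min, total_distance_m

df["srpe_tl"] = df["rpe"] * df["duration_min"]  # arbitrary units (AU)

r_rpe, _ = stats.pearsonr(df["rpe"], df["total_distance_m"])
r_tl, _ = stats.pearsonr(df["srpe_tl"], df["total_distance_m"])
print(f"RPE alone vs total distance: r = {r_rpe:.2f}")
print(f"sRPE-TL vs total distance:   r = {r_tl:.2f}")
```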


Sports ◽  
2019 ◽  
Vol 7 (5) ◽  
pp. 107
Author(s):  
Filipe Manuel Clemente ◽  
Pantelis Theodoros Nikolaidis ◽  
Thomas Rosemann ◽  
Beat Knechtle

The purpose of this study was to compare internal and external load measures during two regimens (6 x 3' and 3 x 6') of a 5 vs. 5 format of play. Moreover, within-regimen changes (between sets) were also tested. Ten amateur soccer players (age: 19.8 ± 1.6 years; experience: 8.3 ± 2.1 years; height: 177.4 ± 3.8 cm; weight: 71.7 ± 4.2 kg) participated in the experiment. Internal load was measured using the CR-10 rating of perceived exertion (RPE) scale and a heart rate (HR) monitor. Total (TD), running (RD) and sprinting (SD) distances were collected using a validated and reliable 10-Hz GPS unit. Comparisons between regimens revealed that the 3 x 6' regimen was significantly more intense in terms of RPE than the 6 x 3' regimen (p = 0.028; d = 0.351), although no significant differences were found in HR. Significantly greater averages of TD (p < 0.001; d = 0.871) and RD (p = 0.004; d = 0.491) were found in the 6 x 3' regimen. In both regimens, the RPE was significantly lower during the first set than in the remaining sets. On the other hand, the TD was significantly shorter in the last sets than in the earlier ones. In summary, the present study suggests that shorter sets may be beneficial for maintaining higher internal and external load intensities during 5 vs. 5 formats, and that a drop in performance may occur across sets in both regimens.
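A minimal sketch of the type of between-regimen comparison reported above is given below: a paired t-test plus one common pooled-SD variant of Cohen's d, applied to one value per player per regimen. It is an illustration, not the authors' analysis script.

```python
# Sketch: paired comparison of one load variable (e.g. RPE or TD) between the
# 6 x 3' and 3 x 6' regimens, with a pooled-SD Cohen's d. Input arrays hold one
# value per player per regimen, in the same player order.
import numpy as np
from scipy import stats

def compare_regimens(values_6x3: np.ndarray, values_3x6: np.ndarray):
    t, p = stats.ttest_rel(values_3x6, values_6x3)
    pooled_sd = np.sqrt((values_3x6.std(ddof=1) ** 2 + values_6x3.std(ddof=1) ** 2) / 2)
    d = (values_3x6.mean() - values_6x3.mean()) / pooled_sd
    return t, p, d
```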


Author(s):  
Juan Pedro Fuentes-García ◽  
Vicente J. Clemente-Suárez ◽  
Miguel Ángel Marazuela-Martínez ◽  
José F. Tornero-Aguilera ◽  
Santos Villafaina

Objective: The present research aimed to analyse the autonomic, anxiety, perceived exertion, and self-confidence responses during real and simulated flights. Methods: Twelve experienced male pilots (age = 33.08 (5.21) years) from the Spanish Air Force participated in this cross-sectional study. Participants completed a real and a simulated flight mission in randomized order. Heart rate variability (HRV), anxiety, self-confidence, and rating of perceived exertion were collected before and after both manoeuvres, and HRV was also collected during both the simulated and the real flights. Results: When studying the acute effects of real and simulated flights, the mean heart rate, the R-to-R (RR) interval, cognitive anxiety and perceived exertion were significantly affected only by real flights. Furthermore, significant differences in the mean heart rate and the RR interval were found when comparing the acute effects of real and simulated flights (with larger acute effects observed in real flights). Additionally, when comparing HRV values during simulated and real flights, significant differences were observed in the RR interval and the mean heart rate (with a lower RR interval and a higher mean heart rate observed during real flights). Conclusion: Real flights significantly reduced the RR interval and cognitive anxiety while increasing the mean heart rate and the rating of perceived exertion, whereas simulated flights did not induce any significant change in autonomic modulation.
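The time-domain quantities discussed above can be derived directly from the recorded R-R intervals; the sketch below shows that derivation and a paired non-parametric comparison between conditions (the choice of a Wilcoxon test is an assumption for illustration, not necessarily the authors' procedure).

```python
# Sketch: mean R-R interval (ms) and mean heart rate (bpm) from R-R intervals,
# plus a paired comparison of one HRV metric between real and simulated flights
# (one value per pilot). Illustrative only; input data are assumed.
import numpy as np
from scipy import stats

def mean_rr_and_hr(rr_ms: np.ndarray) -> tuple:
    mean_rr = rr_ms.mean()
    return mean_rr, 60_000.0 / mean_rr  # bpm = 60,000 ms per minute / mean RR (ms)

def paired_comparison(real: np.ndarray, simulated: np.ndarray):
    return stats.wilcoxon(real, simulated)
```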


2020 ◽  
pp. 088506662098250
Author(s):  
Chad M. Conner ◽  
William H. Perucki ◽  
Andre Gabriel ◽  
David M. O’Sullivan ◽  
Antonio B. Fernandez

Introduction: There is a paucity of data evaluating the relationship between heart rate (HR) during Targeted Temperature Management (TTM) and neurologic outcomes. Current resuscitation guidelines do not specify a HR goal during TTM. We sought to determine the relationship between HR and neurologic outcomes in a single-center registry dataset. Methods: We retrospectively studied 432 consecutive patients who completed TTM (33°C) after cardiac arrest from 2008 to 2017. We evaluated the relationship between neurologic outcomes and HR during TTM. Pittsburgh Cerebral Performance Categories (CPC) at discharge were used to determine neurological recovery. Statistical analysis included chi-square, Student's t-test and Mann-Whitney U tests. A logistic regression model was created to evaluate the strength of contribution of selected variables to the outcome of interest. Results: Approximately 94,000 HR data points from 432 patients were retrospectively analyzed; the mean HR was 82.17 bpm over the duration of TTM. Favorable neurological outcomes were seen in 160 (37%) patients. The mean HR in patients with a favorable outcome was lower than that in those with an unfavorable outcome (79.98 bpm vs 85.67 bpm, p < 0.001). Patients with an average HR of 60-91 bpm were 2.4 times more likely to have a favorable neurological outcome than those with HRs < 60 or > 91 bpm (odds ratio [OR] = 2.36, 95% confidence interval [CI] 1.61-3.46, p < 0.001). Specifically, mean HRs in the 73-82 bpm range had the greatest rate of favorable outcomes (OR 3.56, 95% CI 1.95-6.50, p < 0.001). Administration of epinephrine, a history of diabetes mellitus and hypertension were all associated with worse neurological outcomes independent of HR. Conclusion: During TTM, mean HRs between 60 and 91 bpm showed a positive association with favorable outcomes. It is unclear whether a specific HR should be targeted during TTM or whether heart rates between 60 and 91 bpm might simply be a sign of less neurological damage.
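A hedged sketch of the kind of adjusted logistic regression described in the Methods is shown below; the registry file, variable names and coding (favorable CPC as 1, covariates as 0/1 indicators) are assumptions for illustration.

```python
# Sketch: logistic regression of favorable neurologic outcome on an in-range
# mean HR indicator (60-91 bpm), adjusted for epinephrine, diabetes and
# hypertension. File and column names are assumed.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ttm_registry.csv")  # assumed columns: mean_hr, favorable, epinephrine, diabetes, hypertension

df["hr_60_91"] = df["mean_hr"].between(60, 91).astype(int)

model = smf.logit("favorable ~ hr_60_91 + epinephrine + diabetes + hypertension", df).fit()
print(np.exp(model.params))      # odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```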


2021 ◽  
Vol 6 (2) ◽  
pp. 53
Author(s):  
Ricardo Lima ◽  
Henrique de Oliveira Castro ◽  
José Afonso ◽  
Gustavo De Conti Teixeira Costa ◽  
Sérgio Matos ◽  
...  

The purpose of this study was to compare the external load, internal load, and technical efficacy between the first and the second matches (M1 and M2) of congested fixtures (two matches in two days), using the number of sets as a moderating factor. An observational analytic research design was adopted. Data from official volleyball matches were collected during the first competitive period of the championship, comprising 14 competitive games within 10 weeks. Ten male elite volleyball athletes (age: 21.7 ± 4.19 years; experience: 6.2 ± 3.8 years; body mass: 85.7 ± 8.69 kg; height: 192.4 ± 6.25 cm; BMI: 23.1 ± 1.40 kg/m2) participated in this study. Players were monitored for external load (number and height of jumps) and internal load (using the rating of perceived exertion, RPE). Additionally, notational analysis provided information about attack efficacy and receptions made during matches. The mixed ANOVA revealed no significant interaction between time (M1 vs. M2) and number of sets for number of jumps per minute (p = 0.235; ηp2 = 0.114), mean jump height (p = 0.076; ηp2 = 0.193), RPE (p = 0.261; ηp2 = 0.106), attack efficacy (p = 0.346; ηp2 = 0.085), positive reception (p = 0.980; ηp2 = 0.002) or perfect reception (p = 0.762; ηp2 = 0.022). In conclusion, congested fixtures do not seem to negatively affect the performance of volleyball players.
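One plausible way to set up the mixed ANOVA described above is sketched below with the pingouin package, treating match (M1 vs. M2) as the within-subject factor and the number of sets as the grouping factor; the long-format data layout and column names are assumptions, not the authors' material.

```python
# Sketch: mixed ANOVA with match (M1 vs. M2) as the within-subject factor and
# number of sets as the grouping factor. Long-format data with one row per
# player per match is assumed.
import pandas as pd
import pingouin as pg

df = pd.read_csv("volleyball_congested.csv")  # assumed columns: player, match, n_sets, jumps_per_min

aov = pg.mixed_anova(data=df, dv="jumps_per_min", within="match",
                     between="n_sets", subject="player", effsize="np2")
print(aov[["Source", "F", "p-unc", "np2"]])
```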


1998 ◽  
Vol 66 (2) ◽  
pp. 383-387 ◽  
Author(s):  
M. Khalid ◽  
W. Haresign ◽  
D. G. Bradley

Abstract. This study consisted of two experiments. In experiment 1, the stress responses of sheep restrained either in a laparoscopy cradle or in a roll-over cradle were compared. The results of this experiment indicated that restraint in a roll-over cradle is less (P < 0·05) stressful than restraint in a laparoscopy cradle when assessed in terms of the elevation and duration of both the mean heart rate and plasma cortisol responses. Experiment 2 compared the stress responses of sheep subjected to restraint in a laparoscopy cradle, restraint in a laparoscopy cradle with intrauterine artificial insemination (AI) by laparoscopy, minimal restraint with cervical AI, or restraint in a roll-over cradle plus foot-trimming. All treatments resulted in significant elevations in both heart rate and plasma cortisol concentrations (P < 0·001). The peak heart rate was significantly (P < 0·05) higher in ewes subjected to cervical AI than in those subjected to intrauterine insemination, with the other treatments intermediate. The peak cortisol response did not differ among treatments. The duration over which both the mean heart rate and plasma cortisol concentrations remained significantly elevated above pre-treatment concentrations did not differ among treatment groups. The results of this study suggest that while restraint using a laparoscopy cradle is more stressful than restraint using a roll-over cradle, the stress inflicted by intrauterine insemination by laparoscopy itself is no greater than that due to restraint using the laparoscopy cradle alone, cervical AI, or the management practice of foot-trimming using a roll-over cradle.


2021 ◽  
Author(s):  
Étienne Chassé ◽  
Daniel Théoret ◽  
Martin P Poirier ◽  
François Lalonde

ABSTRACT Introduction: Members of the Canadian Armed Forces (CAF) are required to meet the minimum standards of the Fitness for Operational Requirements of CAF Employment (FORCE) job-based simulation test (JBST) and must possess the capacity to perform other common essential tasks. One of those tasks is to perform basic fire management tasks during fire emergencies to mitigate damage and reduce the risk of injuries and/or death until professional firefighters arrive at the scene. To date, however, the physiological demands of common firefighting tasks have mostly been assessed in professional firefighters, thus rendering the transferability of these demands to the general military population unclear. This pilot study aimed to quantify, for the first time, the physiological demands of basic fire management tasks in the military, to determine whether they are reflected in the FORCE JBST minimum standard. We hypothesized that the physiological demands of basic fire management tasks within the CAF are below the physiological demands of the FORCE JBST minimum standard and, as such, lower than the demands of professional firefighting. Materials and methods: To achieve this, 21 CAF members (8 females; 13 males; mean [SD] age: 33 [10] years; height: 174.5 [10.5] cm; weight: 85.4 [22.1] kg; estimated maximal oxygen uptake [V̇O2peak]: 44.4 [7.4] mL·kg−1·min−1) participated in a realistic, but physically demanding, JBST developed by CAF professional firefighting subject matter experts. The actions included lifting, carrying, and manipulating a 13-kg powder fire extinguisher and connecting, coupling, and dragging a 38-mm fire hose over 30 m. The rate of oxygen uptake (V̇O2), heart rate, and percentage of heart rate reserve were measured continuously during two task simulation trials, which were interspersed with a recovery period. Rating of perceived exertion (6 = no exertion; 20 = maximal exertion) was measured upon completion of both task simulations. Peak V̇O2 (V̇O2peak) was estimated based on the results of the FORCE JBST. Results: The mean (SD) duration of both task simulation trials was 3:39 (0:19) min:s, whereas the rest period between the two trials was 62 (19) minutes. The mean V̇O2 was 21.1 (4.7) mL·kg−1·min−1 across trials, which represented 52.1 (12.2)% of V̇O2peak and ∼81% of the FORCE JBST. This was paralleled by a mean heart rate of 136 (18) beats·min−1, a mean percentage of heart rate reserve of 61.2 (10.8)%, and a mean rating of perceived exertion of 11 ± 2. Other physical components of the JBST consisted of lifting, carrying, and manipulating a 13-kg load for ∼59 seconds, which represents 65% of the load of the FORCE JBST. The external resistance of the fire hose drag portion increased up to 316 N, translating to a total of 6205 N over 30 m, which represents 96% of the drag force measured during the FORCE JBST. Conclusions: Our findings demonstrate that the physiological demands of basic fire management tasks in the CAF are of moderate intensity and are reflected in the CAF physical fitness standard. As such, CAF members who achieve the minimum standard on the FORCE JBST are deemed capable of physically performing basic fire management tasks during fire emergencies.
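The relative-intensity measures used above follow standard definitions; the sketch below shows those formulas (percentage of heart rate reserve and percentage of V̇O2peak) with clearly hypothetical resting and maximal heart rates, since those values are not reported in the abstract.

```python
# Worked sketch of the relative-intensity formulas referenced above. The resting
# and maximal heart rates in the example are hypothetical, not study data.

def pct_hr_reserve(hr: float, hr_rest: float, hr_max: float) -> float:
    """Percentage of heart rate reserve: (HR - HRrest) / (HRmax - HRrest) * 100."""
    return (hr - hr_rest) / (hr_max - hr_rest) * 100.0

def pct_vo2peak(vo2: float, vo2peak: float) -> float:
    """Oxygen uptake expressed as a percentage of peak oxygen uptake."""
    return vo2 / vo2peak * 100.0

# Example: mean task HR of 136 beats/min with hypothetical HRrest = 60 and HRmax = 185
print(round(pct_hr_reserve(136.0, 60.0, 185.0), 1))  # 60.8 % of heart rate reserve
```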

