The Effects of Match Congestion on Physical Performance in Football Referees

Author(s):  
Víctor Moreno-Perez ◽  
Javier Courel-Ibáñez ◽  
Juan Del Coso ◽  
Javier Sánchez-Sánchez

Abstract
We examined the changes in performance during congested (two matches within a 7-day interval) and non-congested (one match within a ≥7-day interval) fixtures in 17 elite football (soccer) referees during 181 official matches. External demands comprised 20 GPS-based metrics. Internal load was assessed by heart rate and rating of perceived exertion. Compared to non-congested fixtures, referees decreased their running distance at 21–24 km·h−1 (p=0.027, effect size [ES]=0.41) and >24 km·h−1 (p=0.037, ES=0.28), the number of sprints (p=0.012, ES=0.29), and distance sprinting (p=0.022, ES=0.29) in congested matches. Most play metrics were lower in congested versus non-congested fixtures with low-to-moderate ES. During the 2nd half of non-congested fixtures, referees covered greater distances at low-speed running (p=0.025, ES=0.47). Match congestion due to officiating two matches less than a week apart caused a notable decrease in match running activity in professional football referees, especially at speeds above 21 km·h−1. These data reiterate the need for specific conditioning and post-match recovery strategies in high-level referees to ensure optimal judgment performance, favouring the quality of the competition. Governing bodies should take these outcomes into account when designating referees for a match.
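For readers unfamiliar with GPS-based speed-zone metrics, the binning behind figures like "distance at 21–24 km·h−1" can be sketched in a few lines. The sampling rate, zone labels, and sample data below are illustrative assumptions, not the authors' processing pipeline:

```python
# Sketch: accumulating distance per running-speed zone from instantaneous
# GPS speed samples, using the 21 and 24 km/h thresholds named in the
# abstract. A 10 Hz sampling rate is assumed for illustration.

def distance_per_zone(speeds_kmh, hz=10, thresholds=(21.0, 24.0)):
    """Return metres covered in each speed zone.

    speeds_kmh -- instantaneous speeds in km/h sampled at `hz` Hz
    thresholds -- zone boundaries in km/h (here: 21-24 and >24 km/h)
    """
    dt = 1.0 / hz                      # seconds per sample
    low, high = thresholds
    zones = {"<21": 0.0, "21-24": 0.0, ">24": 0.0}
    for v in speeds_kmh:
        d = v / 3.6 * dt               # km/h -> m/s, times dt -> metres
        if v > high:
            zones[">24"] += d
        elif v >= low:
            zones["21-24"] += d
        else:
            zones["<21"] += d
    return zones
```

Summing per-sample distances like this is how most vendor software derives zone totals; only the thresholds differ between studies.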

Author(s):  
Alice Iannaccone ◽  
Daniele Conte ◽  
Cristina Cortis ◽  
Andrea Fusco

Internal load can be objectively measured by heart rate-based models, such as Edwards’ summated heart rate zones, or subjectively by session rating of perceived exertion. The relationship between internal loads assessed via heart rate-based models and session rating of perceived exertion is usually studied through simple correlations, although the Linear Mixed Model could represent a more appropriate statistical procedure to deal with intrasubject variability. This study aimed to compare conventional correlations and the Linear Mixed Model to assess the relationships between objective and subjective measures of internal load in team sports. Thirteen male youth beach handball players (15.9 ± 0.3 years) were monitored (14 training sessions; 7 official matches). Correlation coefficients were used to assess the relationship between objective and subjective internal load. The Linear Mixed Model was used to model the relationship between objective and subjective measures of internal load by considering each player's individual response as a random effect. Random intercepts were used and then random slopes were added. The likelihood-ratio test was used to compare statistical models. The correlation coefficient for the overall relationship between the objective and subjective internal data was very large (r = 0.74; ρ = 0.78). The Linear Mixed Model using both random slopes and random intercepts better explained (p < 0.001) the relationship between internal load measures. Researchers are encouraged to apply Linear Mixed Models rather than correlations to analyze internal load relationships in team sports, since they allow for the consideration of the individuality of players.
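The pitfall that motivates the random effects can be shown without any statistics library: when each player has his own baseline, a pooled correlation can contradict the within-player relationship. The numbers below are synthetic, chosen only to make the effect visible; they are not the study's data:

```python
# Synthetic illustration: within each player, subjective load tracks
# objective load perfectly (r = 1.0), but player-specific baselines make
# the pooled correlation negative. Random intercepts/slopes in a Linear
# Mixed Model exist precisely to separate these two levels.

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# (objective load, subjective load) per player, arbitrary units
players = {
    "P1": ([1, 2, 3], [11, 12, 13]),
    "P2": ([4, 5, 6], [9, 10, 11]),
    "P3": ([7, 8, 9], [7, 8, 9]),
}

within_rs = [pearson(x, y) for x, y in players.values()]
pooled_x = [v for x, _ in players.values() for v in x]
pooled_y = [v for _, y in players.values() for v in y]
pooled_r = pearson(pooled_x, pooled_y)
# every within-player r is 1.0, yet pooled_r is negative
```

In real monitoring data the reversal is rarely this extreme, but the same between-player offsets inflate or deflate a pooled r, which is why the abstract's likelihood-ratio comparison favoured the mixed model.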


2020 ◽  
Vol 15 (7) ◽  
pp. 958-963
Author(s):  
Paulo H.C. Mesquita ◽  
Emerson Franchini ◽  
Marco A. Romano-Silva ◽  
Guilherme M. Lage ◽  
Maicon R. Albuquerque

Purpose: To investigate the effects of anodal transcranial direct current stimulation (a-tDCS) on the aerobic performance, heart rate (HR), and rating of perceived exertion (RPE) of highly trained taekwondo athletes. Methods: Twelve (8 men and 4 women) international/national-level athletes received a-tDCS or sham treatment over the M1 location in a randomized, single-blind crossover design. The stimulation was delivered at 1.5 mA for 15 min using an extracephalic bihemispheric montage. Athletes performed the progressive-specific taekwondo test 10 min after stimulation. HR was monitored continuously during the test, and RPE was registered at the end of each stage and at test cessation. Results: There were no significant differences between sham and a-tDCS in time to exhaustion (14.6 and 14.9, respectively, P = .53, effect size = 0.15) and peak kicking frequency (52 and 53.6, respectively, P = .53, effect size = 0.15) or in HR (P > .05) and RPE responses (P > .05). Conclusions: Extracephalic bihemispheric a-tDCS over M1 did not influence the aerobic performance of taekwondo athletes or their psychophysiological responses, so athletes and staff should be cautious when using it in a direct-to-consumer manner.


Author(s):  
Olli-Pekka Nuuttila ◽  
Heikki Kyröläinen ◽  
Keijo Häkkinen ◽  
Ari Nummela

Abstract
This study investigated acute responses to four running sessions performed at different intensity zones, and recovery over the following 24 h, using supine heart rate variability, countermovement jump, and a submaximal running test. A total of 24 recreationally endurance-trained male subjects performed 90 min low-intensity (LIT), 30 min moderate-intensity (MOD), 6×3 min high-intensity interval (HIIT) and 10×30 s supramaximal-intensity interval (SMIT) exercises on a treadmill. Heart rate variability decreased acutely after all sessions, and the decrease was greater after MOD compared to LIT and SMIT (p<0.001; p<0.01) and HIIT compared to LIT (p<0.01). Countermovement jump decreased only after LIT (p<0.01) and SMIT (p<0.001), and the relative changes were different compared to MOD (p<0.01) and HIIT (p<0.001). Countermovement jump remained decreased at 24 h after SMIT (p<0.05). Heart rate during the submaximal running test rebounded below the baseline 24 h after all sessions (p<0.05), while the rating of perceived exertion during the running test remained elevated after HIIT (p<0.05) and SMIT (p<0.01). The current results highlight differences in the physiological demands of the running sessions, and distinct recovery patterns of the measured aspects of performance. Based on these results, assessments of performance and recovery from multiple perspectives may provide valuable information for endurance athletes, and help to improve the quality of training monitoring.
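The abstract does not state which heart rate variability index was computed from the supine recordings; RMSSD is one widely used vagally mediated marker, sketched here purely for illustration:

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms).

    A common short-term HRV index reflecting vagal activity. The study's
    exact HRV metric is not given in the abstract, so this is an
    illustrative choice, not the authors' method.
    """
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

Lower post-session values relative to a rested baseline are what "heart rate variability decreased acutely" refers to in practice.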


Author(s):  
Jessica Lynne Bigg ◽  
Alexander Shand Davis Gamble ◽  
Lawrence L. Spriet

Abstract
This study quantified internal load, using sessional rating of perceived exertion (sRPE) and heart-rate derived training impulse (TRIMP), of female varsity ice hockey players throughout a season. Twenty-four female (19.8±1.4 yr, 68.0±6.9 kg) varsity ice hockey players participated in this prospective cohort study. Internal load was captured using sRPE and TRIMP for each on-ice session. Internal load was significantly higher (p<0.05) for games (sRPE: 324±202 AU, TRIMP: 95±60 AU) compared to training (sRPE: 248±120 AU, TRIMP: 68±32 AU). Overall, goalies had a higher internal load than forwards (sRPE and TRIMP) and defence (TRIMP), with no differences between forwards and defence. Micro-cycle periodization was present, with training sessions several days prior to game days having the highest internal load (sRPE and TRIMP) and tapering down as subsequent training sessions approached game day. For the meso-cycle assessment, for both training and competition combined, the post-season sRPE was greater than the pre-season (p=0.002) and regular season (p<0.001). Lastly, the association between sRPE and TRIMP revealed a large, statistically significant relationship (r=0.592, p<0.001). In summary, internal load was greater during competition than training, session loads indicated prioritization around game days, the post-season phase demanded the highest internal load, and sRPE and TRIMP were strongly correlated.
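The two load measures compared above have simple conventional definitions: sRPE load is the CR-10 rating multiplied by session duration in minutes, while TRIMP has several variants and the abstract does not specify which was used. Banister's formulation is shown below as one common choice, not as the authors' method:

```python
import math

def srpe(rpe_cr10, duration_min):
    """Session RPE load (AU): CR-10 rating x session duration in minutes."""
    return rpe_cr10 * duration_min

def banister_trimp(duration_min, hr_ex, hr_rest, hr_max, female=True):
    """Banister's TRIMP (AU), one common HR-based load model.

    hr_ex   -- mean exercising heart rate for the session (bpm)
    hr_rest -- resting heart rate (bpm)
    hr_max  -- maximal heart rate (bpm)
    The exponential weighting constants differ by sex in Banister's model.
    """
    x = (hr_ex - hr_rest) / (hr_max - hr_rest)   # fractional HR reserve
    a, b = (0.86, 1.67) if female else (0.64, 1.92)
    return duration_min * x * a * math.exp(b * x)
```

Because sRPE scales linearly with effort while exponential TRIMP models weight time near maximal heart rate much more heavily, only a moderate-to-large correlation between the two, as reported here, is expected rather than agreement.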


2016 ◽  
Vol 11 (8) ◽  
pp. 1024-1028 ◽  
Author(s):  
Sam S.X. Wu ◽  
Jeremiah J. Peiffer ◽  
Peter Peeling ◽  
Jeanick Brisswalter ◽  
Wing Y. Lau ◽  
...  

Purpose: To investigate the effect of 3 swim-pacing profiles on subsequent performance during a sprint-distance triathlon (SDT). Methods: Nine competitive/trained male triathletes completed 5 experimental sessions including a graded running exhaustion test, a 750-m swim time trial (STT), and 3 SDTs. The swim times of the 3 SDTs were matched, but pacing was manipulated to induce positive (ie, speed gradually decreasing from 92% to 73% STT), negative (ie, speed gradually increasing from 73% to 92% STT), or even pacing (constant 82.5% STT). The remaining disciplines were completed at a self-selected maximal pace. Speed over the entire triathlon, power output during the cycle discipline, rating of perceived exertion (RPE) for each discipline, and heart rate during the cycle and run were determined. Results: Faster cycle and overall triathlon times were achieved with positive swim pacing (30.5 ± 1.8 and 65.9 ± 4.0 min, respectively), as compared with the even (31.4 ± 1.0 min, P = .018 and 67.7 ± 3.9 min, P = .034, effect size [ES] = 0.46, respectively) and negative (31.8 ± 1.6 min, P = .011 and 67.3 ± 3.7 min, P = .041, ES = 0.36, respectively) pacing. Positive swim pacing elicited a lower RPE (9 ± 2) than negative swim pacing (11 ± 2, P = .014). No differences were observed in the other measured variables. Conclusions: Positive swim pacing may improve overall SDT performance and should be considered by both elite and age-group athletes during racing.


2019 ◽  
Vol 14 (10) ◽  
pp. 1331-1337 ◽  
Author(s):  
Aaron T. Scanlan ◽  
Robert Stanton ◽  
Charli Sargent ◽  
Cody O’Grady ◽  
Michele Lastella ◽  
...  

Purpose: To quantify and compare internal and external workloads in regular and overtime games and examine changes in relative workloads during overtime compared with other periods in overtime games in male basketball players. Methods: Starting players for a semiprofessional male basketball team were monitored during 2 overtime games and 2 regular games (nonovertime) with similar contextual factors. Internal (rating of perceived exertion and heart-rate variables) and external (PlayerLoad and inertial movement analysis variables) workloads were quantified across games. Separate linear mixed-models and effect-size analyses were used to quantify differences in variables between regular and overtime games and between game periods in overtime games. Results: Session rating-of-perceived-exertion workload (P = .002, effect size 2.36, very large), heart-rate workload (P = .12, 1.13, moderate), low-intensity change-of-direction events to the left (P = .19, 0.95, moderate), medium-intensity accelerations (P = .12, 1.01, moderate), and medium-intensity change-of-direction events to the left (P = .10, 1.06, moderate) were higher during overtime games than during regular games. Overtime periods also exhibited reductions in relative PlayerLoad (first quarter P = .03, −1.46, large), low-intensity accelerations (first quarter P = .01, −1.45, large; second quarter P = .15, −1.22, large), and medium-intensity accelerations (first quarter P = .09, −1.32, large) compared with earlier periods. Conclusions: Overtime games disproportionately elevate perceptual, physiological, and acceleration workloads compared with regular games in starting basketball players. Players also perform at lower external intensities during overtime periods than earlier quarters during basketball games.


2016 ◽  
Vol 11 (5) ◽  
pp. 587-593 ◽  
Author(s):  
Richard Akenhead ◽  
George P. Nassis

Training load (TL) is monitored with the aim of making evidence-based decisions on appropriate loading schemes to reduce injuries and enhance team performance. However, little is known in detail about the variables of load and methods of analysis used in high-level football. Therefore, the aim of this study was to provide information on the practices and practitioners’ perceptions of monitoring in professional clubs. Eighty-two high-level football clubs from Europe, the United States, and Australia were invited to answer questions relating to how TL is quantified, how players’ responses are monitored, and their perceptions of the effectiveness of monitoring. Forty-one responses were received. All teams used GPS and heart-rate monitors during all training sessions, and 28 used rating of perceived exertion. The top-5-ranking TL variables were acceleration (various thresholds), total distance, distance covered above 5.5 m/s, estimated metabolic power, and heart-rate exertion. Players’ responses to training are monitored using questionnaires (68% of clubs) and submaximal exercise protocols (41%). Differences in expected vs actual effectiveness of monitoring were 23% and 20% for injury prevention and performance enhancement, respectively (P < .001, d = 1.0–1.4). Of the perceived barriers to effectiveness, limited human resources scored highest, followed by coach buy-in. The discrepancy between expected and actual effectiveness appears to be due to suboptimal integration with coaches, insufficient human resources, and concerns over the reliability of assessment tools. Future approaches should critically evaluate the usefulness of current monitoring tools and explore methods of reducing the identified barriers to effectiveness.


2021 ◽  
Vol 12 ◽  
Author(s):  
Rasmus Pind ◽  
Peter Hofmann ◽  
Evelin Mäestu ◽  
Eno Vahtra ◽  
Priit Purge ◽  
...  

Purpose: The aim of this study was to investigate the interaction of training load quantification using heart rate (HR)- and rating of perceived exertion (RPE)-based methodologies, and the relationship between internal training load parameters and subjective training status (Fatigue), in high-level rowers during a volume-increased low-intensity training period. Methods: Training data from 19 high-level rowers (age 23.5 ± 5.9 years; maximal oxygen uptake 58.9 ± 5.8 ml·min−1·kg−1) were collected during a 4-week volume-increased training period. All individual training sessions were analyzed to quantify training intensity distribution based on the HR time-in-zone method (i.e., HR Z1, HR Z2, and HR Z3) determined by the first and second ventilatory thresholds (VT1/VT2). Internal training load was calculated using session RPE (sRPE) to categorize training load by effort (i.e., sRPE1, sRPE2, and sRPE3). The Recovery-Stress Questionnaire for Athletes (RESTQ-Sport) was administered after every week of the study period. Results: No differences were found between the respective HR- and effort-based zone distributions during the baseline week (p > 0.05). Compared to HR Z1, sRPE1 was significantly lower in weeks 2–4 (p < 0.05), while sRPE2 was higher in weeks 2–3 compared to HR Z2 (p < 0.05), and in week 4 a tendency (p = 0.06) toward a higher amount of sRPE3 compared to HR Z3 was found. There were significant increases in RESTQ-Sport stress scales and decreases in recovery scales, mostly during weeks 3 and 4.
Increases in the Fatigue scale were associated with the amounts of sRPE2 and sRPE3 (p = 0.011 and p = 0.008, respectively), while no associations with Fatigue were found for HR-based session quantification or for external training load variables. Conclusion: During a low-intensity 4-week training period with increasing volume, RPE-based training quantification indicated a shift toward harder session ratings with unchanged HR zone distributions. Moderate- and Hard-rated sessions were related to increases in Fatigue. Session RPE and effort-based training load could be practical measures, in combination with HR, for monitoring adaptation during an increased-volume, low-intensity training period in endurance athletes.
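The HR time-in-zone method referred to above amounts to classifying each heart-rate sample against the two ventilatory-threshold heart rates and summing the time per zone. The sampling interval and threshold values below are illustrative assumptions; the study's recording setup is not given in the abstract:

```python
# Sketch of the HR time-in-zone method: Z1 below VT1, Z2 between VT1 and
# VT2, Z3 at or above VT2. A 1-second sampling interval is assumed.

def time_in_zones(hr_samples, vt1_hr, vt2_hr, dt_s=1.0):
    """Return minutes spent in [Z1, Z2, Z3] for a session's HR trace.

    hr_samples -- heart rate samples (bpm) at a fixed interval dt_s (s)
    vt1_hr     -- heart rate at the first ventilatory threshold (bpm)
    vt2_hr     -- heart rate at the second ventilatory threshold (bpm)
    """
    zones = [0.0, 0.0, 0.0]
    for hr in hr_samples:
        if hr < vt1_hr:
            zones[0] += dt_s
        elif hr < vt2_hr:
            zones[1] += dt_s
        else:
            zones[2] += dt_s
    return [t / 60.0 for t in zones]   # seconds -> minutes per zone
```

The effort-based sRPE1–sRPE3 categories, by contrast, assign a whole session's load to one bin by its rating, which is why the two distributions can drift apart as cumulative fatigue makes sessions feel harder at unchanged heart rates.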


Author(s):  
Jessica L. Bigg ◽  
Alexander S.D. Gamble ◽  
Lawrence L. Spriet

Purpose: The purpose of this study was to quantify the internal load of male varsity ice hockey players, using both sessional rating of perceived exertion (sRPE) and the heart rate–derived physiological measure of training impulse (TRIMP), during training sessions and competitions throughout an entire season. Methods: Twenty-seven male varsity ice hockey players (22.1 [1.1] y, 85.9 [5.4] kg, 181.3 [5.1] cm) were included in this longitudinal prospective cohort study. Results: The internal load was significantly higher (P < .001) for games (sRPE: 403 [184] arbitrary units [AU], TRIMP: 98 [59] AU) compared with training sessions (sRPE: 281 [130] AU, TRIMP: 71 [35] AU). The regular season had the highest internal load compared with the preseason and postseason. There was evidence of microcycle periodization with training sessions several days prior to game days having the highest internal load (both sRPE and TRIMP) and tapering down as the subsequent training sessions approached game day. For positional comparisons, the goalies had higher sRPE (346 [151] AU, P < .001) and TRIMP (99 [64] AU, P < .001) compared with defense (sRPE: 295 [130] AU, TRIMP: 65 [29] AU) and forwards (sRPE: 264 [123] AU, TRIMP: 70 [30] AU) for training sessions, but no significant differences were present for competitions. Finally, there was an overall moderate and statistically significant relationship between the sRPE and TRIMP internal load measures (r = .434, P < .001). Conclusions: Internal load was greater during competitions versus training sessions in male varsity ice hockey players, and the microcycle assessment demonstrated that training sessions were tailored to game day. Mesocycle assessment revealed the highest internal loads during the regular season due to dense game scheduling and a short season.


2018 ◽  
Vol 13 (9) ◽  
pp. 1169-1174 ◽  
Author(s):  
Paul G. Montgomery ◽  
Brendan D. Maloney

Purpose: To determine the demands of elite male and female 3 × 3 basketball games and compare these between various competition levels. Methods: A total of 361 males and 208 females competing in the Under-18 World, Senior European, and World Championships and selected professional tournaments had game demands assessed by wearable technology (global positioning system, inertial measurement, heart rate) along with postgame blood lactate and perceived responses. Differences in the means were compared using magnitude-based inferences and reported with effect size and 90% confidence limits, along with the percentage difference (effect size; ±90% confidence limits, %) of log-transformed data. Results: PlayerLoad™ and PlayerLoad·min−1 during play was 127.5 (31.1) and 6.7 (1.5) for males and 128.5 (32.0) and 6.5 (1.4) for females, respectively, with small differences between junior, senior, and professional levels. There were small differences in accelerations >3.5 m·s−1 between competition levels up to 0.31; ±0.20, 6.9% for males and 0.29; ±0.19, 10.8% for females and for decelerations >3.5 m·s−1, 0.29; ±0.19, 15.6% for males and 0.26; ±0.19, 5.4% for females, with European Championships generally greater than other levels. Average game heart rate was 165 (18) and 164 (12) beats·min−1 for males and females, with no difference between levels. Average rating of perceived exertion was 5.7 (2.1) and 5.4 (2.0) for males and females. Conclusions: 3 × 3 basketball games require high-speed inertial movements within limited distance, creating a relatively high physiological response. Practitioners working with 3 × 3 players should endeavor to focus on the attributes that will improve these player characteristics for greater success.

