High-Acceleration Movement, Muscle Damage, and Perceived Exertion in Basketball Games

Author(s):  
Takeshi Koyama
Akira Rikukawa
Yasuharu Nagano
Shogo Sasaki
Hiroshi Ichikawa
...

Purpose: To evaluate the effect of the number of high-acceleration movements on muscle damage and the rating of perceived exertion (RPE) in basketball games. Methods: Twenty-one male collegiate basketball players (mean age, 20.0 [1.0] y) were included. A triaxial accelerometer was used to measure acceleration in basketball-simulated scrimmages. To detect higher physical loads during the game, the resultant acceleration was calculated, and 3 thresholds were set: >4G, >6G, and >8G resultant acceleration. The number of extracted movements was counted at each acceleration threshold. Plasma creatine kinase (CK) level (a marker of muscle damage) was measured before and 24 hours after the match, and the session-RPE load was calculated within 30 minutes after the match. Pearson product-moment correlations with 95% confidence intervals were used to determine the relationships between the number of high-acceleration movements and plasma CK and session-RPE load. Results: A significant correlation was observed between the number of high-acceleration movements >8G and CK level (r = .74; 95% confidence interval, 0.44–0.89; P < .0001). Furthermore, the correlation coefficient between acceleration and CK increased with the acceleration threshold (>4G: r = .65; >6G: r = .69). Conversely, the correlation coefficient between acceleration and session-RPE load decreased as the acceleration threshold increased (>4G: r = .72; >6G: r = .52; >8G: r = .43). Conclusions: Session-RPE reflects the total amount of movement, whereas high-acceleration movements reflect momentary, large impact loads or intensities; the two measures therefore evaluate different factors. Basketball coaching and conditioning professionals are recommended to combine acceleration and session-RPE measures when monitoring athlete load.
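The two computations this abstract relies on, the session-RPE load (post-session RPE multiplied by session duration, per Foster's method) and a Pearson correlation with a Fisher-transform confidence interval, can be sketched as below. The player data and function names are illustrative, not taken from the study:

```python
import math

def session_rpe_load(rpe, duration_min):
    """Session-RPE load (Foster): post-session RPE (CR-10 scale) x minutes."""
    return rpe * duration_min

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def fisher_ci95(r, n):
    """95% confidence interval for r via the Fisher z-transform."""
    z, se = math.atanh(r), 1 / math.sqrt(n - 3)
    return math.tanh(z - 1.96 * se), math.tanh(z + 1.96 * se)

# Hypothetical data: per-player counts of >8G movements and post-match CK rise (U/L)
moves_8g = [12, 18, 25, 9, 30, 22, 15]
ck_rise = [110, 190, 260, 95, 310, 240, 150]
r = pearson_r(moves_8g, ck_rise)
lo, hi = fisher_ci95(r, len(moves_8g))
```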

Author(s):  
Markus N.C. Williams
Vincent J. Dalbo
Jordan L. Fox
Cody J. O’Grady
Aaron T. Scanlan

Purpose: To compare weekly training and game demands according to playing position in basketball players. Methods: A longitudinal, observational study was adopted. Semiprofessional, male basketball players categorized as backcourt (guards; n = 4) and frontcourt players (forwards/centers; n = 4) had their weekly workloads monitored across an entire season. External workload was determined using microsensors and included PlayerLoad™ (PL) and inertial movement analysis variables. Internal workload was determined using heart rate to calculate absolute and relative summated-heart-rate-zones workload and rating of perceived exertion (RPE) to calculate session-RPE workload. Comparisons between weekly training and game demands were made using linear mixed models and effect sizes in each positional group. Results: In backcourt players, higher relative PL (P = .04, very large) and relative summated-heart-rate-zones workload (P = .007, very large) were evident during training, while greater session-RPE workload (P = .001, very large) was apparent during games. In frontcourt players, greater PL (P < .001, very large), relative PL (P = .019, very large), peak PL intensities (P < .001, moderate), high-intensity inertial movement analysis events (P = .002, very large), total inertial movement analysis events (P < .001, very large), summated-heart-rate-zones workload (P < .001, very large), RPE (P < .001, very large), and session-RPE workload (P < .001, very large) were evident during games. Conclusions: Backcourt players experienced similar demands between training and games across several variables, with higher average workload intensities during training. Frontcourt players experienced greater demands across all variables during games than training. These findings emphasize the need for position-specific preparation strategies leading into games in basketball teams.


2019
Vol 14 (7)
pp. 941-948
Author(s):
Henrikas Paulauskas
Rasa Kreivyte
Aaron T. Scanlan
Alexandre Moreira
Laimonas Siupsinskas
...

Purpose: To assess the weekly fluctuations in workload and differences in workload according to playing time in elite female basketball players. Methods: A total of 29 female basketball players (mean [SD] age 21 [5] y, stature 181 [7] cm, body mass 71 [7] kg, playing experience 12 [5] y) belonging to the 7 women’s basketball teams competing in the first-division Lithuanian Women’s Basketball League were recruited. Individualized training loads (TLs) and game loads (GLs) were assessed using the session rating of perceived exertion after each training session and game during the entire in-season phase (24 wk). Percentage changes in total weekly TL (weekly TL + GL), weekly TL, weekly GL, chronic workload, acute:chronic workload ratio, training monotony, and training strain were calculated. Mixed linear models were used to assess differences for each dependent variable, with playing time (low vs high) used as a fixed factor and subject, week, and team as random factors. Results: The highest changes in total weekly TL, weekly TL, and acute:chronic workload ratio were evident in week 13 (47%, 120%, and 49%, respectively). Chronic workload showed weekly changes ≤10%, whereas monotony and training strain registered their highest fluctuations in weeks 17 (34%) and 15 (59%), respectively. A statistically significant difference in GL was evident between players completing low and high playing times (P = .026, moderate), whereas no significant differences (P > .05) were found for all other dependent variables. Conclusions: Coaches of elite women’s basketball teams should monitor weekly changes in workload during the in-season phase to identify weeks that may predispose players to unwanted spikes and adjust player workload according to playing time.
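The weekly metrics named above can be computed from daily session-RPE loads. A minimal sketch using Foster's common definitions (monotony = mean/SD of daily load; strain = weekly load × monotony; a coupled acute:chronic ratio of the current week over the rolling 4-week mean); the study's exact implementation may differ:

```python
from statistics import mean, stdev

def weekly_metrics(daily_loads):
    """Weekly total, training monotony, and training strain from daily loads (AU)."""
    total = sum(daily_loads)
    monotony = mean(daily_loads) / stdev(daily_loads)  # mean / SD of daily load
    strain = total * monotony
    return total, monotony, strain

def acwr(weekly_totals):
    """Acute:chronic workload ratio: current week over the rolling 4-week mean
    (coupled version, which includes the acute week in the chronic average)."""
    return weekly_totals[-1] / mean(weekly_totals[-4:])
```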


2020
Vol 15 (10)
pp. 1476-1479
Author(s):
Jordan L. Fox
Cody J. O’Grady
Aaron T. Scanlan

Purpose: To compare the concurrent validity of session-rating of perceived exertion (sRPE) workload determined face-to-face and via an online application in basketball players. Methods: Sixteen semiprofessional, male basketball players (21.8 [4.3] y, 191.2 [9.2] cm, 85.0 [15.7] kg) were monitored during all training sessions across the 2018 (8 players) and 2019 (11 players) seasons in a state-level Australian league. Workload was reported as accumulated PlayerLoad (PL), summated-heart-rate-zones (SHRZ) workload, and sRPE. During the 2018 season, rating of perceived exertion (RPE) was determined following each session via individualized face-to-face reporting. During the 2019 season, RPE was obtained following each session via a phone-based, online application. Repeated-measures correlations with 95% confidence intervals were used to determine the relationships between sRPE collected using each method and other workload measures (PL and SHRZ) as indicators of concurrent validity. Results: Although all correlations were significant (P < .05), sRPE obtained using face-to-face reporting demonstrated stronger relationships with PL (r = .69 [.07], large) and SHRZ (r = .74 [.06], very large) compared with the online application (r = .29 [.25], small [PL] and r = .34 [.22], moderate [SHRZ]). Conclusions: Concurrent validity of sRPE workload was stronger when players reported RPE in an individualized, face-to-face manner compared with using a phone-based online application. Given the weaker relationships with other workload measures, basketball practitioners should be cautious when using player training workloads predicated on RPE obtained via online applications.
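The repeated-measures correlation used here (Bakdash and Marusich's rmcorr) removes between-player variance before correlating. For the point estimate it is equivalent to a Pearson correlation on within-subject mean-centered values; the published method derives the coefficient from an ANCOVA and adjusts the degrees of freedom, so the sketch below is illustrative only:

```python
from statistics import mean

def rm_corr(subjects, x, y):
    """Point estimate of the repeated-measures correlation: Pearson r on
    values centered on each subject's own means."""
    ids = set(subjects)
    mx = {s: mean(xi for sj, xi in zip(subjects, x) if sj == s) for s in ids}
    my = {s: mean(yi for sj, yi in zip(subjects, y) if sj == s) for s in ids}
    cx = [xi - mx[s] for s, xi in zip(subjects, x)]
    cy = [yi - my[s] for s, yi in zip(subjects, y)]
    num = sum(a * b for a, b in zip(cx, cy))
    den = (sum(a * a for a in cx) * sum(b * b for b in cy)) ** 0.5
    return num / den
```

Centering on each player's own means ensures that stable between-player differences in how RPE is reported cannot inflate or mask the within-player relationship between sRPE and PlayerLoad or SHRZ.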


2017
Vol 57 (1)
pp. 139-146
Author(s):
James Fisher
Thomas Clark
Katherine Newman-Judd
Josh Arnold
James Steele

Time-trials represent an ecologically valid approach to the assessment of endurance performance. Such information is useful in the application of testing protocols and the estimation of sample sizes required for research/magnitude-based inference methods. The present study aimed to investigate the intra-subject variability of 5-km time-trial running performance in trained runners. Six competitive trained male runners (age = 33.8 ± 10.1 years; stature = 1.78 ± 0.01 m; body mass = 69.0 ± 10.4 kg; V̇O2max = 62.6 ± 11.0 mL·kg−1·min−1) completed an incremental exercise test to volitional exhaustion followed by 5 × 5-km time-trials (including a familiarisation trial), individually spaced by 48 hours. The time taken to complete each trial, heart rate, rating of perceived exertion, and speed were all assessed. The intra-subject absolute standard error of measurement and the coefficient of variation were calculated for time-trial variables, in addition to the intra-class correlation coefficient for time taken to complete the time-trial. For the primary measure, time, results showed a coefficient of variation across all participants of 1.5 ± 0.59% with an intra-class correlation coefficient of 0.990. Heart rate, rating of perceived exertion, and speed data showed variability ranging between 0.8% and 3.05%. It was concluded that, compared with related research, low intra-subject variability was observed in trained runners over a 5-km distance. This supports the use of this 5-km time-trial protocol to assess the effects of nutritional strategies, ergogenic aids, or training interventions on endurance running performance.
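The core reliability statistic reported above, the within-subject coefficient of variation, can be sketched as follows; the runner labels and times are hypothetical:

```python
from statistics import mean, stdev

def cv_percent(trials):
    """Within-subject coefficient of variation (%) across repeated trials."""
    return stdev(trials) / mean(trials) * 100

# Hypothetical 5-km times (s) over the 4 scored trials for two runners
trial_times = {
    "runner_1": [1190, 1185, 1200, 1192],
    "runner_2": [1302, 1295, 1310, 1299],
}
cvs = {r: cv_percent(t) for r, t in trial_times.items()}
group_cv = mean(cvs.values())  # group-level intra-subject variability
```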


2020
Vol 15 (4)
pp. 548-553
Author(s):
Corrado Lupo
Alexandru Nicolae Ungureanu
Riccardo Frati
Matteo Panichi
Simone Grillo
...

Purpose: To monitor elite youth female basketball training to verify whether players’ and coaches’ (3 technical coaches and 1 physical trainer) session rating of perceived exertion (s-RPE) has a relationship with Edwards’ method. Methods: Heart rate of 15 elite youth female basketball players (age 16.7 [0.5] y, height 178 [9] cm, body mass 72 [9] kg, body mass index 22.9 [2.2] kg·m−2) was monitored during 19 team (268 individual) training sessions (102 [15] min). Mixed effect models were applied to evaluate whether s-RPE values were significantly (P ≤ .05) related to Edwards’ data, total session duration, maximal intensity (session duration at 90–100% HRmax), and type of training (ie, strength, conditioning, and technique), and whether differences emerged between players’ and coaches’ s-RPE values. Results: A relationship between s-RPE and Edwards’ method emerged for the players’ RPE scores (P = .019) but not for those of the coaches. In addition, as expected, both players’ (P = .014) and coaches’ (P = .002) s-RPE scores were influenced by total session duration but not by maximal intensity or type of training. Moreover, players’ and coaches’ s-RPE values differed (P < .001); post hoc differences emerged for conditioning (P = .01) and technique (P < .001) sessions. Conclusions: Elite youth female basketball players are better able to quantify the internal training load of their sessions than their coaches, strengthening the validity of s-RPE as a tool to monitor training in team sports.
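Edwards' method, referenced above, weights time spent in five heart-rate zones (50–60% up to 90–100% of HRmax) by factors 1–5 and sums the products. A minimal sketch, with a hypothetical zone distribution:

```python
def edwards_load(minutes_in_zone):
    """Edwards' summated-heart-rate-zones load: minutes in the five %HRmax
    zones (50-60, 60-70, 70-80, 80-90, 90-100), weighted 1-5 and summed."""
    if len(minutes_in_zone) != 5:
        raise ValueError("expected minutes for exactly 5 zones")
    return sum(w * m for w, m in zip(range(1, 6), minutes_in_zone))

# e.g. a 102-min session spread across zones (hypothetical distribution)
load = edwards_load([20, 30, 30, 15, 7])
```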


PeerJ
2018
Vol 6
pp. e4250
Author(s):
Giuseppe Marcolin
Nicola Camazzola
Fausto Antonio Panizzolo
Davide Grigoletto
Antonio Paoli

Background: In basketball, maximum shooting accuracy is required at every game intensity. The aim of the present study was to investigate the acute effect of three different drill-intensity simulation protocols on jump shot accuracy in expert and junior basketball players. Materials & Methods: Eleven expert players (age 26 ± 6 yrs, weight 86 ± 11 kg, height 192 ± 8 cm) and ten junior players (age 18 ± 1 yrs, weight 75 ± 12 kg, height 184 ± 9 cm) completed three series of twenty jump shots at three different levels of exertion. Counter Movement Jump (CMJ) height was also measured after each series of jump shots. Exertion intensity was induced by manipulating the basketball drills. Heart rate was measured for the whole duration of the tests, while the rating of perceived exertion (RPE) was collected at the end of each series of shots. Results: Heart rate and RPE were statistically different across the three conditions for both expert and junior players. CMJ height remained almost unchanged in both groups. Jump shot accuracy decreased with increasing drill intensity in both expert and junior players. Expert players showed higher accuracy than junior players at all three levels of exertion (83% vs 64%, p < 0.001; 75% vs 57%, p < 0.05; 76% vs 60%, p < 0.01). Moreover, at the most demanding level of exertion, experts showed higher accuracy in the last ten shots compared with the first ten shots (82% vs 70%, p < 0.05). Discussion: Experts coped better with the different exertion intensities, thus maintaining a higher level of performance. Introducing short bouts of high-intensity, sport-specific exercise into skill sessions should be considered to improve jump shot accuracy during matches.


2008
Vol 20 (3)
pp. 333-341
Author(s):
Michael R. McGuigan
Abdulaziz Al Dayel
David Tod
Carl Foster
Robert U. Newton
...

The purpose of this study was to investigate the use of the OMNI Resistance Exercise Scale (OMNI-RES) for monitoring the intensity of different modes of resistance training in children who are overweight or obese. Sixty-one children (mean age = 9.7 ± 1.4 years) performed three resistance training sessions every week for 4 weeks. Each session consisted of three sets of 3–15 repetitions of eight different resistance exercises. OMNI-RES RPE measures (0–10) were obtained following each set and at the end of each exercise session. There was a significant difference between average RPE (1.68 ± 0.61) and session RPE (3.10 ± 1.18) during the 4 weeks of training (p < .05). There was no significant change in session RPE over the 4 weeks of training. The correlation coefficient between average and session RPE values was significant (r = .88, p < .05). The findings of the current study indicate that RPE values are higher when OMNI-RES measures are obtained following the whole training session than when obtained after every single set of exercise. This suggests that in children the session RPE provides different information from the average RPE across the entire session.


Author(s):  
Markus N.C. Williams
Jordan L. Fox
Cody J. O’Grady
Samuel Gardner
Vincent J. Dalbo
...

Purpose: To compare weekly training, game, and overall (training and games) demands across phases of the regular season in basketball. Methods: Seven semiprofessional, male basketball players were monitored during all on-court team-based training sessions and games during the regular season. External monitoring variables included PlayerLoad™ and inertial movement analysis events per minute. Internal monitoring variables included a modified summated heart rate zones model calculated per minute and rating of perceived exertion. Linear mixed models were used to compare training, game, and overall demands between 5-week phases (early, middle, and late) of the regular season with significance set at P ≤ .05. Effect sizes were calculated between phases and interpreted as: trivial, <0.20; small, 0.20 to 0.59; moderate, 0.60 to 1.19; large, 1.20 to 1.99; very large, ≥2.00. Results: Greater (P > .05) overall inertial movement analysis events (moderate–very large) and rating of perceived exertion (moderate) were evident in the late phase compared with earlier phases. During training, more accelerations were evident in the middle (P = .01, moderate) and late (P = .05, moderate) phases compared with the early phase, while higher rating of perceived exertion (P = .04, moderate) was evident in the late phase compared with earlier phases. During games, nonsignificant, trivial–small differences in demands were apparent between phases. Conclusions: Training and game demands should be interpreted both in isolation and in combination, given that overall player demands increased as the season progressed, predominantly due to modifications in training demands given the stability of game demands. Periodization strategies administered by coaching staff may have enabled players to train at greater intensities late in the season without compromising game intensity.


2018
Vol 13 (8)
pp. 1034-1041
Author(s):
Maria C. Madueno
Vincent J. Dalbo
Joshua H. Guy
Kate E. Giamarelos
Tania Spiteri
...

Purpose: To investigate the physiological and performance effects of active and passive recovery between repeated-change-of-direction sprints. Methods: Eight semiprofessional basketball players (age: 19.9 [1.5] y; stature: 183.0 [9.6] cm; body mass: 77.7 [16.9] kg; body fat: 11.8% [6.3%]; and peak oxygen consumption: 46.1 [7.6] mL·kg−1·min−1) completed 12 × 20-m repeated-change-of-direction sprints (Agility 5-0-5 tests) interspersed with 20 seconds of active (50% maximal aerobic speed) or passive recovery in a randomized crossover design. Physiological and perceptual measures included heart rate, oxygen consumption, blood lactate concentration, and rating of perceived exertion. Change-of-direction speed was measured during each sprint using the change-of-direction deficit (CODD), with summed CODD time and CODD decrement calculated as performance measures. Results: Average heart rate (7.3 [6.4] beats·min−1; P = .010; effect size (ES) = 1.09; very likely) and oxygen consumption (4.4 [5.0] mL·kg−1·min−1; P = .12; ES = 0.77; unclear) were moderately greater with active recovery compared with passive recovery across sprints. Summed CODD time (0.87 [1.01] s; P = .07; ES = 0.76, moderate; likely) and CODD decrement (8.1% [3.7%]; P < .01; ES = 1.94, large; almost certainly) were higher with active compared with passive recovery. Trivial–small differences were evident for rating of perceived exertion (P = .516; ES = 0.19; unclear) and posttest blood lactate concentration (P = .29; ES = 0.40; unclear) between recovery modes. Conclusions: Passive recovery between repeated-change-of-direction sprints may reduce the physiological stress and fatigue encountered compared with active recovery in basketball players.
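The abstract does not give the formulas behind the summed CODD time and CODD decrement; a common convention for repeated-effort work is a simple sum plus a Fitzsimons-style percentage decrement, sketched here under that assumption:

```python
def summed_codd(codd_times):
    """Total change-of-direction deficit across all sprints (s)."""
    return sum(codd_times)

def percent_decrement(codd_times):
    """Fitzsimons-style decrement: total time relative to the ideal of
    repeating the best effort every sprint, expressed as a percentage."""
    best, n = min(codd_times), len(codd_times)
    return (sum(codd_times) / (best * n) - 1) * 100
```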


2018
Vol 13 (8)
pp. 1067-1074
Author(s):
Daniele Conte
Nicholas Kolb
Aaron T. Scanlan
Fabrizio Santolamazza

Purpose: To characterize the weekly training load (TL) and well-being of college basketball players during the in-season phase. Methods: Ten (6 guards and 4 forwards) male basketball players (age 20.9 [0.9] y, stature 195.0 [8.2] cm, and body mass 91.3 [11.3] kg) from the same Division I National Collegiate Athletic Association team were recruited to participate in this study. Individualized training and game loads were assessed using the session rating of perceived exertion at the end of each training and game session, and well-being status was collected before each session. Weekly changes (%) in TL, acute-to-chronic workload ratio, and well-being were determined. Differences in TL and well-being between starting and bench players and between 1-game and 2-game weeks were calculated using magnitude-based statistics. Results: Total weekly TL and acute-to-chronic workload ratio demonstrated high week-to-week variation, with spikes up to 226% and 220%, respectively. Starting players experienced a higher (most likely negative) total weekly TL and similar (unclear) well-being status compared with bench players. Game scheduling influenced TL, with 1-game weeks demonstrating a higher (likely negative) total weekly TL and similar (most likely trivial) well-being status compared with 2-game weeks. Conclusions: These findings provide college basketball coaches information to optimize training strategies during the in-season phase. Basketball coaches should concurrently consider the number of weekly games and player status (starting vs bench player) when creating individualized periodization plans, with increases in TL potentially needed in bench players, especially in 2-game weeks.

