Competition stage influences perceived performance but does not affect rating of perceived exertion and salivary neuro-endocrine-immune markers in elite young basketball players

2018 ◽  
Vol 188 ◽  
pp. 151-156 ◽  
Author(s):  
Ademir Felipe Schultz de Arruda ◽  
Marcelo Saldanha Aoki ◽  
Ana Carolina Paludo ◽  
Gustavo Drago ◽  
Alexandre Moreira

Author(s):
Markus N.C. Williams ◽  
Vincent J. Dalbo ◽  
Jordan L. Fox ◽  
Cody J. O’Grady ◽  
Aaron T. Scanlan

Purpose: To compare weekly training and game demands according to playing position in basketball players. Methods: A longitudinal, observational study was adopted. Semiprofessional, male basketball players categorized as backcourt (guards; n = 4) and frontcourt players (forwards/centers; n = 4) had their weekly workloads monitored across an entire season. External workload was determined using microsensors and included PlayerLoad™ (PL) and inertial movement analysis variables. Internal workload was determined using heart rate to calculate absolute and relative summated-heart-rate-zones workload and rating of perceived exertion (RPE) to calculate session-RPE workload. Comparisons between weekly training and game demands were made using linear mixed models and effect sizes in each positional group. Results: In backcourt players, higher relative PL (P = .04, very large) and relative summated-heart-rate-zones workload (P = .007, very large) were evident during training, while greater session-RPE workload (P = .001, very large) was apparent during games. In frontcourt players, greater PL (P < .001, very large), relative PL (P = .019, very large), peak PL intensities (P < .001, moderate), high-intensity inertial movement analysis events (P = .002, very large), total inertial movement analysis events (P < .001, very large), summated-heart-rate-zones workload (P < .001, very large), RPE (P < .001, very large), and session-RPE workload (P < .001, very large) were evident during games. Conclusions: Backcourt players experienced similar demands between training and games across several variables, with higher average workload intensities during training. Frontcourt players experienced greater demands across all variables during games than training. These findings emphasize the need for position-specific preparation strategies leading into games in basketball teams.
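For readers unfamiliar with the internal-load metric used above, session-RPE workload is conventionally computed as the post-session CR-10 RPE rating multiplied by session duration in minutes. The short sketch below illustrates that calculation; the function name and example values are illustrative only, not taken from the study.

```python
# Illustrative sketch: session-RPE workload = CR-10 RPE x session duration (min).
# Function name and example values are hypothetical, not from the study.

def session_rpe_load(rpe: float, duration_min: float) -> float:
    """Return session-RPE workload in arbitrary units (AU)."""
    return rpe * duration_min

# Example: a 90-minute practice rated 7 on the CR-10 scale yields 630 AU
print(session_rpe_load(7, 90))
```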


2019 ◽  
Vol 14 (7) ◽  
pp. 941-948 ◽  
Author(s):  
Henrikas Paulauskas ◽  
Rasa Kreivyte ◽  
Aaron T. Scanlan ◽  
Alexandre Moreira ◽  
Laimonas Siupsinskas ◽  
...  

Purpose: To assess the weekly fluctuations in workload and differences in workload according to playing time in elite female basketball players. Methods: A total of 29 female basketball players (mean [SD] age 21 [5] y, stature 181 [7] cm, body mass 71 [7] kg, playing experience 12 [5] y) belonging to the 7 women’s basketball teams competing in the first-division Lithuanian Women’s Basketball League were recruited. Individualized training loads (TLs) and game loads (GLs) were assessed using the session rating of perceived exertion after each training session and game during the entire in-season phase (24 wk). Percentage changes in total weekly TL (weekly TL + GL), weekly TL, weekly GL, chronic workload, acute:chronic workload ratio, training monotony, and training strain were calculated. Mixed linear models were used to assess differences for each dependent variable, with playing time (low vs high) used as a fixed factor and subject, week, and team as random factors. Results: The highest changes in total weekly TL, weekly TL, and acute:chronic workload ratio were evident in week 13 (47%, 120%, and 49%, respectively). Chronic workload showed weekly changes ≤10%, whereas monotony and training strain registered the highest fluctuations in weeks 17 (34%) and 15 (59%), respectively. A statistically significant difference in GL was evident between players completing low and high playing times (P = .026, moderate), whereas no significant differences (P > .05) were found for all other dependent variables. Conclusions: Coaches of elite women’s basketball teams should monitor weekly changes in workload during the in-season phase to identify weeks that may predispose players to unwanted spikes and adjust player workload according to playing time.
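The derived indices named above (acute:chronic workload ratio, training monotony, and training strain) have widely used definitions; a minimal sketch follows, assuming the common formulations (monotony = mean daily load / SD of daily load within a week; strain = weekly load × monotony; ACWR = acute weekly load / rolling chronic average). Window lengths and example values are assumptions, not taken from the paper.

```python
# Minimal sketch of common weekly workload indices (assumed formulations,
# not the authors' code). Loads are session-RPE values in arbitrary units.
from statistics import mean, stdev

def weekly_monotony(daily_loads):
    # mean daily load divided by the SD of daily load within the week
    return mean(daily_loads) / stdev(daily_loads)

def weekly_strain(daily_loads):
    # total weekly load multiplied by monotony
    return sum(daily_loads) * weekly_monotony(daily_loads)

def acwr(weekly_loads, chronic_weeks=4):
    # acute (most recent week) divided by chronic (rolling mean) load
    return weekly_loads[-1] / mean(weekly_loads[-chronic_weeks:])

daily = [400, 550, 0, 620, 480, 700, 0]      # one hypothetical week (AU)
weeks = [2750, 2600, 2900, sum(daily)]        # four consecutive weekly totals
print(round(weekly_monotony(daily), 2), round(weekly_strain(daily)), round(acwr(weeks), 2))
```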


2020 ◽  
Vol 15 (10) ◽  
pp. 1476-1479
Author(s):  
Jordan L. Fox ◽  
Cody J. O’Grady ◽  
Aaron T. Scanlan

Purpose: To compare the concurrent validity of session-rating of perceived exertion (sRPE) workload determined face-to-face and via an online application in basketball players. Methods: Sixteen semiprofessional, male basketball players (21.8 [4.3] y, 191.2 [9.2] cm, 85.0 [15.7] kg) were monitored during all training sessions across the 2018 (8 players) and 2019 (11 players) seasons in a state-level Australian league. Workload was reported as accumulated PlayerLoad (PL), summated-heart-rate-zones (SHRZ) workload, and sRPE. During the 2018 season, rating of perceived exertion (RPE) was determined following each session via individualized face-to-face reporting. During the 2019 season, RPE was obtained following each session via a phone-based, online application. Repeated-measures correlations with 95% confidence intervals were used to determine the relationships between sRPE collected using each method and other workload measures (PL and SHRZ) as indicators of concurrent validity. Results: Although all correlations were significant (P < .05), sRPE obtained using face-to-face reporting demonstrated stronger relationships with PL (r = .69 [.07], large) and SHRZ (r = .74 [.06], very large) compared with the online application (r = .29 [.25], small [PL] and r = .34 [.22], moderate [SHRZ]). Conclusions: Concurrent validity of sRPE workload was stronger when players reported RPE in an individualized, face-to-face manner compared with using a phone-based online application. Given the weaker relationships with other workload measures, basketball practitioners should be cautious when using player training workloads predicated on RPE obtained via online applications.
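The repeated-measures correlation used as the validity statistic above accounts for repeated observations within each player. A hedged sketch of how such an analysis might be run is shown below using the pingouin package; the column names and data are hypothetical, and the library choice is an assumption rather than the authors' pipeline.

```python
# Hypothetical sketch of a repeated-measures correlation between sRPE and
# PlayerLoad (pingouin is one common implementation; data are fabricated).
import pandas as pd
import pingouin as pg

df = pd.DataFrame({
    "player":     ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "srpe":       [540, 610, 480, 700, 650, 720, 390, 420, 450],
    "playerload": [310, 355, 290, 410, 380, 430, 240, 250, 270],
})

# Returns r, degrees of freedom, p-value, 95% CI, and power for the
# within-player association between the two workload measures.
print(pg.rm_corr(data=df, x="srpe", y="playerload", subject="player"))
```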


2020 ◽  
Vol 15 (4) ◽  
pp. 548-553 ◽  
Author(s):  
Corrado Lupo ◽  
Alexandru Nicolae Ungureanu ◽  
Riccardo Frati ◽  
Matteo Panichi ◽  
Simone Grillo ◽  
...  

Purpose: To monitor elite youth female basketball training to verify whether players’ and coaches’ (3 technical coaches and 1 physical trainer) session rating of perceived exertion (s-RPE) has a relationship with Edwards’ method. Methods: Heart rate of 15 elite youth female basketball players (age 16.7 [0.5] y, height 178 [9] cm, body mass 72 [9] kg, body mass index 22.9 [2.2] kg·m−2) was monitored during 19 team (268 individual) training sessions (102 [15] min). Mixed-effects models were applied to evaluate whether s-RPE values were significantly (P ≤ .05) related to Edwards’ data, total session duration, maximal intensity (session duration at 90–100% HRmax), type of training (ie, strength, conditioning, and technique), and whether differences emerged between players’ and coaches’ s-RPE values. Results: A relationship between the s-RPE and Edwards’ methods was found for the players’ RPE scores (P = .019) but not for those of the trainers. In addition, as expected, both players’ (P = .014) and coaches’ (P = .002) s-RPE scores were influenced by total session duration but not by maximal intensity or type of training. Furthermore, players’ and coaches’ s-RPE values differed (P < .001), with post hoc differences emerging for conditioning (P = .01) and technique (P < .001) sessions. Conclusions: Elite youth female basketball players are better able to quantify the internal training load of their sessions than their coaches, strengthening the validity of s-RPE as a tool to monitor training in team sports.
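Edwards' method referenced above weights the time spent in five heart-rate zones (50–100% HRmax) by factors of 1–5 and sums the result. A minimal sketch is given below, assuming the standard zone boundaries and weights; the sampling epoch and example values are illustrative.

```python
# Minimal sketch of Edwards' summated-heart-rate-zones load (assumed standard
# zones/weights; epoch length and example data are illustrative).

EDWARDS_ZONES = {1: (0.50, 0.60), 2: (0.60, 0.70), 3: (0.70, 0.80),
                 4: (0.80, 0.90), 5: (0.90, 1.01)}

def edwards_load(hr_samples, hr_max, epoch_s=1):
    """hr_samples: heart rate (bpm) per epoch; returns Edwards' load (AU)."""
    load = 0.0
    for hr in hr_samples:
        rel = hr / hr_max
        for weight, (lo, hi) in EDWARDS_ZONES.items():
            if lo <= rel < hi:
                load += weight * (epoch_s / 60)  # minutes in zone x zone weight
                break
    return load

# Example: 30 min at 75% HRmax plus 10 min at 92% HRmax (HRmax = 200 bpm)
print(edwards_load([150] * 1800 + [184] * 600, hr_max=200))  # 30*3 + 10*5 = 140 AU
```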


PeerJ ◽  
2018 ◽  
Vol 6 ◽  
pp. e4250 ◽  
Author(s):  
Giuseppe Marcolin ◽  
Nicola Camazzola ◽  
Fausto Antonio Panizzolo ◽  
Davide Grigoletto ◽  
Antonio Paoli

Background: In basketball, maximal shooting accuracy is required at every game intensity. The aim of the present study was to investigate the acute effect of three different drill intensity simulation protocols on jump shot accuracy in expert and junior basketball players. Materials & Methods: Eleven expert players (age 26 ± 6 yrs, weight 86 ± 11 kg, height 192 ± 8 cm) and ten junior players (age 18 ± 1 yrs, weight 75 ± 12 kg, height 184 ± 9 cm) completed three series of twenty jump shots at three different levels of exertion. Countermovement jump (CMJ) height was also measured after each series of jump shots. Exertion intensity was manipulated through the basketball drills. Heart rate was measured for the whole duration of the tests, while the rating of perceived exertion (RPE) was collected at the end of each series of shots. Results: Heart rate and RPE were statistically different across the three conditions for both expert and junior players. CMJ height remained almost unchanged in both groups. Jump shot accuracy decreased with increasing drill intensity in both expert and junior players. Expert players showed higher accuracy than junior players at all three levels of exertion (83% vs 64%, p < 0.001; 75% vs 57%, p < 0.05; 76% vs 60%, p < 0.01). Moreover, at the most demanding level of exertion, experts showed higher accuracy in the last ten shots compared with the first ten (82% vs 70%, p < 0.05). Discussion: Experts coped better with the different exertion intensities and thus maintained a higher level of performance. Introducing short bouts of high-intensity, sport-specific exercise into skill sessions is therefore recommended to improve jump shot accuracy during matches.


Author(s):  
Takeshi Koyama ◽  
Akira Rikukawa ◽  
Yasuharu Nagano ◽  
Shogo Sasaki ◽  
Hiroshi Ichikawa ◽  
...  

Purpose: To evaluate the effect of the number of high-acceleration movements on muscle damage and the rating of perceived exertion (RPE) in basketball games. Methods: Twenty-one male collegiate basketball players (mean age, 20.0 [1.0] y) were included. A triaxial accelerometer was used to measure acceleration in basketball-simulated scrimmages. To detect higher physical load during the actual game, the resultant acceleration was calculated, and 3 thresholds were set: >4G, >6G, and >8G resultant accelerations. The number of extracted movements was calculated at each acceleration threshold. Plasma creatine kinase (CK) levels (a marker of muscle damage) were estimated before and 24 hours after the match, and the session-RPE load was calculated within 30 minutes after the match. Pearson product-moment correlations with 95% confidence intervals were used to determine the relationships between the number of high-acceleration movements and plasma CK and session-RPE load. Results: A significant correlation was observed between the number of high-acceleration movements >8G and CK level (r = .74; 95% confidence interval, 0.44–0.89; P < .0001). Furthermore, the correlation coefficient between acceleration and CK increased with increasing acceleration threshold (>4G: r = .65; >6G: r = .69). In contrast, the correlation coefficient between acceleration and session-RPE load decreased with increasing acceleration threshold (>4G: r = .72; >6G: r = .52; >8G: r = .43). Conclusions: Session-RPE load reflects the total amount of movement, whereas high-acceleration movements reflect momentary large impact loads or intensities; the two measures therefore capture different factors. Basketball coaching and conditioning professionals are advised to combine acceleration-based measures with session-RPE when monitoring athlete load.
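As a rough illustration of the acceleration processing described above, the resultant acceleration can be computed per sample from the three accelerometer axes, and contiguous excursions above each threshold counted as single movements. The grouping rule, sampling details, and data below are assumptions for illustration, not the authors' algorithm.

```python
# Hypothetical sketch: resultant acceleration from triaxial data and counting
# of movements exceeding a threshold (grouping rule and data are assumed).
import math

def count_high_accel_movements(ax, ay, az, threshold_g):
    resultant = [math.sqrt(x * x + y * y + z * z) for x, y, z in zip(ax, ay, az)]
    count, above = 0, False
    for r in resultant:
        if r > threshold_g and not above:  # rising edge starts a new movement
            count += 1
        above = r > threshold_g
    return count

# Fabricated samples (in g) checked against the three thresholds in the study
ax, ay, az = [0.2, 3.1, 5.9, 0.3], [0.1, 2.8, 4.7, 0.2], [1.0, 4.0, 3.5, 1.0]
for thr in (4, 6, 8):
    print(f">{thr}G:", count_high_accel_movements(ax, ay, az, thr))
```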


Author(s):  
Markus N.C. Williams ◽  
Jordan L. Fox ◽  
Cody J. O’Grady ◽  
Samuel Gardner ◽  
Vincent J. Dalbo ◽  
...  

Purpose: To compare weekly training, game, and overall (training and games) demands across phases of the regular season in basketball. Methods: Seven semiprofessional, male basketball players were monitored during all on-court team-based training sessions and games during the regular season. External monitoring variables included PlayerLoad™ and inertial movement analysis events per minute. Internal monitoring variables included a modified summated heart rate zones model calculated per minute and rating of perceived exertion. Linear mixed models were used to compare training, game, and overall demands between 5-week phases (early, middle, and late) of the regular season with significance set at P ≤ .05. Effect sizes were calculated between phases and interpreted as: trivial, <0.20; small, 0.20 to 0.59; moderate, 0.60 to 1.19; large, 1.20 to 1.99; very large, ≥2.00. Results: Greater (P > .05) overall inertial movement analysis events (moderate–very large) and rating of perceived exertion (moderate) were evident in the late phase compared with earlier phases. During training, more accelerations were evident in the middle (P = .01, moderate) and late (P = .05, moderate) phases compared with the early phase, while higher rating of perceived exertion (P = .04, moderate) was evident in the late phase compared with earlier phases. During games, nonsignificant, trivial–small differences in demands were apparent between phases. Conclusions: Training and game demands should be interpreted both in isolation and in combination, given that overall player demands increased as the season progressed, predominantly due to modifications in training demands, given the stability of game demands. Periodization strategies administered by coaching staff may have enabled players to train at greater intensities late in the season without compromising game intensity.
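The effect-size thresholds quoted above can be applied to a standardized mean difference such as Cohen's d. The sketch below pairs a pooled-SD d with the stated interpretation bands; the d formula and example data are assumptions for illustration.

```python
# Illustrative sketch: Cohen's d (pooled SD, an assumed formulation) classified
# with the thresholds reported in the abstract. Example data are fabricated.
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    pooled_sd = ((stdev(group_a) ** 2 + stdev(group_b) ** 2) / 2) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled_sd

def interpret(d):
    d = abs(d)
    for label, cutoff in (("trivial", 0.20), ("small", 0.60),
                          ("moderate", 1.20), ("large", 2.00)):
        if d < cutoff:
            return label
    return "very large"

late_rpe = [5.8, 6.1, 5.9, 6.4, 6.0]    # hypothetical late-phase session RPEs
early_rpe = [4.5, 5.0, 4.8, 5.2, 4.6]   # hypothetical early-phase session RPEs
d = cohens_d(late_rpe, early_rpe)
print(round(d, 2), interpret(d))
```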


2018 ◽  
Vol 13 (8) ◽  
pp. 1034-1041
Author(s):  
Maria C. Madueno ◽  
Vincent J. Dalbo ◽  
Joshua H. Guy ◽  
Kate E. Giamarelos ◽  
Tania Spiteri ◽  
...  

Purpose: To investigate the physiological and performance effects of active and passive recovery between repeated-change-of-direction sprints. Methods: Eight semiprofessional basketball players (age: 19.9 [1.5] y; stature: 183.0 [9.6] cm; body mass: 77.7 [16.9] kg; body fat: 11.8% [6.3%]; and peak oxygen consumption: 46.1 [7.6] mL·kg−1·min−1) completed 12 × 20-m repeated-change-of-direction sprints (Agility 5-0-5 tests) interspersed with 20 seconds of active (50% maximal aerobic speed) or passive recovery in a randomized crossover design. Physiological and perceptual measures included heart rate, oxygen consumption, blood lactate concentration, and rating of perceived exertion. Change-of-direction speed was measured during each sprint using the change-of-direction deficit (CODD), with summed CODD time and CODD decrement calculated as performance measures. Results: Average heart rate (7.3 [6.4] beats·min−1; P = .010; effect size (ES) = 1.09; very likely) and oxygen consumption (4.4 [5.0] mL·kg−1·min−1; P = .12; ES = 0.77; unclear) were moderately greater with active recovery compared with passive recovery across sprints. Summed CODD time (0.87 [1.01] s; P = .07; ES = 0.76, moderate; likely) and CODD decrement (8.1% [3.7%]; P < .01; ES = 1.94, large; almost certainly) were higher with active compared with passive recovery. Trivial–small differences were evident for rating of perceived exertion (P = .516; ES = 0.19; unclear) and posttest blood lactate concentration (P = .29; ES = 0.40; unclear) between recovery modes. Conclusions: Passive recovery between repeated-change-of-direction sprints may reduce the physiological stress and fatigue encountered compared with active recovery in basketball players.
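The change-of-direction deficit (CODD) referenced above is commonly calculated as the 5-0-5 time minus the athlete's straight-line 10-m sprint time, isolating the time cost of the direction change. The sketch below also includes a percentage-decrement score in the spirit of standard repeated-sprint decrement formulas; that formula and the example times are assumptions, not taken from the paper.

```python
# Hedged sketch of CODD-based measures (decrement formula and times assumed).

def codd(five_0_five_s, ten_m_s):
    # time cost attributable to the change of direction
    return five_0_five_s - ten_m_s

def codd_decrement(codd_times):
    # percentage decrement relative to the best repetition
    best = min(codd_times)
    return (sum(codd_times) / (best * len(codd_times)) - 1) * 100

sprints_505 = [2.45, 2.48, 2.52, 2.55, 2.60, 2.63]  # hypothetical 5-0-5 times (s)
ten_m_split = 1.80                                   # hypothetical 10-m sprint (s)
codd_times = [codd(t, ten_m_split) for t in sprints_505]
print(round(sum(codd_times), 2), round(codd_decrement(codd_times), 1))
```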


2019 ◽  
Vol 44 (8) ◽  
pp. 849-856 ◽  
Author(s):  
Emilija Stojanović ◽  
Nenad Stojiljković ◽  
Aaron T. Scanlan ◽  
Vincent J. Dalbo ◽  
Ratko Stanković ◽  
...  

The aim of this study was to determine the effect of acute caffeine supplementation on anaerobic performance in professional female basketball players. A double-blind, placebo-controlled, experimental design was used in a randomized, counterbalanced manner. In separate sessions, 10 professional basketball players ingested caffeine (3 mg/kg body mass) or a placebo (dextrose: 3 mg/kg body mass) 60 min before completing countermovement jumps (CMJ) with and without arm swing, a squat jump (SJ), the Lane Agility Drill, 20-m sprints (with 5-m and 10-m split times recorded) with and without dribbling a ball, and a suicide run. Participants provided ratings of perceived exertion (RPE) and ratings of perceived performance 30 min following testing. Data analyses included effect sizes (ES) and significance testing. Caffeine supplementation produced small nonsignificant (p > 0.05) increases in CMJ without arm swing (ES = 0.30), CMJ with arm swing (ES = 0.29), SJ (ES = 0.33), and the Lane Agility Drill (ES = –0.27). Caffeine supplementation produced small to moderate significant improvements in 10-m (ES = –0.63; p = 0.05) and 20-m (ES = –0.41; p = 0.04) sprint times without dribbling. Caffeine supplementation promoted a moderate significant reduction in RPE during the test battery (ES = –1.18; p = 0.04) and a small nonsignificant improvement in perceived performance (ES = 0.23; p = 0.53). Acute caffeine supplementation may produce small to moderate improvements in key performance attributes required for basketball while reducing RPE.
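The dosing protocol above is individualized to body mass; a trivial sketch of the arithmetic is shown below (values are illustrative).

```python
# Simple sketch of the individualized 3 mg/kg dosing described above.

def dose_mg(body_mass_kg, mg_per_kg=3):
    return body_mass_kg * mg_per_kg

# Example: a 70-kg player would receive 210 mg of caffeine (or dextrose placebo)
print(dose_mg(70))
```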

