Foot and Ankle Patients Prefer a Hands-Free Single Crutch Compared to Standard Axillary Crutches

2018
Vol 3 (3)
pp. 2473011418S0049
Author(s):  
Alicia Unangst ◽  
Kevin Martin ◽  
Anthony Mustovich ◽  
Jaime Chisholm

Category: Ankle Introduction/Purpose: Following lower extremity surgery, patients are often required to use assistive devices to perform activities of daily living. As technology and assistive devices continue to improve, providers are faced with selecting a device that is safe while providing high patient satisfaction and a quick return to activities. The purpose of the current study was to compare physical exertion and subject preference between a hands-free single crutch and standard axillary crutches in foot and ankle patients. Methods: A prospective, randomized crossover study was performed using 35 orthopedic foot and ankle patients from within one treatment facility. Each participant had demographic data and heart rate recorded. The patients were then randomized to an assistive device. All participants completed a 6-minute walk test (6MWT); immediately following each 6MWT, heart rate, self-selected walking velocity (SSWV), perceived exertion using the OMNI Rating of Perceived Exertion (OMNI-RPE), and perceived dyspnea using the Modified Borg Dyspnea Scale were obtained. The patients then completed another 6MWT using the other assistive device and were asked the same questions. After completing both 6MWTs, participants were asked which assistive device they would prefer to use. Results: A total of 35 patients were included, with a median age of 32 years. The hands-free crutch was preferred by 86% of participants. Regression analysis was used to test whether factors such as gender, height, weight, and BMI predicted patient preference of the iWalk hands-free crutch vs standard crutches; none of these factors was significant. Student t tests and ANOVAs were performed separately for dyspnea, fatigue ratings, distance (meters), and heart rate between the hands-free crutch and axillary crutches; all were significant (p<0.05, p=1.13e-11, p=2.29e-13, and p=5.21e-05, respectively). The axillary crutch group had a higher SSWV (0.8 vs 0.77 m/s), but the difference was not significant.
Neither group had any falls; however, 58% of axillary crutch participants complained of axillary/hand pain, while 14% of the hands-free group complained of proximal strap discomfort. Conclusion: Patients preferred the hands-free crutch while reporting lower perceived dyspnea and fatigue. The hands-free group also demonstrated lower physiologic demand, which correlated with patient perception.

2019
Vol 40 (10)
pp. 1203-1208
Author(s):  
Kevin D. Martin ◽  
Alicia M. Unangst ◽  
Jeannie Huh ◽  
Jamie Chisholm

Background: Weightbearing restrictions following foot and ankle surgery require the use of appropriate assistive devices for nonweightbearing ambulation during the recovery period. Selecting an appropriate assistive device that safely optimizes mobility and participation in daily activities is important to patient compliance and satisfaction. The purpose of this study was to compare physiologic demand, perceived exertion, and patient preference between a hands-free single crutch (HFSC) and standard axillary crutches (SACs) in foot and ankle patients. Methods: Using 44 preoperative orthopedic foot and ankle patients who had a mean age of 32 (19-51) years, a prospective, randomized, crossover study was performed. The sample consisted of 35 males and 9 females. The mean body mass index (BMI) was 26 (19-36), the mean height was 1.7 m, and the mean weight was 82 kg. Patient data and preactivity heart rate were recorded for all patients, who were then randomized to either an HFSC or SACs. Each patient was randomly assigned to the device they would utilize first using a random number generator. They then crossed over to the other device after vitals returned to within 10% of their baseline heart rate. Every subject completed a 6-minute walk test (6MWT) using both assistive devices in a crossover manner. Immediately following each 6MWT, postactivity heart rate, self-selected walking velocity (SSWV), perceived exertion using the OMNI Rating of Perceived Exertion (OMNI-RPE), and perceived dyspnea using the Modified Borg Dyspnea Scale were obtained. After completing both 6MWTs, patients were asked which assistive device they preferred. Results: The HFSC was preferred by 86% of patients. Significantly lower dyspnea scores (2.8 vs 5.3; P < .001), fatigue scores (2.4 vs 5.5; P < .001), preactivity and postactivity change in heart rate (28 vs 46 bpm; P < .001), and mean postactivity heart rate (107 vs 122 bpm; P < .001) were found using the HFSC compared with the SACs.
The SAC group trended toward a higher SSWV (0.8 vs 0.77 m/s; P = .08). Those with a BMI greater than 25 also preferred the iWALK over SACs (P < .05). Neither group had any falls. Sixty-eight percent of patients complained of axillary/hand pain with the SACs, while 7% complained of proximal leg strap discomfort with the HFSC. Conclusion: In this relatively healthy cohort, nonweightbearing foot and ankle patients preferred the HFSC over SACs. They experienced less physiologic demand and discomfort, and perceived less exertion, when using the HFSC compared with SACs. Level of Evidence: Level II, prospective comparative study.


Author(s):  
Kai Way Li ◽  
Jenn Chun Chu ◽  
Ching Chung Chen

Manual material handling (MMH) tasks place a burden on workers that can result in musculoskeletal injuries. Assessments of the decline in muscular strength and of the maximum endurance time (MET) are essential in studying the ergonomic risk of MMH tasks. A backpacking experiment was conducted to measure the MET for MMH tasks. Human participants carried a load on their back and walked on a treadmill under various load, walking speed, and ramp angle conditions until they could no longer do so. It was found that the participants were able to walk for approximately 15 minutes to two hours before they needed to pause. Their back and leg strengths declined moderately from performing the tasks, which also increased heart rate and elevated perceived physical exertion. The rating of perceived exertion (RPE)/heart rate ratio in our backpacking tasks was 31% higher than that in the literature, implying that calibration of the RPE may be required for such tasks. A MET model incorporating fMVC_back, body weight, walking speed, and ramp angle was established. This model may be used to determine the work/rest allowance for backpacking tasks under conditions similar to those of this study.
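The abstract does not give the fitted model's functional form or coefficients. As a minimal sketch only, assuming the common log-linear form for endurance-time models (ln(MET) regressed on the predictors) and entirely made-up data, a model of this shape could be fitted like so:

```python
import numpy as np

# Hypothetical observations (NOT the study's data): load as a fraction of
# back maximum voluntary contraction (fMVC_back), body weight (kg),
# walking speed (km/h), ramp angle (deg), and observed endurance time (min).
fmvc_back = np.array([0.20, 0.30, 0.40, 0.50, 0.60, 0.35])
body_weight = np.array([60, 70, 80, 75, 90, 65])
speed = np.array([3.0, 4.0, 4.5, 5.0, 5.5, 3.5])
ramp = np.array([0, 5, 10, 5, 10, 0])
met_min = np.array([120, 75, 40, 30, 18, 100])

# Fit ln(MET) = b0 + b1*fMVC_back + b2*weight + b3*speed + b4*ramp
# by ordinary least squares.
X = np.column_stack([np.ones_like(fmvc_back), fmvc_back, body_weight, speed, ramp])
coef, *_ = np.linalg.lstsq(X, np.log(met_min), rcond=None)

def predict_met(fmvc, weight, v, theta):
    """Predicted maximum endurance time (min) under the sketch model."""
    return float(np.exp(coef @ np.array([1.0, fmvc, weight, v, theta])))
```

A model like this would support the work/rest-allowance use the authors describe: predict MET for given task conditions, then schedule rest breaks before that time is reached.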


Author(s):  
Alice Iannaccone ◽  
Daniele Conte ◽  
Cristina Cortis ◽  
Andrea Fusco

Internal load can be objectively measured by heart rate-based models, such as Edwards' summated heart rate zones, or subjectively by session rating of perceived exertion. The relationship between internal loads assessed via heart rate-based models and session rating of perceived exertion is usually studied through simple correlations, although the Linear Mixed Model could represent a more appropriate statistical procedure to deal with intrasubject variability. This study aimed to compare conventional correlations and the Linear Mixed Model for assessing the relationships between objective and subjective measures of internal load in team sports. Thirteen male youth beach handball players (15.9 ± 0.3 years) were monitored (14 training sessions; 7 official matches). Correlation coefficients were used to correlate the objective and subjective internal load. The Linear Mixed Model was used to model the relationship between objective and subjective measures of internal load by considering each player's individual response as a random effect. Random intercepts were used first, and then random slopes were added. The likelihood-ratio test was used to compare the statistical models. The correlation coefficient for the overall relationship between the objective and subjective internal load data was very large (r = 0.74; ρ = 0.78). The Linear Mixed Model using both random slopes and random intercepts better explained (p < 0.001) the relationship between internal load measures. Researchers are encouraged to apply Linear Mixed Models rather than correlations to analyze internal load relationships in team sports, since they allow for the consideration of the individuality of players.
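The procedure described (random intercepts, then added random slopes, compared by a likelihood-ratio test) can be sketched with `statsmodels`. This is an illustrative reconstruction on simulated data, not the study's data or code; the column names and simulation parameters are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)

# Simulated data in the spirit of the study: 13 players, 21 sessions each.
# "obj" stands in for an Edwards'-type heart-rate load, "subj" for session-RPE.
players = np.repeat(np.arange(13), 21)
objective = rng.normal(200, 40, size=players.size)
slopes = 0.02 + rng.normal(0, 0.005, size=13)          # per-player slopes
subjective = 1 + slopes[players] * objective + rng.normal(0, 0.5, players.size)
df = pd.DataFrame({"player": players, "obj": objective, "subj": subjective})

# Random intercepts only, then random intercepts + slopes.
# Fit by ML (reml=False) so the likelihood-ratio test is valid.
m_int = smf.mixedlm("subj ~ obj", df, groups=df["player"]).fit(reml=False)
m_slope = smf.mixedlm("subj ~ obj", df, groups=df["player"],
                      re_formula="~obj").fit(reml=False)

# Likelihood-ratio test: does adding random slopes improve the model?
lrt = 2 * (m_slope.llf - m_int.llf)
p_value = stats.chi2.sf(lrt, df=2)  # slope variance + intercept-slope covariance
```

A small p-value here, as in the study, indicates that modelling each player's own slope explains the objective-subjective relationship better than a single pooled slope.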


2021
Vol 18 (1)
Author(s):  
Kelvin Euton Oliveira Carmo ◽  
Diego Ignácio Valenzuela Pérez ◽  
Charles Nardelli Valido ◽  
Jymmys Lopes dos Santos ◽  
Bianca Miarka ◽  
...  

Abstract Background Nutritional ergogenic aids are foods or nutrients that can improve physical performance. Among these foods with ergogenic properties, caffeine has been shown to increase fat catabolism and strength and to improve cognition and reaction time; it may therefore improve the performance of judokas. This study, through a double-blind crossover (supplement vs placebo) protocol, investigated the effects of caffeine supplementation (a single capsule containing 5 mg/kg body mass, taken 60 min before the session) on biochemical, anthropometrical, physical, subjective, and hemodynamic variables measured before, during, and after two typical judo training sessions (120 min: 40 min of gymnastics; 40 min of specific techniques; and 40 min of judo combat). Methods Eight high-level athletes (21.4 ± 2.0 years; 83.6 ± 15.2 kg; 1.8 ± 0.1 m; 17.9 ± 7.0 fat%) were evaluated before and after each training session for body mass, hydration, upper and lower limb power, performance in the special judo fitness test (SJFT), free fatty acids (FFA) in plasma, uric acid, glucose, lactate, heart rate, and pain. In addition, heart rate, FFA in plasma, uric acid, glucose, lactate, rating of perceived exertion, and pain were assessed during the training. Results At 120 min, supplementation resulted in a higher concentration of plasma FFA (1.5 ± 0.5 vs 1.0 ± 0.3 mmol/L; p = 0.047) and lactate (4.9 ± 1.8 vs 3.0 ± 1.2 mmol/L; p = 0.047), and a lower concentration of uric acid (5.4 ± 0.9 vs 7.0 ± 1.5 mg/dL; p = 0.04). Supplementation also resulted in performance maintenance (fatigue index) in the SJFT (Δ0.3 ± 2.0 vs Δ1.7 ± 2.5, for caffeine and placebo respectively; p = 0.046). No adverse effects were observed. Conclusion Based on the applied dose, intake time, and sample of this study, we can conclude that caffeine produces an ergogenic biochemical effect and improves performance in judo athletes.


2016
Vol 11 (6)
pp. 707-714
Author(s):  
Benoit Capostagno ◽  
Michael I. Lambert ◽  
Robert P. Lamberts

Finding the optimal balance between high training loads and recovery is a constant challenge for cyclists and their coaches. Monitoring improvements in performance and levels of fatigue is recommended to correctly adjust training to ensure optimal adaptation. However, many performance tests require a maximal or exhaustive effort, which reduces their real-world application. The purpose of this review was to investigate the development and use of submaximal cycling tests that can be used to predict and monitor cycling performance and training status. Twelve studies met the inclusion criteria, and 3 separate submaximal cycling tests were identified within them. Submaximal variables including gross mechanical efficiency, oxygen uptake (VO2), heart rate, lactate, predicted time to exhaustion (pTE), rating of perceived exertion (RPE), power output, and heart-rate recovery (HRR) were the components of the 3 tests. pTE, submaximal power output, RPE, and HRR appear to have the most value for monitoring improvements in performance and indicating a state of fatigue. This literature review shows that several submaximal cycle tests have been developed over the last decade with the aim to predict, monitor, and optimize cycling performance. To be able to conduct a submaximal test on a regular basis, the test needs to be short in duration and as noninvasive as possible. In addition, a test should capture multiple variables and use multivariate analyses to interpret the submaximal outcomes correctly and alter training prescription if needed.


2017
Vol 39 (02)
pp. 115-123
Author(s):  
Manuel Garnacho-Castaño ◽  
Raúl Domínguez ◽  
Arturo Muñoz González ◽  
Raquel Feliu-Ruano ◽  
Noemí Serra-Payá ◽  
...  

Abstract The present study aimed to compare two fitness-training methodologies, instability circuit resistance training (ICRT) versus traditional circuit resistance training (TCRT), applying an experimental model of exercise prescription controlling and modulating exercise load using the Borg rating of perceived exertion. Forty-four healthy young adults (age 21.6±2.3 years) were randomly assigned to three groups: TCRT (n=14), ICRT (n=14), and a control group (n=16). Strength and cardiorespiratory tests were chosen to evaluate cardiorespiratory and muscular fitness before and after the training program. In cardiorespiratory data, a significant difference was observed for the time effect in VO2max, peak heart rate, peak velocity, and heart rate at anaerobic threshold intensity (p<0.05) in the experimental groups. In strength variables, a significant Group x Time interaction effect was detected in 1RM, in mean propulsive power, and in peak power (p≤0.01) in the back squat exercise. In the bench press exercise, a significant time effect was detected in 1RM, in mean propulsive power, and in peak power, and a Group x Time interaction in peak power (all p<0.05). We can conclude that applying an experimental model of exercise prescription using RPE improved cardiorespiratory and muscular fitness in healthy young adults in both experimental groups.


Author(s):  
Markus N.C. Williams ◽  
Vincent J. Dalbo ◽  
Jordan L. Fox ◽  
Cody J. O’Grady ◽  
Aaron T. Scanlan

Purpose: To compare weekly training and game demands according to playing position in basketball players. Methods: A longitudinal, observational study was adopted. Semiprofessional, male basketball players categorized as backcourt (guards; n = 4) and frontcourt players (forwards/centers; n = 4) had their weekly workloads monitored across an entire season. External workload was determined using microsensors and included PlayerLoad™ (PL) and inertial movement analysis variables. Internal workload was determined using heart rate to calculate absolute and relative summated-heart-rate-zones workload and rating of perceived exertion (RPE) to calculate session-RPE workload. Comparisons between weekly training and game demands were made using linear mixed models and effect sizes in each positional group. Results: In backcourt players, higher relative PL (P = .04, very large) and relative summated-heart-rate-zones workload (P = .007, very large) were evident during training, while greater session-RPE workload (P = .001, very large) was apparent during games. In frontcourt players, greater PL (P < .001, very large), relative PL (P = .019, very large), peak PL intensities (P < .001, moderate), high-intensity inertial movement analysis events (P = .002, very large), total inertial movement analysis events (P < .001, very large), summated-heart-rate-zones workload (P < .001, very large), RPE (P < .001, very large), and session-RPE workload (P < .001, very large) were evident during games. Conclusions: Backcourt players experienced similar demands between training and games across several variables, with higher average workload intensities during training. Frontcourt players experienced greater demands across all variables during games than training. These findings emphasize the need for position-specific preparation strategies leading into games in basketball teams.
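The internal-load metrics used above (and in several abstracts here) have simple standard definitions: Edwards' summated heart rate zones weights the minutes spent in five %HRmax bands by 1-5 and sums them, and session-RPE multiplies a CR-10 exertion rating by session duration. A minimal sketch, assuming those standard definitions (the papers' exact implementations are not given here):

```python
import numpy as np

def edwards_trimp(hr_series_bpm, hr_max, dt_min=1/60):
    """Edwards' summated-heart-rate-zones load: minutes spent in each of
    five %HRmax zones (50-60, 60-70, 70-80, 80-90, 90-100) weighted 1-5
    and summed. hr_series_bpm: heart-rate samples; dt_min: sampling
    interval in minutes (default: one sample per second)."""
    pct = np.asarray(hr_series_bpm) / hr_max * 100
    load = 0.0
    for weight, (lo, hi) in enumerate([(50, 60), (60, 70), (70, 80),
                                       (80, 90), (90, 200)], start=1):
        load += weight * np.sum((pct >= lo) & (pct < hi)) * dt_min
    return load

def session_rpe(rpe_cr10, duration_min):
    """Session-RPE internal load (Foster): CR-10 rating x session minutes."""
    return rpe_cr10 * duration_min

# A 90-min session rated 6 on the CR-10 scale gives a load of 540 AU.
load = session_rpe(6, 90)
```

Relative (per-minute) versions of these workloads, as reported for the backcourt players, simply divide each total by the session duration.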

