The effect of ball temperature on ball speed and carry distance in golf drives

Author(s): Magnus Carlsson, Johnny Nilsson, John Hellström, Fredrik Tinmark, Tomas Carlsson

The purpose of this study was to investigate the effect of ball temperature on impact ball speed and carry distance during golf drives in a blind randomized test design. The balls were exposed to a temperature-controlled environment (4 °C, 18 °C, 32 °C, and 46 °C) for 24 h prior to the test, and each temperature group consisted of 30 balls. The 120 drives were performed by an elite male golfer (handicap: 0.0) on an indoor driving range. All drives were measured by a Doppler-radar system to determine club-head speed, launch angle, spin rate, ball speed, and carry distance. Differences between the groups were investigated using a one-way analysis of variance. The results indicated that ball-speed and carry-distance differences occurred among the four groups (p < 0.001 and p < 0.01, respectively). The post hoc analyses showed that balls at 18 °C and 32 °C had greater ball speeds and carry distances than balls at 4 °C and 46 °C (all p < 0.05). The between-group differences were 0.6–0.7 m·s⁻¹ for ball speed and 2.9–3.9 m for carry distance. Hence, the results showed that ball temperature influences both ball speed and carry distance. Based on these findings, standardization of ball temperature should be factored into governing-body regulation tests for golf equipment.
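The group comparison above is a one-way ANOVA. As a minimal sketch of that computation, the F statistic can be derived from first principles; the ball-speed values below are invented placeholders, not the study's data.

```python
# One-way ANOVA F statistic from first principles: ratio of between-group
# to within-group mean squares. Illustrative data only.

def one_way_anova_f(groups):
    """F statistic for k independent samples (one list of values per group)."""
    k = len(groups)                              # number of groups
    n = sum(len(g) for g in groups)              # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (group size times squared mean deviation)
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (squared deviations from each group mean)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical ball speeds (m/s) for the 4, 18, 32 and 46 degree C groups
speeds = [
    [74.1, 74.3, 74.0],
    [74.8, 75.0, 74.9],
    [74.9, 74.8, 75.0],
    [74.2, 74.1, 74.3],
]
f_stat = one_way_anova_f(speeds)   # large F -> group means differ
```

A large F relative to the F(k-1, n-k) distribution is what yields the small p values the abstract reports; the post hoc pairwise tests then localize which temperature groups differ.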

2004, Vol 18 (1), pp. 13-26
Author(s): Antoinette R. Miller, J. Peter Rosenfeld

Abstract: University students were screened using items from the Psychopathic Personality Inventory and divided into high (n = 13) and low (n = 11) Psychopathic Personality Trait (PPT) groups. The P300 component of the event-related potential (ERP) was recorded as each group completed a two-block autobiographical oddball task, responding honestly during the first (Phone) block, in which the oddball items were participants' home phone numbers, and then feigning amnesia in response to approximately 50% of items in the second (Birthday) block, in which the oddball items were participants' birthdates. Bootstrapping of peak-to-peak amplitudes correctly identified 100% of low-PPT and 92% of high-PPT participants as having intact recognition. Both groups demonstrated malingering-related P300 amplitude reduction. For the first time, P300 amplitude and topography differences were observed between honest and deceptive responses to Birthday items. No main between-group P300 effects were found. Post hoc analysis revealed between-group differences in a frontally located post-P300 component: in the low-PPT group only, honest responses were associated with larger late frontal amplitudes than deceptive responses.


Author(s): Matthew R. Moreno, Karly A. Rodas, Ashley M. Bloodgood, J. Jay Dawes, Joseph M. Dulla, ...

This study captured the heart rate (HR) responses of custody assistant (CA) recruits undertaking circuit training sessions. Data from 10 male and 12 female CA recruits were analyzed. Based on YMCA step test recovery HR, recruits were divided into higher-fitness (HF; top 25%), lower-fitness (LF; bottom 25%), and moderate-fitness (MF; remaining recruits) groups. HR was measured during two circuit training sessions featuring calisthenics and running. HR zones were defined as: very light (<57% of age-predicted maximum heart rate [HRmax]); light (57–63% HRmax); moderate (64–76% HRmax); vigorous (77–95% HRmax); and very vigorous (>95% HRmax). A one-way ANOVA with Bonferroni post hoc tests was used to calculate between-group differences in time spent, and percentage of total time, in each HR zone. In session one, the LF group spent less time in the light zone than the MF group, and more time in the very vigorous zone than the HF group (p = 0.027–0.047). In session two, the LF group spent more time in the moderate zone than both other groups, and a greater percentage of time in the very vigorous zone than the MF group (p = 0.002–0.004). LF recruits generally worked harder during circuit training than their fitter counterparts, which supports recommendations for ability-based modifications.
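The zone definitions above map directly onto a classification helper. This sketch assumes the common HRmax = 220 − age estimate, which the abstract does not specify; the example age and HR samples are hypothetical.

```python
# Classify a heart-rate sample into the study's intensity zones, given an
# age-predicted maximum. HRmax = 220 - age is an assumed formula; the
# abstract only says "age-predicted maximum".

def hr_zone(hr, age):
    hrmax = 220 - age              # assumed age-predicted maximum
    pct = hr / hrmax * 100
    if pct < 57:
        return "very light"
    if pct < 64:                   # 57-63% HRmax
        return "light"
    if pct < 77:                   # 64-76% HRmax
        return "moderate"
    if pct <= 95:                  # 77-95% HRmax
        return "vigorous"
    return "very vigorous"         # >95% HRmax

# Example: a hypothetical 25-year-old recruit (HRmax = 195 bpm)
zones = [hr_zone(hr, 25) for hr in (100, 120, 140, 170, 190)]
```

Accumulating time per zone across a session (sample interval times zone counts) gives exactly the "time in zone" variables the ANOVA compared.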


2019, Vol 65 (1), pp. 21-29
Author(s): Shuixia Guo, Ningning He, Zhening Liu, Zeqiang Linli, Haojuan Tao, ...

Background: The functional dysconnectivity observed in functional magnetic resonance imaging (fMRI) studies of schizophrenia is also seen in unaffected siblings, indicating its association with the genetic diathesis. We intended to apportion resting-state dysconnectivity into components that represent genetic diathesis, clinical expression or treatment effect, and resilience. Methods: fMRI data were acquired from 28 schizophrenia patients, 28 unaffected siblings, and 60 healthy controls. Based on Dosenbach's atlas, we extracted time series for 160 regions of interest. After constructing the functional network, we investigated between-group differences in the strength and diversity of functional connectivity and in the topological properties of undirected graphs. Results: Using analysis of variance, we found 88 dysconnectivities. Post hoc t tests revealed that 62.5% were associated with genetic diathesis and 21.6% with clinical expression. Topologically, we observed increased degree, clustering coefficient, and global efficiency in the sibling group compared to both patients and controls. Conclusion: A large portion of the resting-state functional dysconnectivity seen in patients represents a genetic diathesis effect. The most prominent network-level disruption is the dysconnectivity among nodes of the default mode and salience networks. Despite their predisposition, unaffected siblings show a pattern of resilience in the emergent connectomic topology. Our findings could potentially help refine imaging genetics approaches currently used in the pursuit of the pathophysiology of schizophrenia.
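The three topological measures reported (degree, clustering coefficient, global efficiency) have standard graph-theoretic definitions for undirected, unweighted networks. A self-contained sketch on a toy four-node graph, not the study's 160-node connectome:

```python
from collections import deque

# Graph metrics for an undirected, unweighted graph given as an adjacency
# dict. Toy network: a triangle (A, B, C) plus a pendant node D.

def clustering(adj, node):
    """Fraction of a node's neighbour pairs that are themselves connected."""
    nbrs = list(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2 * links / (k * (k - 1))

def global_efficiency(adj):
    """Mean of 1/shortest-path-length over all ordered node pairs (BFS)."""
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for src in nodes:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(1.0 / d for d in dist.values() if d > 0)
    return total / (n * (n - 1))

adj = {"A": {"B", "C"}, "B": {"A", "C", "D"}, "C": {"A", "B"}, "D": {"B"}}
deg_B = len(adj["B"])          # degree = number of neighbours
cc_B = clustering(adj, "B")    # 1 of 3 neighbour pairs linked
eff = global_efficiency(adj)
```

In connectomics these metrics are computed per node (or averaged over the graph) after thresholding the functional connectivity matrix into an adjacency structure.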


2019, Vol 34 (6), pp. 1002-1002
Author(s): K Hassara, D Pulsipher, L Stanford, B Schneider, E Krapf

Abstract Objective: This study seeks to examine whether personal psychiatric history (PPH) and/or family psychiatric history (FPH) are related to prolonged concussion recovery and increased post-concussive symptoms (PCs) in concussed children and adolescents. We hypothesized that individuals with PPH/FPH would endorse a greater number of, and more severe, PCs relative to those with concussion only or concussion with either PPH or FPH. Methods: Data from 255 concussed 8- to 18-year-olds (median = 15.50 years, range = 10.25 years) were retrospectively examined from a clinical database, excluding patients with confounding medical comorbidities. PCs (i.e., total symptom count and severity [frequency, intensity, and duration of symptoms]) were compared among four groups (concussion only [n = 80], concussion + PPH [n = 14], concussion + FPH [n = 125], and concussion + PPH/FPH [n = 36]) using a multivariate Kruskal-Wallis test and post hoc Mann-Whitney U tests. Results: The omnibus analysis indicated group differences for injury interval (p = 0.05) and PCs severity (p = 0.002). Post hoc analyses indicated that patients with concussion + PPH/FPH reported greater PCs severity than those with concussion only (U = 726.00, p = 0.0001, r = 0.36) and those with concussion + FPH (U = 1203.00, p = 0.003, r = 0.23). Injury interval was greater for patients with concussion + FPH than for those with concussion alone (U = 3474.50, p = 0.007, r = 0.19). Other group differences were non-significant. Conclusions: All groups reported a similar number of PCs. FPH contributes to symptom severity when combined with PPH; PPH alone did not significantly affect PCs severity. Findings suggest that providers should screen for both PPH and FPH at the time of concussion diagnosis. Early identification of risk factors may lead to targeted intervention, thereby reducing persistent PCs.
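The U statistics and r effect sizes quoted above come from rank-based comparisons. A sketch of the Mann-Whitney U computation with tie-averaged ranks and the normal-approximation effect size r = |Z| / sqrt(N) (no continuity or tie correction, so an approximation); the severity scores are invented.

```python
import math

# Mann-Whitney U and effect size r for two independent samples.
# Illustrative symptom-severity scores only.

def mann_whitney_u(a, b):
    # Rank all observations together, averaging ranks for tied values
    combined = sorted(a + b)
    ranks = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        ranks[combined[i]] = (i + 1 + j) / 2   # average of ranks i+1 .. j
        i = j
    n1, n2 = len(a), len(b)
    r1 = sum(ranks[x] for x in a)              # rank sum of sample a
    u1 = r1 - n1 * (n1 + 1) / 2
    u = min(u1, n1 * n2 - u1)                  # report the smaller U
    # Normal approximation for Z, then r = |Z| / sqrt(N)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    return u, abs(z) / math.sqrt(n1 + n2)

group_a = [10, 12, 14, 18, 20]   # hypothetical higher-severity group
group_b = [3, 4, 5, 6, 11]       # hypothetical lower-severity group
u, r = mann_whitney_u(group_a, group_b)
```

By conventional benchmarks, r around 0.1, 0.3, and 0.5 is read as a small, medium, and large effect, which frames the r = 0.19-0.36 values in the abstract.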


2008, Vol 22 (6), pp. 672-675
Author(s): Mark G. Bowden, Chitralakshmi K. Balasubramanian, Andrea L. Behrman, Steven A. Kautz

Background. For clinical trials in stroke rehabilitation, self-selected walking speed has been used to stratify persons to predict functional walking status and to define clinical meaningfulness of changes. However, this stratification was validated primarily using self-report questionnaires. Objective. This study aims to validate the speed-based classification system with quantitative measures of walking performance. Methods. A total of 59 individuals who had hemiparesis for more than 6 months after stroke participated in this study. Spatiotemporal and kinetic measures included the percentage of total propulsion generated by the paretic leg (Pp), the percentage of the stride length accounted for by the paretic leg step length (PSR), and the percentage of the gait cycle spent in paretic preswing (PPS). Additional measures included the synergy portion of the Fugl-Meyer Assessment and the average number of steps/day in the home and community measured with a step activity monitor. Participants were stratified by self-selected gait speed into 3 groups: household (<0.4 m/s), limited community (0.4-0.8 m/s), and community (>0.8 m/s) ambulators. Group differences were analyzed using a Kruskal-Wallis H test with rank sums test post hoc analyses. Results. Analyses demonstrated a main effect in all measures, but only steps/day and PPS demonstrated a significant difference between all 3 groups. Conclusions. Classifying individuals poststroke by self-selected walking speed is associated with home and community-based walking behavior as quantified by daily step counts. In addition, PPS distinguishes all 3 groups. Pp differentiates the moderate from the fast groups and may represent a contribution to mechanisms of increasing walking speed. Speed classification presents a useful yet simple mechanism to stratify subjects poststroke and may be mechanically linked to changes in PPS.
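The stratification scheme above is a direct threshold mapping on self-selected speed, which can be expressed as:

```python
# Speed-based ambulation classification from the abstract. The boundary
# values 0.4 and 0.8 m/s are taken from the text; assigning the exact
# boundary speeds to the lower class is an assumption, since the abstract
# writes the intervals as <0.4, 0.4-0.8, and >0.8 m/s.

def ambulation_class(speed_m_s):
    if speed_m_s < 0.4:
        return "household"
    if speed_m_s <= 0.8:
        return "limited community"
    return "community"

classes = [ambulation_class(v) for v in (0.3, 0.6, 1.1)]
```

With this mapping, each participant's gait-lab speed assigns a stratum, and the quantitative measures (steps/day, PPS, Pp, PSR) are then compared across the three strata.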


2021, Vol 11 (1)
Author(s): Janina Wirtz, Leonie Ascone, Josefine Gehlenborg, Steffen Moritz, Simone Kühn

Abstract: Imaginal retraining is a variant of approach bias modification that transfers the method into one's own mind. As the technique contains multiple elements, this pilot study aimed to dismantle which of its components is most efficient in reducing craving for high-calorie food. A total of 113 women were randomly allocated to one of six conditions containing a short intervention to mentally manipulate a picture displaying high-calorie foods. Four of the interventions involved different combinations of elements of the imaginal retraining technique, while the remaining two conditions comprised thought suppression or merely observing a picture. Participants rated their level of craving, as well as three pictures containing healthy and unhealthy foods regarding their pleasantness, before and after the interventions took place. Within-group changes were assessed with paired t-tests (for non-normal data, Wilcoxon signed-rank tests) and between-group differences with one-way ANOVAs (non-parametric: Kruskal-Wallis tests). A trend-level reduction in craving was found in the imaginal retraining conditions with and without a movement. A post hoc analysis pooling both conditions showed a statistically significant reduction in craving. In addition, positive picture appraisal for unhealthy foods was significantly reduced in both imaginal retraining conditions (with and without movement) with medium to large effect sizes. This study demonstrated that imaginal retraining with an arm movement can significantly reduce craving and picture appraisal for high-calorie foods in a one-time application. It is a promising technique to reduce appraisal for unhealthy high-calorie foods. Future studies should repeat the experiment in situations of high craving and allow for a personalized selection of stimuli.


2020, Vol 22 (4), pp. 23-33
Author(s): Won-Ho Choi, Yun-A Shin

OBJECTIVES: Several studies have reported that weighted baseball (WB) training is effective in improving ball speed; however, the ball weight suitable for training remains unclear. This study aimed to investigate the changes in muscle activity during pitching using 5- to 12-oz WBs and to provide basic data for training programs to improve pitching speed. METHODS: The subjects of this study were 10 overhand pitchers who had more than 5 years of experience. Muscle activity was measured and analyzed at 70–85% of throwing baseball maximum effort (TBME) during soft toss (ST), and TBME was evaluated using electromyography. RESULTS: As ball weight increased, muscle activity also increased in all pitching phases. Muscle activity was higher during ST with WBs heavier than 10 or 11 oz than during TBME, indicating that the loads on the shoulder and elbow joint muscles increased. Conversely, muscle activity during ST with 5- to 7-oz WBs was lower than that during TBME, although phase and muscle-group differences were observed. CONCLUSIONS: The results of this study suggest that training with 8- to 10-oz WBs could increase muscle strength and activity, although the effect may vary with fitness level and muscle strength.


2020, Vol 63 (11), pp. 3586-3593
Author(s): Abigail E. Haenssler, Xiangming Fang, Jamie L. Perry

Purpose: Velopharyngeal (VP) ratios are commonly used to study normal VP anatomy and normal VP function. An effective VP (EVP) ratio may be a more appropriate indicator of normal parameters for speech. The aims of this study are to examine whether the VP ratio is preserved across the age span or varies with changes in the VP portal, and to analyze whether the EVP ratio is more stable across the age span. Method: Magnetic resonance imaging was used to analyze VP variables of 270 participants. For statistical analysis, the participants were divided into the following groups based on age: infants, children, adolescents, and adults. Analyses of variance and a Games-Howell post hoc test were used to compare variables between groups. Results: There was a statistically significant difference (p < .05) in all measurements between the age groups. Pairwise comparisons showed statistically significant adjacent-group differences (p < .05) for velar length, VP ratio, effective velar length, adenoid depth, and pharyngeal depth. No statistically significant differences between adjacent age groups were found for the EVP ratio. Conclusions: Results from this study show the EVP ratio did not differ significantly between adjacent age groups, whereas the VP ratio did. This study suggests that the EVP ratio is more closely related to VP function than the VP ratio and provides a more stable and consistent ratio of VP function across the age span.


2021, Vol 14 (1), pp. 137-143
Author(s): Takeru Suzuki, John Patrick Sheahan, Taiki Miyazawa, Isao Okuda, Daisuke Ichikawa

Background: Golfers face different environmental conditions in each game played under various constraints. Enhancing affordances through training in a constrained outdoor environment is crucial. Objective: To analyze club-head behavior at ball impact of a tee shot by 42 professional (PGs) and 25 amateur (AGs) golfers swinging to uphill and downhill fairway environments, using the TrackMan portable launch monitor. Methods: We used TrackMan to compare golf club movement adaptations in 42 PGs and 25 AGs. A 330-m driving range facing both an uphill (+5°) and a downhill (-5°) fairway was used. The tee shot area was the only flat ground surface, with uneven ground between the shot area and the 200-yard fairway. Results: Clubhead speed and attack angle were significantly higher among PGs than among AGs. PGs adapted their swings to the uphill fairway by increasing the attack angle (3.6° ± 2.4°) by 3.3° compared with the downhill fairway. The attack angle did not correlate with the launch angle among the AGs in the downhill condition, suggesting that they were unable to control the height of the ball relative to the far side of the fairway. Conclusion: PGs increased the attack angle in uphill conditions, and their awareness of the affordance, which differed from that of AGs, allowed them to change the optimal ball trajectory to avoid perceived fairway risks. Thus, the more skill a player had, the better he was at recognizing the affordance of the visual field. PGs demonstrated a better ability to adapt to environmental constraints.


Author(s): Chantelle Rigozzi, Jeremy Cox, Gareth A Vio, William L Martens, Philip Poronnik

Elbow tendinopathy injuries are very common in tennis players. One commonly accepted theory describing the development of elbow tendinopathy in tennis is based on stiffness of the forearm skeletal muscle units and their repetitive overuse in the forehand stroke. Our objective was to use a novel microcontroller-based wearable device to compare the influence of different forehand spin levels (flat, topspin, and lob) and ball exit speed on forearm muscle activity, in relation to the potential onset of elbow tendinopathy, in experienced adult tennis players. Peak normalised extensor carpi radialis (ECR) and flexor carpi radialis (FCR) muscle activity corresponding to each forehand shot and ball exit speed were determined and analysed. For the ECR shots (flat = 121, topspin = 272, and lob = 273) by 8 players, a Kruskal-Wallis test (p < 0.001) and post hoc tests revealed a significant difference between the flat and topspin spin levels (p < 0.01) and between the flat and lob spin levels (p < 0.001). For the FCR shots (flat = 125, topspin = 301, and lob = 303) by 9 players, a Kruskal-Wallis test showed no significant difference between the three spin levels. For the corresponding ball speed, the Kruskal-Wallis test (p < 0.001) and subsequent post hoc tests (p < 0.001) showed that flat hits had the highest ball speed, followed by topspin and then lob, for the shots included for both muscles. Our results suggest that coaches could consider recommending that players hit forehands with topspin in order to potentially reduce the risk of developing lateral elbow tendinopathy.
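The dependent variable here is peak normalised muscle activity per shot. A minimal sketch of that normalisation step, assuming a rectified EMG envelope and a reference maximum (e.g., from a maximal contraction); all values are arbitrary illustrative units, not the study's recordings.

```python
# Peak normalisation of an EMG envelope: the peak activity within a shot
# window is expressed as a percentage of a reference maximum, so that
# shots can be compared across players and sessions. Hypothetical values.

def peak_normalised(envelope, reference_max):
    """Peak of the shot envelope as a percentage of the reference maximum."""
    return max(envelope) / reference_max * 100

shot_envelope = [0.12, 0.35, 0.61, 0.48, 0.20]   # hypothetical rectified EMG
reference_max = 0.9                               # hypothetical reference value
pct = peak_normalised(shot_envelope, reference_max)
```

Collecting one such percentage per shot, grouped by spin level, yields the samples the Kruskal-Wallis and post hoc tests compared.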

