Trends in match injury risk in professional male rugby union: a 16-season review of 10 851 match injuries in the English Premiership (2002–2019): the Professional Rugby Injury Surveillance Project

2020 ◽  
pp. bjsports-2020-102529
Author(s):  
Stephen W West ◽  
Lindsay Starling ◽  
Simon Kemp ◽  
Sean Williams ◽  
Matthew Cross ◽  
...  

Objectives The Professional Rugby Injury Surveillance Project is the largest and longest-running rugby union injury surveillance project globally and focuses on the highest level of rugby in England. Methods We examined match injuries in professional men’s rugby over the period 2002/2003 to 2018/2019 and described trends in injuries over this time. Results Over the period 2002/2003–2018/2019, 10 851 injuries occurred in 124 952 hours of match play, equating to a mean of 57 injuries per club per season and one injury per team per match. The mean incidence, severity (days absence) and burden (days absence/1000 hours) of injury were 87/1000 hours (95% CI 82 to 92), 25 days (95% CI 22 to 28) and 2178 days/1000 hours (95% CI 1872 to 2484), respectively. The tackle accounted for 43% of injuries, with running the second most common activity during injury (12%). The most common injury location was the head/face, with an incidence of 11.3/1000 hours, while the location with the highest overall burden was the knee (11.1 days/1000 hours). Long-term trends demonstrated stable injury incidence and proportion of injured players, but an increase in the mean and median severity of injuries. Concussion incidence, severity and burden increased from the 2009/2010 season onwards, and from 2011 to 2019 concussion was the most common injury. Conclusion The rise in overall injury severity and concussion incidence are the most significant findings of this work and demonstrate the need for continued efforts to reduce concussion risk, as well as a greater understanding of changes in injury severity over time.
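The headline rates in this abstract are internally consistent and can be checked directly: incidence is injuries per 1000 player-hours, and burden is approximately incidence multiplied by mean severity. A minimal sketch using the figures reported above (the helper name is ours, not the project's):

```python
def rate_per_1000h(events: float, exposure_hours: float) -> float:
    """Events per 1000 player-hours of exposure."""
    return events / exposure_hours * 1000

# Figures taken from the abstract.
injuries = 10_851
match_hours = 124_952
mean_severity = 25  # days absence per injury

incidence = rate_per_1000h(injuries, match_hours)  # ≈87 injuries/1000 h
burden = incidence * mean_severity                 # ≈2171 days/1000 h,
                                                   # close to the reported 2178
```

The small gap between ≈2171 and the reported 2178 days/1000 hours is expected, since the paper's burden figure is computed from unrounded severity.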

2019 ◽  
Vol 7 (4) ◽  
pp. 232596711983764 ◽  
Author(s):  
Daniel T. Hoffman ◽  
Dan B. Dwyer ◽  
Jacqueline Tran ◽  
Patrick Clifton ◽  
Paul B. Gastin

Background: Injury surveillance has been used to quantify the scope of the injury burden in Australian football. However, deeper statistical analyses are required to identify major factors that contribute to the injury risk and to understand how these injury patterns change over time. Purpose: To compare Australian Football League (AFL) injury incidence, severity, prevalence, and recurrence by setting, site, and time span from 1997 to 2016. Study Design: Descriptive epidemiology study. Methods: A total of 15,911 injuries and medical illnesses recorded by team medical staff at each club were obtained from the AFL’s injury surveillance system and analyzed using linear mixed models with 3 fixed effects (setting, time span, site) and 1 random effect (club). All types of injuries and medical illnesses were included for analysis, provided that they caused the player to miss at least 1 match during the regular season or finals. Five-season time spans (1997-2001, 2002-2006, 2007-2011, and 2012-2016) were used for comparisons. Incidence rates were expressed at the player level. Recurrences were recoded to quantify recurrent injuries across multiple seasons. Results: Compared with training injuries, match injuries had a 2.8 times higher incidence per season per club per player (matches: 0.070 ± 0.093; training: 0.025 ± 0.043; P < .001). Match injuries resulted in 1.9 times more missed matches per club per season (matches: 17.2 ± 17.0; training: 9.1 ± 10.5; P < .001) and were more likely to be recurrences (matches: 11.6% ± 20.0%; training: 8.6% ± 21.8%; P < .001). From the 1997-2001 to 2007-2011 time spans, overall injury severity increased from a mean of 3.2 to 3.7 missed matches (P ≤ .01). For the most recent 2012-2016 time span, injuries resulted in 3.6 missed matches, on average.
Hip/groin/thigh injuries had the highest incidence (0.125 ± 0.120) and prevalence (19.2 ± 16.4) rates, and recurrences (29.3% ± 27.9%) were 15% more likely at this site than any other injury site. Conclusion: The risks of match injuries are significantly higher than those of training injuries in the AFL. Compared with the 1997-2001 time span, injuries became more severe during the 2007-2011 time span.
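The match-versus-training contrasts above are ratios of the reported per-player and per-club means, which can be reproduced directly (this only checks the ratios; it does not reproduce the paper's mixed-model inference):

```python
# Per-player-per-season incidence means from the abstract.
match_incidence, training_incidence = 0.070, 0.025
incidence_ratio = round(match_incidence / training_incidence, 1)  # 2.8

# Missed matches per club per season.
match_missed, training_missed = 17.2, 9.1
missed_ratio = round(match_missed / training_missed, 1)  # 1.9
```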


2008 ◽  
Vol 26 (8) ◽  
pp. 2069-2080 ◽  
Author(s):  
N. B. Gudadze ◽  
G. G. Didebulidze ◽  
L. N. Lomidze ◽  
G. Sh. Javakhishvili ◽  
M. A. Marsagishvili ◽  
...  

Abstract. Long-term observations of total nightglow intensity of the atomic oxygen red 630.0 nm line at Abastumani (41.75° N, 42.82° E) in 1957–1993 and measurements of the ionosphere F2 layer parameters from the Tbilisi ionosphere station (41.65° N, 44.75° E) in 1963–1986 have been analyzed. It is shown that a decrease in the long-term trend of the mean annual red 630.0 nm line intensity from the pre-midnight value (+0.770±1.045 R/year) to its minimum negative value (−1.080±0.670 R/year) at the midnight/after midnight is a possible result of the observed lowering of the peak height of the ionosphere F2 layer electron density hmF2 (−0.455±0.343 km/year). A theoretical simulation is carried out using a simple Chapman-type layer (damping in time) for the height distribution of the F2 layer electron density. The estimated values of the lowering in the hmF2, the increase in the red line intensity at pre-midnight and its decrease at midnight/after midnight are close to their observational ones, when a negative trend in the total neutral density of the upper atmosphere and an increase in the mean northward wind (or its possible consequence – a decrease in the southward one) are assumed.


2020 ◽  
Author(s):  
Shadi Zabad ◽  
Alan M Moses

Abstract. We study the evolution of quantitative molecular traits in the absence of selection. Using a simple theory based on Felsenstein’s 1981 DNA substitution model, we predict a linear restoring force on the mean of an additive phenotype. Remarkably, the mean dynamics are independent of the effect sizes and genotype and are similar to the widely used OU model for stabilizing selection. We confirm the predictions empirically using additive molecular phenotypes calculated from ancestral reconstructions of putatively unconstrained DNA sequences in primate genomes. We show that the OU model is favoured by inference software even when applied to GC content of unconstrained sequences or simulations of DNA evolution. We predict and confirm empirically that the dynamics of the variance are more complicated than those predicted by the OU model, and show that our results for the restoring force of mutation hold even for non-additive phenotypes, such as number of transcription factor binding sites, longest encoded peptide and folding propensity of the encoded peptide. Our results have implications for efforts to infer selection based on quantitative phenotype dynamics as well as to understand long-term trends in evolution of quantitative molecular traits.
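The key point, that mutation alone produces OU-like mean dynamics, can be illustrated with GC content under an F81-style model: the expected phenotype relaxes exponentially toward the mutational equilibrium, just as an OU mean relaxes toward its optimum. A minimal sketch with illustrative parameters (the equilibrium GC and rate below are not from the paper):

```python
import math

def expected_gc(gc0: float, pi_gc: float, mu: float, t: float) -> float:
    """Expected GC content after time t under an F81-style substitution
    model: the mean decays exponentially toward the mutational
    equilibrium pi_gc, mirroring the mean dynamics of an OU process."""
    return pi_gc + (gc0 - pi_gc) * math.exp(-mu * t)

# Illustrative parameters: equilibrium GC 0.42, substitution rate 1.0.
pi_gc, mu = 0.42, 1.0
trajectory = [expected_gc(0.80, pi_gc, mu, t) for t in (0.0, 0.5, 1.0, 2.0, 5.0)]
# The mean relaxes from 0.80 toward 0.42 regardless of the starting sequence.
```

Note this captures only the mean; as the abstract stresses, the variance dynamics under mutation are more complicated than the OU prediction.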


2018 ◽  
Vol 158 (6) ◽  
pp. 1028-1034 ◽  
Author(s):  
Neil Pathak ◽  
Rance J. T. Fujiwara ◽  
Saral Mehra

Objective To characterize, describe, and compare nonresearch industry payments made to otolaryngologists in 2014 and 2015. Additionally, to describe industry payment variation within otolaryngology and among other surgical specialties. Study Design Retrospective cross-sectional database analysis. Setting Open Payments Database. Subjects and Methods Nonresearch payments made to US otolaryngologists were characterized and compared by payment amount, nature of payment, sponsor, and census region between 2014 and 2015. Payments in otolaryngology were compared with those in other surgical specialties. Results From 2014 to 2015, there was an increase in the number of compensated otolaryngologists (7903 vs 7946) and in the mean payment per compensated otolaryngologist ($1096 vs $1242), as well as a decrease in the median payment per compensated otolaryngologist ($169 vs $165, P = .274). Approximately 90% of total payments made in both years were attributed to food and beverage. Northeast census region otolaryngologists received the highest median payment in 2014 and 2015. Compared with other surgical specialists, otolaryngologists received the lowest mean payment in 2014 and 2015 and the second-lowest and lowest median payment in 2014 and 2015, respectively. Conclusion The increase in the mean payment and number of compensated otolaryngologists can be explained by normal annual variation, stronger industry-otolaryngologist relationships, or improved reporting; additional years of data and improved public awareness of the Sunshine Act will facilitate determining long-term trends. The large change in disparity between the mean and median from 2014 to 2015 suggests greater payment variation. Otolaryngologists continue to demonstrate limited industry ties when compared with other surgical specialists.
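The pattern the abstract highlights, a rising mean alongside a falling median, is the signature of a right-skewed payment distribution in which a few large payments pull the mean up while the typical payment stays small. A toy illustration (synthetic figures, not Open Payments records):

```python
import statistics

# Synthetic payments: many small food-and-beverage amounts plus one
# large consulting payment (illustrative values only).
payments = [20, 25, 30, 40, 60, 150, 50_000]

mean = statistics.mean(payments)      # dominated by the single large payment
median = statistics.median(payments)  # the typical small payment
assert mean > median                  # right skew: mean and median diverge
```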


2009 ◽  
Vol 67 (2) ◽  
pp. 304-315 ◽  
Author(s):  
Becky Sjare ◽  
Garry B. Stenson

Abstract Sjare, B., and Stenson, G. B. 2010. Changes in the reproductive parameters of female harp seals (Pagophilus groenlandicus) in the Northwest Atlantic. – ICES Journal of Marine Science, 67: 304–315. Changes in female harp seal (Pagophilus groenlandicus) reproductive parameters from 1980 to 2004, and long-term trends since the early 1950s, are evaluated. Estimates of the total number of seals in the Northwest Atlantic declined from ∼3.0 million in the 1950s to 1.8 million in the early 1970s, then increased steadily to 5.5 million in 1996 and have remained relatively stable since. Pregnancy rates increased from ∼86% in the 1950s to a high of 98% in the mid-1960s, then declined to ∼65–70% by the early 1990s; the rate then varied between 45 and 70% from 2000 to 2004. Concurrently, the mean age at sexual maturity decreased from 5.8 (s.e. = 0.02) years in the mid-1950s to 4.1 (s.e. = 0.02) in the late 1970s, increased to 5.5 (s.e. = 0.03) years by the early 1990s, and peaked at 5.7 (s.e. = 0.01) in 1995. From 2000 to 2004, mean age varied from 4.9 (s.e. = 0.01) to 6.0 (s.e. = 0.01) years. Although the direction of change in each of the parameters was consistent with a density-dependent response, changes in population size explained relatively little of the variability observed, suggesting that other ecological or environmental factors were influential.


2017 ◽  
Vol 34 (9) ◽  
pp. 1947-1961 ◽  
Author(s):  
Marlos Goes ◽  
Elizabeth Babcock ◽  
Francis Bringas ◽  
Peter Ortner ◽  
Gustavo Goni

Abstract. Expendable bathythermograph (XBT) data provide one of the longest available records of upper-ocean temperature. However, temperature and depth biases in XBT data adversely affect estimates of long-term trends of ocean heat content and, to a lesser extent, estimates of volume and heat transport in the ocean. Several corrections have been proposed to overcome historical biases in XBT data, which rely on constantly monitoring these biases. This paper provides an analysis of data collected during three recent hydrographic cruises that utilized different types of probes, and examines methods to reduce temperature and depth biases by improving the thermistor calibration and reducing the mass variability of the XBT probes. The results show that individual thermistor calibration of XBT probes is the most effective calibration to decrease the thermal bias, improving the mean thermal bias to less than 0.02°C and its tolerance from 0.1° to 0.03°C. The temperature variance of probes with screened thermistors is significantly reduced, by approximately 60%, in comparison to standard probes. On the other hand, probes with a tighter weight tolerance did not show statistically significant reductions in the spread of depth biases, possibly because of the small sample size or the sensitivity of the depth accuracy to other causes affecting the analysis.
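The thermal-bias metric here is the mean probe-minus-reference temperature residual, with its spread compared against the tolerance band. A minimal sketch of that check, using synthetic calibration-bath readings (not cruise data; the function name and values are ours):

```python
import statistics

def thermal_bias(probe_temps, reference_temps):
    """Mean and spread (population s.d.) of probe-minus-reference
    temperature residuals, in degrees C."""
    residuals = [p - r for p, r in zip(probe_temps, reference_temps)]
    return statistics.mean(residuals), statistics.pstdev(residuals)

# Synthetic bath readings: an individually calibrated probe should sit
# within the improved bias (<0.02 C) and tolerance (0.03 C) bands.
reference  = [5.00, 10.00, 15.00, 20.00, 25.00]
calibrated = [5.01, 10.02, 15.00, 19.99, 25.01]

bias, spread = thermal_bias(calibrated, reference)
assert abs(bias) < 0.02 and spread < 0.03
```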


Author(s):  
W Viljoen ◽  
CJ Saunders ◽  
GD Hechter ◽  
KD Aginsky ◽  
HB Millson

Objective. To describe the incidence of injuries in a professional rugby team, and to identify any associations between injury rates and training volume. Methods. This retrospective, descriptive study included all injuries diagnosed as grade 1 and above in a South African Super 12 rugby team. Injury incidence and injury rates were calculated and compared with training volume and hours of match play. Results. Thirty-eight male rugby players were injured during the study period. The total number of annual injuries decreased from 50 (2002) to 38 (2004) (χ2=0.84, p=0.36). The number of new injuries showed a similar trend (χ2=2.81, p=0.09), while the number of recurring injuries increased over the 3-year period. There was a tendency for total in-season injury rates to decrease over the 3 years (χ2=2.89, p=0.09). The pre-season injury rate increased significantly over the 3 years (χ2=12.7, p…). Conclusions. One has to be cognisant of the balance between performance improvement and injury risk when designing training programmes for elite rugby players. Although the reduction in training volume was associated with a slight reduction in the number of acute injuries and in-season injury rates over the three seasons, the performance of the team declined from 3rd to 7th (2002 and 2004, respectively). Further studies are required to determine the optimal training necessary to improve rugby performance while reducing injury rates.
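A two-count chi-square comparison of the kind reported above can be sketched as follows. This is the simple goodness-of-fit version under an equal-exposure null; the paper's statistics adjust for training volume and match hours, so its χ2 values differ from this naive check:

```python
def chi_square_two_counts(a: int, b: int) -> float:
    """Chi-square goodness-of-fit for two counts under the null of equal
    expected counts (i.e. equal exposure in the two seasons).  Illustrative
    only: the paper's exposure-adjusted statistics will differ."""
    expected = (a + b) / 2
    return (a - expected) ** 2 / expected + (b - expected) ** 2 / expected

# Annual injury totals from the abstract: 50 (2002) vs 38 (2004).
stat = chi_square_two_counts(50, 38)  # ≈1.64 under the equal-exposure null
```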


2017 ◽  
Author(s):  
Gabriele P. Stiller ◽  
Federico Fierli ◽  
Felix Ploeger ◽  
Chiara Cagnazzo ◽  
Bernd Funke ◽  
...  

Abstract. In response to global warming the Brewer–Dobson circulation in the stratosphere is expected to accelerate and the mean transport time of air along this circulation to decrease. This would imply a negative stratospheric age of air trend, i.e. an air parcel would need less time to travel from the tropopause to any point in the stratosphere. Age of air as inferred from tracer observations, however, shows zero to positive trends in the Northern midlatitude stratosphere and zonally asymmetric patterns. Using satellite observations and model calculations we show that the observed latitudinal and vertical patterns of the decadal changes of age of air in the lower to middle stratosphere during 2002–2012 are predominantly caused by a southward shift of the circulation pattern of about 5 degrees. After correction for this shift, the observations reveal a hemispherically almost symmetric decrease of age of air in the lower to middle stratosphere up to 800 K of up to −0.25 years over the 2002–2012 period with strongest decrease in the Northern tropics. This net change is consistent with long-term trends from model predictions.
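The shift correction described here amounts to differencing the later age-of-air profile against the earlier profile evaluated at the shifted latitude: a southward shift of s degrees means the later pattern at latitude φ matches the earlier pattern at φ + s. A minimal sketch with synthetic profiles (not MIPAS data; function names are ours):

```python
def interp(lats, profile, lat):
    """Piecewise-linear interpolation of a latitude profile at lat."""
    for (x0, x1), (y0, y1) in zip(zip(lats, lats[1:]), zip(profile, profile[1:])):
        if x0 <= lat <= x1:
            return y0 + (y1 - y0) * (lat - x0) / (x1 - x0)
    raise ValueError("latitude outside profile range")

def shift_corrected_change(age_early, age_late, lats, shift=5.0):
    """Decadal age change after removing a southward pattern shift:
    difference the later profile against the earlier profile evaluated
    at lat + shift (latitudes whose shifted point falls off the grid
    are dropped)."""
    return [al - interp(lats, age_early, lat + shift)
            for lat, al in zip(lats, age_late)
            if lats[0] <= lat + shift <= lats[-1]]

# With a purely shifted pattern, the corrected change vanishes:
lats = list(range(-80, 81, 10))
early = [2.0 + 0.01 * l for l in lats]
late = [2.0 + 0.01 * (l + 5) for l in lats]  # same pattern moved 5 deg south
corrected = shift_corrected_change(early, late, lats)  # ~0 everywhere
```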


2020 ◽  
Vol 185 (9-10) ◽  
pp. e1461-e1471 ◽  
Author(s):  
Joseph M Molloy ◽  
Timothy L Pendergrass ◽  
Ian E Lee ◽  
Michelle C Chervak ◽  
Keith G Hauret ◽  
...  

Abstract Introduction Noncombat injuries (“injuries”) greatly impact soldier health and United States (U.S.) Army readiness; they are the leading cause of outpatient medical encounters (more than two million annually) among active component (AC) soldiers. Noncombat musculoskeletal injuries (“MSKIs”) may account for nearly 60% of soldiers’ limited duty days and 65% of soldiers who cannot deploy for medical reasons. Injuries primarily affect readiness through increased limited duty days, decreased deployability rates, and increased medical separation rates. MSKIs are also responsible for exorbitant medical costs to the U.S. government, including service-connected disability compensation. A significant subset of soldiers develops chronic pain or long-term disability after injury; this may increase their risk for chronic disease or secondary health deficits potentially associated with MSKIs. The authors will review trends in U.S. Army MSKI rates, summarize MSKI readiness-related impacts, and highlight the importance of standardizing surveillance approaches, including injury definitions used in injury surveillance. Materials/Methods This review summarizes current reports and U.S. Department of Defense internal policy documents. MSKIs are defined as musculoskeletal disorders resulting from mechanical energy transfer, including traumatic and overuse injuries, which may cause pain and/or limit function. This review focuses on various U.S. Army populations, based on setting, sex, and age; the review excludes combat or battle injuries. Results More than half of all AC soldiers sustained at least one injury (MSKI or non-MSKI) in 2017. Overuse injuries comprise at least 70% of all injuries among AC soldiers. Female soldiers are at greater risk for MSKI than men. Female soldiers’ aerobic and muscular fitness performances are typically lower than men’s performances, which could account for their higher injury rates. 
Older soldiers are at greater injury risk than younger soldiers. Soldiers in noncombat arms units tend to have higher incidences of reported MSKIs, more limited duty days, and higher rates of limited duty days for chronic MSKIs than soldiers in combat arms units. MSKIs account for 65% of medically nondeployable AC soldiers. At any time, 4% of AC soldiers cannot deploy because of MSKIs. Once deployed, nonbattle injuries accounted for approximately 30% of all medical evacuations, and were the largest category of soldier evacuations from both recent major combat theaters (Iraq and Afghanistan). More than 85% of service members medically evacuated for MSKIs failed to return to the theater. MSKIs factored into (1) nearly 70% of medical disability discharges across the Army from 2011 through 2016 and (2) more than 90% of disability discharges within enlisted soldiers’ first year of service from 2010 to 2015. MSKI-related, service-connected (SC) disabilities account for 44% of all SC disabilities (more than any other body system) among compensated U.S. Global War on Terrorism veterans. Conclusions MSKIs significantly impact soldier health and U.S. Army readiness. MSKIs also figure prominently in medical disability discharges and long-term, service-connected disability costs. MSKI patterns and trends vary between trainees and soldiers in operational units and among military occupations and types of operational units. Coordinated injury surveillance efforts are needed to provide standardized metrics and accurately measure temporal changes in injury rates.


2017 ◽  
Vol 17 (18) ◽  
pp. 11177-11192 ◽  
Author(s):  
Gabriele P. Stiller ◽  
Federico Fierli ◽  
Felix Ploeger ◽  
Chiara Cagnazzo ◽  
Bernd Funke ◽  
...  

Abstract. In response to global warming, the Brewer–Dobson circulation in the stratosphere is expected to accelerate and the mean transport time of air along this circulation to decrease. This would imply a negative stratospheric age of air trend, i.e. an air parcel would need less time to travel from the tropopause to any point in the stratosphere. Age of air as inferred from tracer observations, however, shows zero to positive trends in the northern mid-latitude stratosphere and zonally asymmetric patterns. Using satellite observations and model calculations we show that the observed latitudinal and vertical patterns of the decadal changes of age of air in the lower to middle stratosphere during the period 2002–2012 are predominantly caused by a southward shift of the circulation pattern by about 5°. After correction for this shift, the observations reveal a hemispherically almost symmetric decrease of age of air in the lower to middle stratosphere up to 800 K of up to −0.25 years over the 2002–2012 period with strongest decrease in the northern tropics. This net change is consistent with long-term trends from model predictions.

