Ambient temperature and subsequent COVID-19 mortality in the OECD countries and individual United States

2021, Vol 11 (1)
Author(s): Costas A. Christophi, Mercedes Sotos-Prieto, Fan-Yun Lan, Mario Delgado-Velandia, Vasilis Efthymiou, et al.

Abstract: Epidemiological studies have yielded conflicting results regarding climate and incident SARS-CoV-2 infection, and the seasonality of infection rates is debated. Moreover, few studies have focused on COVID-19 deaths. We studied the association of average ambient temperature with subsequent COVID-19 mortality in the OECD countries and the individual United States (US), while accounting for other important meteorological and non-meteorological covariates. The exposure of interest was the average temperature, together with other weather conditions, measured from 25 days before to 25 days after the first reported COVID-19 death in each OECD country and US state. The outcome of interest was cumulative COVID-19 mortality, assessed for each region at 25, 30, 35, and 40 days after the first reported death. Analyses were performed with negative binomial regression and adjusted for other weather conditions, particulate matter, sociodemographic factors, smoking, obesity, ICU beds, and social distancing. A 1 °C increase in ambient temperature was associated with 6% lower COVID-19 mortality at 30 days following the first reported death (multivariate-adjusted mortality rate ratio 0.94, 95% CI 0.90–0.99, p = 0.016). The results were robust for COVID-19 mortality at 25, 35, and 40 days after the first death, as well as in other sensitivity analyses. The results provide consistent evidence, across various models, of an inverse association between higher average temperatures and subsequent COVID-19 mortality rates after accounting for other meteorological variables and predictors of SARS-CoV-2 infection or death. This suggests potentially decreased viral transmission in warmer regions and during the summer season.
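
A minimal sketch of this kind of model in Python with statsmodels, fit on synthetic region-level data; all variable names (mean_temp_c, humidity, pm25, deaths_30d) and values are hypothetical, not the authors' dataset or code:

# Hypothetical sketch: negative binomial regression of cumulative
# COVID-19 deaths on mean ambient temperature, with a log-population
# offset so exp(beta) is interpretable as a mortality rate ratio.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 80  # illustrative number of regions (OECD countries + US states)
df = pd.DataFrame({
    "mean_temp_c": rng.normal(12, 8, n),       # average temperature, °C
    "humidity": rng.uniform(30, 90, n),        # illustrative covariate
    "pm25": rng.uniform(5, 35, n),             # illustrative covariate
    "population": rng.integers(500_000, 50_000_000, n),
})
lam = np.exp(-9 + np.log(df["population"]) - 0.06 * df["mean_temp_c"])
df["deaths_30d"] = rng.poisson(lam)            # synthetic outcome

X = sm.add_constant(df[["mean_temp_c", "humidity", "pm25"]])
res = sm.GLM(df["deaths_30d"], X,
             family=sm.families.NegativeBinomial(alpha=1.0),
             offset=np.log(df["population"])).fit()

# exp(beta) for mean_temp_c is the mortality rate ratio per 1 °C
mrr = np.exp(res.params["mean_temp_c"])
ci = np.exp(res.conf_int().loc["mean_temp_c"])
print(f"MRR per 1 °C: {mrr:.3f} (95% CI {ci[0]:.3f}-{ci[1]:.3f})")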

Author(s): Nadir Yehya, Atheendar Venkataramani, Michael O. Harhay

Abstract: Background: Social distancing is encouraged to mitigate viral spread during outbreaks. However, the association between distancing and patient-centered outcomes in Covid-19 has not been demonstrated. In the United States, social distancing orders are implemented at the state level with variable timing of onset. Emergency declarations and school closures were two early statewide interventions. Methods: To determine whether later distancing interventions were associated with higher mortality, we performed a state-level analysis of 55,146 Covid-19 non-survivors. We tested the association between the timing of emergency declarations and school closures and 28-day mortality using multivariable negative binomial regression. Day 1 for each state was set to the date it recorded ≥ 10 deaths. We performed sensitivity analyses to test model assumptions. Results: At the time of analysis, 37 of 50 states had ≥ 10 deaths and 28 follow-up days. Both later emergency declaration (adjusted mortality rate ratio [aMRR] 1.05 per day of delay, 95% CI 1.00 to 1.09, p=0.040) and later school closure (aMRR 1.05, 95% CI 1.01 to 1.09, p=0.008) were associated with more deaths. When assessing all 50 states and setting day 1 to the day a state recorded its first death, delays in declaring an emergency (aMRR 1.05, 95% CI 1.01 to 1.09, p=0.020) or closing schools (aMRR 1.06, 95% CI 1.03 to 1.09, p<0.001) were associated with more deaths. Results were unchanged when excluding New York and New Jersey. Conclusions: Later statewide emergency declarations and school closures were associated with higher Covid-19 mortality. Each day of delay increased mortality risk by 5% to 6%.
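
As a worked example of how a per-day rate ratio compounds (illustrative arithmetic only, not a calculation from the paper), an aMRR of 1.05 per day of delay multiplies expected mortality by 1.05 raised to the number of delay days:

# Illustrative compounding of a per-day adjusted mortality rate ratio.
amrr_per_day = 1.05
for delay_days in (1, 3, 7, 14):
    print(delay_days, round(amrr_per_day ** delay_days, 2))
# 1 -> 1.05, 3 -> 1.16, 7 -> 1.41, 14 -> 1.98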


2021, pp. 140349482110027
Author(s): Tea Lallukka, Rahman Shiri, Kristina Alexanderson, Jenni Ervasti, Ellenor Mittendorfer-Rutz, et al.

Aims: The aim of this study was to examine sickness absence and disability pension (SA/DP) during the working lifespan among individuals diagnosed with carpal tunnel syndrome (CTS) and their matched references, accounting for sociodemographic factors. Methods: We used a register cohort of 78,040 individuals aged 19–60 years when diagnosed with CTS in secondary health care (hospitals and outpatient specialist health care) and their 390,199 matched references from the general population in 2001–2010. Sociodemographic factors and SA/DP net days during a three-year follow-up were included. Negative binomial regression was used. Results: For those not on DP at inclusion, the average number of SA/DP days per person-year was 58 days (95% confidence interval (CI) 56–60 days) among individuals with CTS and 20 days (95% CI 19–21 days) among the matched references. In both groups, these numbers increased with age and were higher among women than among men. The rate of SA/DP days was three times as high among people with CTS as among the matched references (adjusted rate ratio (RR)=3.00, 95% CI 2.91–3.10). Moreover, compared to the matched references, the RR for SA/DP was higher among men with CTS (RR=3.86, 95% CI 3.61–4.13) than among women with CTS (RR=2.69, 95% CI 2.59–2.78). The association between CTS and the number of SA/DP days was smaller among older age groups. Sociodemographic factors were similarly associated with SA/DP among people with and without CTS. Conclusions: Numbers of SA/DP days were higher among people with CTS than among their matched references in all age groups, and particularly among individuals early in their work careers, highlighting the public-health relevance of the findings.
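
A minimal sketch of how person-year follow-up can enter such a model: statsmodels' exposure= argument supplies log(person-years) internally, so exp(beta) is a rate ratio of SA/DP days per person-year. All variables (cts, woman, person_years, sa_dp_days) are hypothetical and the data synthetic:

# Hypothetical sketch: SA/DP net days per person-year via negative
# binomial regression with person-years as exposure.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "cts": rng.integers(0, 2, n),              # 1 = CTS case, 0 = matched reference
    "age": rng.integers(19, 61, n),
    "woman": rng.integers(0, 2, n),
    "person_years": rng.uniform(1.0, 3.0, n),  # follow-up time
})
rate = np.exp(3.0 + 1.1 * df["cts"] + 0.01 * df["age"])  # days per person-year
df["sa_dp_days"] = rng.poisson(rate * df["person_years"])

X = sm.add_constant(df[["cts", "age", "woman"]])
res = sm.GLM(df["sa_dp_days"], X,
             family=sm.families.NegativeBinomial(),
             exposure=df["person_years"]).fit()
print(np.exp(res.params["cts"]))   # rate ratio, CTS vs. matched references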


2016, Vol 10 (5-6), pp. 172
Author(s): Blayne Welk, Jennifer Winick-Ng, Andrew McClure, Chris Vinden, Sumit Dave, et al.

Introduction: The ability of academic (teaching) hospitals to offer the same level of efficiency as non-teaching hospitals in a publicly funded healthcare system is unknown. Our objective was to compare the operative duration of general urology procedures between teaching and non-teaching hospitals. Methods: We used administrative data from the province of Ontario to conduct a retrospective cohort study of all adults who underwent a specified elective urology procedure (2002–2013). The primary outcome was the duration of the surgical procedure. The primary exposure was hospital type (academic or non-teaching). Negative binomial regression was used to adjust relative time estimates for age, comorbidity, obesity, anesthetic, and surgeon and hospital case volume. Results: 114 225 procedures were included (circumcision n=12 280; hydrocelectomy n=7221; open radical prostatectomy n=22 951; transurethral prostatectomy n=56 066; or mid-urethral sling n=15 707). These procedures were performed in an academic hospital in 14.8%, 13.3%, 28.6%, 17.1%, and 21.3% of cases, respectively. The mean operative duration across all procedures was higher in academic centres; the additional operative time ranged from 8.3 minutes (circumcision) to 29.2 minutes (radical prostatectomy). In adjusted analysis, patients treated in academic hospitals were still found to have procedures that were significantly longer (by 10‒21%). These results were similar in sensitivity analyses that accounted for the potential effect of more complex patients being referred to tertiary academic centres. Conclusions: Five common general urology operations take significantly longer to perform in academic hospitals. This may be due to the combined effect of teaching students and residents, or to inherent systematic inefficiencies within large academic hospitals.
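
A minimal sketch of how an adjusted relative-time estimate like this can be obtained: with a log-link negative binomial model of duration in minutes, (exp(beta) - 1) * 100 is the adjusted percent difference for academic hospitals. Variable names (academic, surgeon_volume, duration_min) and the data are invented for illustration:

# Hypothetical sketch: relative operative time, academic vs.
# non-teaching hospitals, from a negative binomial model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "academic": rng.integers(0, 2, n),
    "age": rng.integers(18, 90, n),
    "surgeon_volume": rng.integers(10, 300, n),
})
mu = np.exp(3.9 + 0.15 * df["academic"] - 0.001 * df["surgeon_volume"])
df["duration_min"] = rng.poisson(mu)           # synthetic durations

res = smf.glm("duration_min ~ academic + age + surgeon_volume",
              data=df, family=sm.families.NegativeBinomial()).fit()
pct_longer = (np.exp(res.params["academic"]) - 1) * 100
print(f"academic hospitals: {pct_longer:.1f}% longer, adjusted")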


2017, Vol 54 (5), pp. 639-679
Author(s): Eric R. Louderback, Olena Antonaccio

Objectives: To investigate the relationship between thoughtfully reflective decision-making (TRDM) and both computer-focused cyber deviance involvement and computer-focused cybercrime victimization. Method: Survey data collected from samples of 1,039 employees and 418 students at a large private university were analyzed using ordinary least squares and negative binomial regression to test the effects of TRDM on computer-focused cyber deviance involvement and victimization. Results: TRDM reduces computer-focused cyber deviance involvement and computer-focused cybercrime victimization across measures and samples. The sensitivity analyses also indicated that TRDM is a more robust predictor of cyber deviance involvement than of victimization. The moderation analyses showed that, whereas the protective effects of TRDM are invariant across genders, they are less salient among older employees for the scenario-based measure of cybercrime victimization. Conclusions: Individual-level cognitive decision-making processes are important in predicting computer-focused cyber deviance involvement and victimization. These results can inform the development of targeted institutional and criminal justice policies aimed at reducing computer-focused cybercrime.
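
A minimal sketch of a moderation test of the kind described: an interaction term in a negative binomial model, where a significant trdm:age coefficient would indicate that the protective effect of TRDM varies with age. All variables (trdm, victim_count) and the data are hypothetical:

# Hypothetical sketch: testing moderation via an interaction term in
# a negative binomial model of victimization counts.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({
    "trdm": rng.normal(0, 1, n),   # standardized TRDM scale score
    "age": rng.integers(22, 65, n),
    "male": rng.integers(0, 2, n),
})
mu = np.exp(0.5 - 0.30 * df["trdm"] + 0.005 * df["trdm"] * df["age"])
df["victim_count"] = rng.poisson(mu)          # synthetic outcome

res = smf.glm("victim_count ~ trdm * age + male", data=df,
              family=sm.families.NegativeBinomial()).fit()
print(res.summary().tables[1])   # inspect the trdm:age interaction row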


2021, Vol 21 (1)
Author(s): Sarah Simmons, Grady Wier, Antonio Pedraza, Mark Stibich

Abstract: Background: The role of the environment in hospital-acquired infections is well established. We examined the impact of an environmental hygiene intervention using a pulsed xenon ultraviolet (PX-UV) disinfection system on the rate of hospital-onset Clostridioides difficile infection (HO-CDI) in 48 hospitals over a 5-year period. Methods: Utilization data were collected directly from the automated PX-UV system and uploaded in real time to a database. HO-CDI data were provided by each facility. Data were analyzed at the unit level to determine compliance with disinfection protocols. The final data set comprised 5 years of data aggregated to the facility level, covering 48 hospitals from January 2015 to December 2019. Negative binomial regression with an offset for patient days was used to model infection counts as rates and to assess HO-CDI rates vs. intervention compliance rate, total successful disinfection cycles, and total rooms disinfected. The K-Nearest Neighbor (KNN) machine learning algorithm was used to compare intervention compliance and total intervention cycles to the presence of infection. Results: All regression models show a statistically significant inverse association between the intervention and HO-CDI rates. The KNN model predicts whether an infection will be present with greater than 98% accuracy when considering both intervention compliance and total intervention cycles. Conclusions: The findings of this study indicate a strong inverse relationship between utilization of the pulsed xenon intervention and HO-CDI rates.
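
A minimal sketch of a KNN classifier of the kind described, using scikit-learn on two unit-level features; the features (compliance, total_cycles), the synthetic infection rule, and all values are illustrative, not the study's data:

# Hypothetical sketch: K-Nearest Neighbors classifying whether any
# HO-CDI infection is present from compliance and cycle counts.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
n = 600
compliance = rng.uniform(0.2, 1.0, n)      # share of protocol-required cycles run
total_cycles = rng.integers(100, 5000, n)  # disinfection cycles performed
X = np.column_stack([compliance, total_cycles])
# Synthetic rule: higher compliance/cycles -> lower infection probability
p_infect = 1 / (1 + np.exp(8 * (compliance - 0.6)
                           + 0.001 * (total_cycles - 2000)))
y = rng.binomial(1, p_infect)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
knn.fit(X_tr, y_tr)
print("held-out accuracy:", knn.score(X_te, y_te))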


2019, Vol 46 (9), pp. 1134-1140
Author(s): Jasvinder A. Singh, Shaohua Yu, Lang Chen, John D. Cleveland

Objective: To project future total hip and knee joint arthroplasty (THA, TKA) use in the United States to 2040. Methods: We used the 2000–2014 US National Inpatient Sample (NIS) combined with Census Bureau data to develop projections for primary THA and TKA from 2020 to 2040, using polynomial regression to account for nonlinearity and interactions between the variables, assuming the underlying distribution of the number of THA/TKA to be Poisson. We performed sensitivity analyses using negative binomial regression to account for overdispersion. Results: Predicted total annual counts (95% prediction intervals) for THA in the United States by 2020, 2025, 2030, and 2040 are (in thousands): 498 (475, 523), 652 (610, 696), 850 (781, 925), and 1429 (1265, 1615), respectively. For primary TKA, predicted total annual counts for 2020, 2025, 2030, and 2040 are (in thousands): 1065 (937, 1211), 1272 (1200, 1710), 1921 (1530, 2410), and 3416 (2459, 4745), respectively. Compared to the available 2014 NIS numbers, the percent increases in projected total annual US use in 2020, 2025, 2030, and 2040 are as follows: primary THA, 34%, 75%, 129%, and 284%; primary TKA, 56%, 110%, 182%, and 401%, respectively. Primary THA and TKA use is projected to increase for both females and males, in all age groups. Conclusion: Significant increases in the use of THA and TKA are expected in the United States in the future, if the current trend continues. The increased use is evident across age groups in both females and males. A policy change may be needed to meet the increased demand.
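
A minimal sketch of a polynomial-in-year Poisson projection of this general shape; the counts are synthetic rather than NIS data, and get_prediction() here yields confidence intervals for the mean rather than the wider prediction intervals the paper reports:

# Hypothetical sketch: Poisson GLM with polynomial year terms fit to
# annual procedure counts, then projected forward to 2020-2040.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

years = np.arange(2000, 2015)
counts = (250_000 + 12_000 * (years - 2000)
          + 400 * (years - 2000) ** 2).astype(int)  # synthetic THA counts
df = pd.DataFrame({"t": years - 2000, "count": counts})

res = smf.glm("count ~ t + I(t**2)", data=df,
              family=sm.families.Poisson()).fit()

future = pd.DataFrame({"t": np.array([2020, 2030, 2040]) - 2000})
pred = res.get_prediction(future).summary_frame()
print(pred[["mean", "mean_ci_lower", "mean_ci_upper"]])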


2018, Vol 10 (8), pp. 2720
Author(s): Yuanyuan Zhang, Yuming Zhang

To improve the sustainability and efficiency of transport systems, communities and government agencies throughout the United States (US) are looking for ways to reduce vehicle ownership and single-occupant trips by encouraging people to shift from driving to more sustainable transport modes, such as ridesharing. Ridesharing is a cost-effective and sustainable alternative transportation mode that benefits the environment, the economy, and society. Despite the potential effect of vehicle ownership on the adoption of ridesharing services, individuals' ridesharing behaviors and the interdependencies between vehicle ownership and ridesharing usage are not well understood. This study aims to fill that gap by examining the associations between household vehicle ownership and the frequency and probability of ridesharing usage, and by estimating the effects of household vehicle ownership on individuals' ridesharing usage in the US. We estimated zero-inflated negative binomial regression models using data from the 2017 National Household Travel Survey. The results show that, in general, a one-vehicle reduction in household ownership was significantly associated with a 7.9% increase in the frequency of ridesharing usage and a 23.0% increase in the probability of ridesharing usage. The effects of household vehicle ownership on the frequency of ridesharing usage are greater for those who live in areas with higher population density. Young people, men, those who are unable to drive, individuals with high household income, and those who live in areas with rail service or higher population density tend to use ridesharing more frequently and are more likely to use it. These findings can guide planners and practitioners in understanding individuals' ridesharing behaviors and in identifying policies and interventions to increase ridesharing usage and decrease household vehicle ownership, depending on contextual features and demographic variables. Comprehensive strategies that limit vehicle ownership and address the increasing demand for ridesharing have the potential to improve the sustainability of transportation systems.
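
A minimal sketch of a zero-inflated negative binomial specification of this general form, using statsmodels' ZeroInflatedNegativeBinomialP; the variables (vehicles, hh_income, trips) and the data are synthetic, not the 2017 NHTS:

# Hypothetical sketch: ZINB model of monthly ridesharing trips, with
# household vehicles in both the count part and the structural-zero
# (never-user) logit part.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(5)
n = 3000
df = pd.DataFrame({
    "vehicles": rng.integers(0, 4, n),
    "hh_income": rng.normal(0, 1, n),   # standardized, illustrative
})
p_never = 1 / (1 + np.exp(-(-0.5 + 0.5 * df["vehicles"])))
mu = np.exp(1.2 - 0.2 * df["vehicles"] + 0.3 * df["hh_income"])
df["trips"] = np.where(rng.uniform(size=n) < p_never, 0, rng.poisson(mu))

X = sm.add_constant(df[["vehicles", "hh_income"]])
X_infl = sm.add_constant(df[["vehicles"]])
res = ZeroInflatedNegativeBinomialP(df["trips"], X, exog_infl=X_infl,
                                    inflation="logit").fit(maxiter=200)
# Exponentiated coefficients: 'inflate_' rows belong to the zero part;
# the remaining rows are count-part rate ratios (alpha = dispersion).
print(np.exp(res.params))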


2016, Vol 56 (10), pp. 1683
Author(s): O. Blumetto, A. Ruggia, A. Dalmau, F. Estellés, A. Villagrá

The objective of the present study was to characterise the behaviour of Holstein steers in three different production systems. Forty-eight castrated Holstein males were randomly divided into three groups and allocated to one of three outdoor treatments: (T1) animals confined in a yard with an area of 210 m²; (T2) animals confined in a similar-sized yard but with 6 h of access to a pasture plot; (T3) animals maintained throughout the experiment on a pasture plot. Behaviour was recorded by scan sampling, 12 h a day (from 0700 hours to 1900 hours), 3 days per week, for 4 weeks evenly distributed from Week 7 to Week 16 of the experiment. To assess behavioural patterns, negative binomial regression, correspondence analysis, and logistic regressions were performed. Grazing was the predominant behaviour in Groups T2 and T3, while 'eating hay' was the most frequent behaviour in Group T1. For all treatments, lying was the second-most frequent behaviour. Although animals in T2 had access to pasture for only half as long as those in T3, there was no difference between the two treatments in time spent grazing. Correspondence analysis of behaviour as a function of weather conditions showed that several behaviours were associated with particular weather conditions; for example, 'standing' and 'ruminating while standing' were closer to light rainy weather, while 'lying' and 'ruminating while lying' were more related to sunny weather. 'Lying' tended to increase over the course of the day in all treatments, while 'eating hay' increased over the day in Group T1 but decreased in Groups T2 and T3. It is concluded that the management conditions associated with the systems studied produced different behavioural patterns in the steers. Grazing is an important behaviour for the animals, and whether the production system permitted it permanently or only on a restricted basis determined how the patterns of other behaviours changed to give priority to pasture intake.
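
A minimal sketch of correspondence analysis on a behaviour-by-weather contingency table, implemented directly via SVD of the standardized residuals; the counts and category labels are invented for illustration, not the study's observations:

# Hypothetical sketch: correspondence analysis relating behaviours to
# weather conditions, computed from first principles with numpy.
import numpy as np

behaviours = ["grazing", "eating hay", "lying", "standing", "ruminating"]
weather = ["sunny", "cloudy", "light rain"]
N = np.array([[320, 210, 90],
              [100, 120, 80],
              [260, 150, 60],
              [ 80, 100, 90],
              [150, 110, 70]], dtype=float)   # illustrative scan counts

P = N / N.sum()                    # correspondence matrix
r = P.sum(axis=1)                  # row masses (behaviours)
c = P.sum(axis=0)                  # column masses (weather)
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

# Principal coordinates; nearby row/column points indicate association
row_coords = (U * sv) / np.sqrt(r)[:, None]
col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]
for b, xy in zip(behaviours, row_coords[:, :2]):
    print(f"{b:12s} dim1={xy[0]:+.2f} dim2={xy[1]:+.2f}")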


2015, Vol 144 (8), pp. 1792-1802
Author(s): J. E. Painter, J. W. Gargano, J. S. Yoder, S. A. Collier, M. C. Hlavsa

Summary: Cryptosporidium is the leading aetiology of waterborne disease outbreaks in the United States. This report briefly describes the temporal and geographical distribution of US cryptosporidiosis cases and presents analyses of cryptosporidiosis case data reported in the United States for 1995–2012. The Cochran–Armitage test was used to assess changes in the proportions of cases by case status (confirmed vs. non-confirmed), sex, race, and ethnicity over the study period. Negative binomial regression models were used to estimate rate ratios (RR) and 95% confidence intervals (CI) for comparing rates across three time periods (1995–2004, 2005–2008, 2009–2012). The proportion of confirmed cases significantly decreased (P < 0·0001), and a crossover from male to female predominance in case-patients occurred (P < 0·0001). Overall, compared to 1995–2004, rates were higher in 2005–2008 (RR 2·92, 95% CI 2·08–4·09) and 2009–2012 (RR 2·66, 95% CI 1·90–3·73). However, rate changes from 2005–2008 to 2009–2012 varied by age group (P for interaction < 0·0001): 0–14 years (RR 0·55, 95% CI 0·42–0·71), 15–44 years (RR 0·99, 95% CI 0·82–1·19), 45–64 years (RR 1·47, 95% CI 1·21–1·79), and ⩾65 years (RR 2·18, 95% CI 1·46–3·25). The evolving epidemiology of cryptosporidiosis necessitates further identification of risk factors in population subgroups. Adding systematic molecular typing of Cryptosporidium specimens to US national cryptosporidiosis surveillance would help further identify risk factors and markedly expand understanding of cryptosporidiosis epidemiology in the United States.
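
A minimal sketch of the Cochran–Armitage trend test, implemented from its standard formula; the period scores and case counts are illustrative, not the surveillance data above:

# Hypothetical sketch: Cochran-Armitage test for a trend in a
# proportion (e.g., share of confirmed cases) over ordered periods.
import numpy as np
from scipy.stats import norm

scores = np.array([1.0, 2.0, 3.0])     # ordered period scores
cases = np.array([820, 700, 610])      # confirmed cases per period
totals = np.array([1000, 1000, 1000])  # all reported cases per period

p_bar = cases.sum() / totals.sum()
T = np.sum(scores * (cases - totals * p_bar))
var_T = p_bar * (1 - p_bar) * (
    np.sum(totals * scores**2) - np.sum(totals * scores)**2 / totals.sum())
z = T / np.sqrt(var_T)
p_value = 2 * norm.sf(abs(z))          # two-sided p-value
print(f"z = {z:.2f}, two-sided p = {p_value:.2g}")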


Author(s): Yanlei Wang, Shuang Xu, Xiang Liu

Train accidents damage infrastructure and rolling stock, disrupt operations, and may result in casualties and environmental damage. While the majority of previous studies have focused on the safety risks associated with train derailments or highway-rail grade crossing collisions, much less work has been done to evaluate train collision risk. This paper develops a statistical risk analysis methodology for freight-train collisions in the United States between 2000 and 2014. Negative binomial regression models are developed to estimate the frequency of freight-train collisions as a function of year and traffic volume, by accident cause. Train collision severity, measured by the average number of railcars derailed, varied with accident cause. Train collision risk, defined as the product of collision frequency and severity, is predicted for 2015 to 2017 based on the 2000 to 2014 safety trend. The statistical procedures developed in this paper can be adapted to various other types of consequences, such as damage costs or casualties. Ultimately, this paper and its subsequent studies aim to provide the railroad industry with data-analytic tools for discovering useful information in historical accident data so as to make risk-informed safety decisions.
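
A minimal sketch of the frequency-times-severity risk framing: a negative binomial frequency model on year, with log(traffic volume) as an offset, whose predictions are multiplied by an average severity. All numbers are synthetic, not FRA accident data:

# Hypothetical sketch: freight-train collision risk as predicted
# frequency times average severity (railcars derailed per collision).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

years = np.arange(2000, 2015)
traffic = np.linspace(1.5e9, 1.8e9, years.size)   # train-miles, illustrative
collisions = np.round(180 * np.exp(-0.04 * (years - 2000))).astype(int)
df = pd.DataFrame({"t": years - 2000, "collisions": collisions,
                   "log_traffic": np.log(traffic)})

res = smf.glm("collisions ~ t", data=df,
              family=sm.families.NegativeBinomial(),
              offset=df["log_traffic"]).fit()

future = pd.DataFrame({"t": np.array([2015, 2016, 2017]) - 2000,
                       "log_traffic": np.log(1.8e9)})
freq = res.predict(future, offset=future["log_traffic"])
avg_cars_derailed = 4.2                           # illustrative severity
print("predicted risk (railcars derailed/yr):", freq * avg_cars_derailed)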

