Positive drug test trends in fatally-injured drivers in the United States from 2007 to 2017

Author(s):  
Sunday Azagba ◽  
Keely Latham ◽  
Lingpeng Shan ◽  
Fares Qeadan

Abstract Background The last two decades have seen tremendous changes in the U.S. environment surrounding drugs. Driving under the influence of drugs is a growing public health hazard. The present study examined trends in drug involvement in fatally-injured drivers in the U.S. Methods Data were drawn from the 2007–2017 Fatality Analysis Reporting System. Cochran–Armitage tests were performed to assess the statistical significance of changes in the yearly prevalence of positive drug tests in fatally-injured drivers over time. In addition, analyses were stratified by sex, race, and age. Results The yearly prevalence of positive drug tests in fatally-injured drivers increased significantly from 20.7% in 2007 to 30.7% in 2017, with results showing a higher prevalence among males, those aged 21–44, and Whites. The gap between Blacks and Whites narrowed in 2017. There was a decline in the yearly prevalence in all age groups between 2016 and 2017, although the decrease in the 21–44 age group was much smaller than other age groups. Among drivers who tested positive for drugs, 34.6% had a blood alcohol concentration (BAC) above the threshold of per se evidence for impaired driving, and 63% had a BAC below the threshold. Conclusions Our results indicate that the overall yearly prevalence of fatally-injured drivers who tested positive for drugs increased significantly from 2007 to 2017, with similar results found for subgroups. Findings further highlight that drugged driving remains a public health priority, and more action is needed to stem this disturbing trend.
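
The trend analysis above relies on the Cochran–Armitage test. As a rough illustration of how such a test can be computed, the sketch below implements the standard test statistic in Python; the yearly counts are made-up placeholders, not the FARS data.

```python
# A minimal sketch, assuming made-up yearly counts, of the Cochran-Armitage trend test
# used in the study to assess whether prevalence changes monotonically across years.
import numpy as np
from scipy.stats import norm

def cochran_armitage_trend(positives, totals, scores=None):
    """Two-sided Cochran-Armitage test for a linear trend in proportions across ordered groups."""
    positives = np.asarray(positives, dtype=float)
    totals = np.asarray(totals, dtype=float)
    scores = np.arange(len(totals)) if scores is None else np.asarray(scores, dtype=float)
    p_bar = positives.sum() / totals.sum()                  # pooled proportion
    t_stat = np.sum(scores * (positives - totals * p_bar))  # score-weighted deviation
    var = p_bar * (1 - p_bar) * (np.sum(totals * scores**2)
                                 - np.sum(totals * scores)**2 / totals.sum())
    z = t_stat / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))                           # z statistic and two-sided p-value

# Illustrative placeholder counts for 2007-2017 (NOT the FARS data)
years = np.arange(2007, 2018)
drug_positive = np.array([2070, 2180, 2290, 2400, 2510, 2620, 2730, 2840, 2950, 3020, 3070])
drivers_tested = np.full(len(years), 10_000)
z, p = cochran_armitage_trend(drug_positive, drivers_tested, scores=years - years[0])
print(f"Cochran-Armitage z = {z:.2f}, two-sided p = {p:.2e}")
```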

1994 ◽  
Vol 34 (3) ◽  
pp. 265-270 ◽  
Author(s):  
A W Jones

This article describes a drink-driving scenario where a woman was apprehended for driving under the influence (DUI) with a blood alcohol concentration (BAC) of 256 mg/dl. The correctness of this result was vigorously challenged by a medical expert witness for the defence, who was actually a specialist in alcohol diseases. Despite reanalysis to confirm the BAC, as well as a DNA profile to prove the identity of the blood specimen, the woman was acquitted of the charge of drunk driving by the lower court. However, she was subsequently found guilty in the High Court of Appeals in a unanimous decision and sentenced to four weeks' imprisonment. This case report illustrates some of the problems surrounding the use of expert medical evidence by the defence to challenge the validity of the prosecution evidence based solely on a suspect's BAC. In situations such as these, an expert witness should be called by the prosecution to clarify and, if necessary, rebut medical and/or scientific opinions that might mislead the court and influence the outcome of the trial.


Author(s):  
Janice Arceneaux ◽  
James Dickens ◽  
Wanza Bacon

Established in 1889, the United States Public Health Service Commissioned Corps (Corps) is one of the seven uniformed services and is part of the U.S. Department of Health and Human Services. The Corps is committed to protecting, promoting, and advancing the health and safety of the nation, with a history that dates back over two centuries to its beginnings as the U.S. Marine Hospital Service. Today, the Corps responds and serves in many areas impacted by natural disasters, disease outbreaks, terrorist attacks, and public health emergencies. Corps officers have deployed to provide assistance during national public health emergencies (e.g., hurricanes, bombings, flooding, and wildfires); to combat the Ebola epidemic in West Africa; and to provide humanitarian assistance in Latin America and the Caribbean. Corps deployments impact not only service members but also their families. This article offers a brief overview of the Corps and discusses how deployments impact families. Family resiliency and future implications for research and practice will also be examined.


Circulation ◽  
2015 ◽  
Vol 132 (suppl_3) ◽  
Author(s):  
James L Crooks ◽  
Wayne Cascio ◽  
Madelyn Percy ◽  
Jeanette Reyes ◽  
Lucas Neas ◽  
...  

Introduction: Extreme weather events such as dust storms are predicted to become more frequent as the global climate warms through the 21st century. Studies of Asian, Saharan, Arabian, and Australian dust storms have found associations with cardiovascular and total non-accidental mortality and hospitalizations for stroke. However, the only population-level epidemiological work on dust storms in the United States focused on a single small metropolitan area (Spokane, WA), and it is uncertain whether its null results are representative of the country as a whole. Hypothesis: Dust storms in the United States are associated with daily cardiovascular mortality. Methods: Dust storm incidence data (N=141), including date and approximate location, as well as meteorological station observations, were taken from the U.S. National Weather Service. County-level mortality data for the years 1993-2005 were acquired from the National Center for Health Statistics. Ambient particulate matter monitor concentrations were obtained from the U.S. Environmental Protection Agency. Inference was performed using conditional logistic regression models under a case-crossover design while accounting for the nonlinear effect of temperature. Results: We found a 9.5% increase in cardiovascular mortality at a two-day lag (95% CI: [0.31%, 19.5%], p = 0.042). The results were robust to adjustment for heat waves and ambient particulate matter concentrations. Analysis of storms occurring only on days with <0.1 inches of precipitation strengthened these results and also yielded a mean daily increase of 4.0% across lags 0-5 (95% CI: [0.07%, 20.8%], p = 0.046). In Arizona, the U.S. state with the largest number of storms, we observed a 13.0% increase at a three-day lag (95% CI: [0.40%, 27.1%], p = 0.043). Conclusions: Dust storms in the U.S. are associated with increases in lagged cardiovascular mortality. This has implications for the development of public health advisories and suggests that further public health interventions may be needed. Disclaimer: This work does not represent official U.S. Environmental Protection Agency policy.
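
For readers unfamiliar with the design, the sketch below shows one way a case-crossover analysis with conditional logistic regression could be set up in Python using statsmodels. The data, stratum structure, and covariate names (dust_lag2, temp) are simulated assumptions for illustration, not the authors' actual dataset or model specification.

```python
# A minimal sketch of a case-crossover analysis with conditional logistic regression,
# in the spirit of the design described above. All data are simulated, and the variable
# names (dust_lag2, temp) are assumptions for illustration, not the authors' covariates.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(0)
rows = []
for case_id in range(500):                      # each death defines its own stratum
    for day in range(4):                        # 1 case day + 3 time-matched referent days
        rows.append({
            "case_id": case_id,
            "event": int(day == 0),             # the death occurred on the case day
            "dust_lag2": rng.binomial(1, 0.05), # dust storm two days earlier (simulated)
            "temp": rng.normal(25.0, 5.0),      # daily mean temperature (simulated)
        })
df = pd.DataFrame(rows)

# Nonlinear temperature control could use splines; a centered quadratic is used for brevity.
df["temp_c"] = df["temp"] - df["temp"].mean()
df["temp_c2"] = df["temp_c"] ** 2

model = ConditionalLogit(df["event"], df[["dust_lag2", "temp_c", "temp_c2"]],
                         groups=df["case_id"])
res = model.fit()
print(res.summary())
print("Odds ratios:", np.exp(res.params))       # exp(beta) approximates the relative risk
```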


2020 ◽  
Vol 4 (Supplement_1) ◽  
Author(s):  
Robert Alan Vigersky ◽  
Michael Stone ◽  
Pratik Agrawal ◽  
Alex Zhong ◽  
Kevin Velado ◽  
...  

Abstract Introduction: The MiniMed™ 670G system was FDA-approved in 2016 for adults and adolescents ≥14 yrs, and in 2018 for children ages 7-13 yrs with T1D. Since then, use of the system has grown to over 180,000 people in the U.S. The glycemic control benefits of real-world MiniMed™ 670G system Auto Mode use in the U.S. were assessed. Methods: System data (aggregated five-minute instances of sensor glucose [SG]) uploaded from March 2017 to July 2019 by individuals (N=118,737) with T1D and ≥7 yrs of age who enabled Auto Mode were analyzed to determine the mean % of overall time spent <54 mg/dL and <70 mg/dL (TBR); between 70-180 mg/dL (TIR); and >180 mg/dL and >250 mg/dL (TAR). The impact of Auto Mode was further assessed in a sub-group of individuals (N=51,254) with at least 7 days of SG data for both Auto Mode turned ON and turned OFF. The % of TIR, TBR and TAR, and the associated glucose management indicator (GMI), were evaluated for the overall OFF (2,524,570 days) and ON (6,308,806 days) periods, and across different age groups. Results: In the full system dataset, TIR was 71.3%; TBR was 0.4% (<54 mg/dL) and 1.9% (<70 mg/dL); and TAR was 26.8% (>180 mg/dL) and 6.2% (>250 mg/dL). In the user-level comparison of Auto Mode OFF versus ON, users spent a mean of 70.3% of time in Auto Mode; TIR increased from 60.9% to 69.9%, and both TBR and TAR decreased. For those 7-13 yrs (N=1,417), TIR increased from 48.7% to 61.5%; TBR increased from 0.5% to 0.6% and from 2.0% to 2.2%, respectively; and TAR decreased from 49.3% to 36.3% and from 20.5% to 13.0%, respectively. For those 14-21 yrs (N=4,194), TIR increased from 51.0% to 61.5%; TBR decreased from 0.7% to 0.6% and from 2.3% to 2.0%, respectively; and TAR decreased from 46.7% to 36.5% and from 18.5% to 12.5%, respectively. For those ≥22 yrs (N=45,643), TIR increased from 62.2% to 70.9%; TBR decreased from 0.7% to 0.5% and from 2.6% to 1.9%, respectively; and TAR decreased from 35.2% to 27.3% and from 9.9% to 6.3%, respectively. The mean GMI decreased by 0.23% (overall), 0.48% (7-13 yrs), 0.35% (14-21 yrs), and 0.22% (≥22 yrs) with Auto Mode ON versus OFF. Discussion: In over 6 million days of real-world MiniMed™ 670G system Auto Mode use in the U.S., TIR of a large pediatric and adult population with T1D improved by 9 percentage points compared to when Auto Mode was OFF, which was comparable to or exceeded the TIR observed in the smaller pivotal trials. These results further support the outcomes of the pivotal trials and the increased glycemic control seen with system use.
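
As a point of reference for the metrics reported above, the following sketch shows how time-below/in/above-range percentages and a GMI could be computed from 5-minute sensor glucose readings. The readings are simulated and this is not Medtronic's analysis pipeline; the GMI formula is the published Bergenstal et al. (2018) approximation.

```python
# A minimal sketch (not Medtronic's pipeline) of computing time-in-range metrics and a
# glucose management indicator (GMI) from 5-minute sensor glucose readings. The readings
# are simulated; the GMI formula is the published Bergenstal et al. (2018) approximation.
import numpy as np

def cgm_metrics(sg_mg_dl):
    sg = np.asarray(sg_mg_dl, dtype=float)
    def pct(mask):
        return 100.0 * mask.mean()
    return {
        "TBR <54 mg/dL": pct(sg < 54),
        "TBR <70 mg/dL": pct(sg < 70),
        "TIR 70-180 mg/dL": pct((sg >= 70) & (sg <= 180)),
        "TAR >180 mg/dL": pct(sg > 180),
        "TAR >250 mg/dL": pct(sg > 250),
        "GMI (%)": 3.31 + 0.02392 * sg.mean(),   # estimated A1C from mean glucose
    }

# One simulated day of sensor glucose at 5-minute intervals (288 readings)
rng = np.random.default_rng(1)
sg_day = rng.normal(150, 45, size=288).clip(40, 400)
for name, value in cgm_metrics(sg_day).items():
    print(f"{name}: {value:.1f}")
```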


1974 ◽  
Vol 20 (2) ◽  
pp. 126-140 ◽  
Author(s):  
M F Mason ◽  
K M Dubowski

Abstract We give a résumé of "chemical testing" for alcohol in the United States in connection with traffic-law enforcement. Recent procedural and instrumental developments are briefly reviewed. Various factors involved in discrepancies between the results of analyses of near-simultaneous venous blood and breath specimens from the same subject are examined. Because the causes of these discrepancies cannot adequately be controlled in law-enforcement practice, we suggest that calculation of a blood-alcohol concentration based on the result of a breath analysis be abandoned. We recommend that when breath analysis is performed for law-enforcement purposes, the interpretation of the result should be statutorily based on the amount of alcohol found per unit volume of alveolar ("deep-lung") air. Serum or plasma of capillary blood is recommended as the sample when blood is to be analyzed.
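
The recommendation above turns on the fact that back-calculating a blood-alcohol concentration from a breath result requires an assumed blood:breath partition ratio (conventionally 2100:1 in U.S. practice), which varies between subjects. The brief calculation below, using a hypothetical breath result, illustrates how sensitive the computed BAC is to that assumption.

```python
# A brief numeric illustration (not from the paper) of why back-calculating a blood-alcohol
# concentration from a breath result is sensitive to the assumed blood:breath partition
# ratio. U.S. practice conventionally assumes 2100:1; individual ratios vary. The breath
# result below is hypothetical.
breath_result_g_per_210L = 0.09                         # measured breath alcohol, g/210 L
breath_g_per_ml = breath_result_g_per_210L / 210_000.0  # grams of ethanol per mL of breath

for ratio in (1900, 2100, 2300):                        # plausible range of blood:breath ratios
    bac_g_per_100ml = breath_g_per_ml * ratio * 100.0
    print(f"ratio {ratio}:1 -> estimated BAC {bac_g_per_100ml:.3f} g/100 mL")
```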


2013 ◽  
Vol 76 (2) ◽  
pp. 302-306 ◽  
Author(s):  
STEVEN M. GENDEL ◽  
NAZLEEN KHAN ◽  
MONALI YAJNIK

Despite awareness of the importance of food allergy as a public health issue, recalls and adverse reactions linked to undeclared allergens in foods continue to occur with high frequency. To reduce the overall incidence of such problems and to ensure that food-allergic consumers have the information they need to prevent adverse reactions, it is important to understand which allergen control practices are currently used by the food industry. Therefore, the U.S. Food and Drug Administration carried out directed inspections of registered food facilities in 2010 to obtain a broader understanding of industry allergen control practices in the United States. The results of these inspections show that allergen awareness and the use of allergen controls have increased greatly in the last decade, but that small facilities lag in implementing allergen controls.


Author(s):  
Richard J. Gelting ◽  
Steven C. Chapra ◽  
Paul E. Nevin ◽  
David E. Harvey ◽  
David M. Gute

Public health has always been, and remains, an interdisciplinary field, and engineering was closely aligned with public health for many years. Indeed, the branch of engineering that has been known at various times as sanitary engineering, public health engineering, or environmental engineering was integral to the emergence of public health as a distinct discipline. However, in the United States (U.S.) during the 20th century, the academic preparation and practice of this branch of engineering became largely separated from public health. Various factors contributed to this separation, including an evolution in leadership roles within public health; increasing specialization within public health; and the emerging environmental movement, which led to the creation of the U.S. Environmental Protection Agency (EPA), with its emphasis on the natural environment. In this paper, we consider these factors in turn. We also present a case study example of public health engineering in current practice in the U.S. that has had large-scale positive health impacts through improving water and sanitation services in Native American and Alaska Native communities. We also consider briefly how to educate engineers to work in public health in the modern world, and the benefits and challenges associated with that process. We close by discussing the global implications of public health engineering and the need to re-integrate engineering into public health practice and strengthen the connection between the two fields.


2020 ◽  
Vol 55 (5) ◽  
pp. 564-570
Author(s):  
Cheryl J Cherpitel ◽  
Edwina Williams ◽  
Yu Ye ◽  
William C Kerr

Abstract Aims To analyze racial/ethnic disparities in risk of two alcohol-related events, alcohol-related injury and self-reported perceived driving under the influence (DUI), from hours of exposure to an elevated blood alcohol concentration (BAC). Methods Risk curves for the predicted probability of these two outcomes from the number of hours of exposure to a BAC ≥ 0.08% in the past year were analyzed separately for whites, blacks and Hispanics in a merged sample of respondents from four US National Alcohol Surveys (2000–2015). Results Hours of exposure to a BAC ≥ 0.08% showed a stronger association with perceived DUI than with alcohol-related injury for all racial/ethnic groups. Greater risk was found for whites than blacks or Hispanics for outcomes at nearly all BAC exposure levels, and was most marked at the highest level of exposure. Risk of both outcomes was significant for whites at all exposure levels, but small for alcohol-related injury. Little association was found for alcohol-related injury for blacks or Hispanics. For perceived DUI, risk for blacks was significantly elevated at lower levels of exposure, while risk for Hispanics was significantly elevated beginning at 30 h of exposure. Conclusions Findings showed racial/ethnic differences in risk of alcohol-related injury and perceived DUI from hours of exposure to elevated BAC. Risk increased at relatively low levels of exposure to a BAC ≥ 0.08%, especially for whites, highlighting the importance of preventive efforts to reduce harmful outcomes for moderate drinkers.
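
The risk curves described above are, in essence, predicted probabilities from a regression of each outcome on hours of exposure. The sketch below fits a simple logistic risk curve to simulated data as an illustration; it is not the survey analysis code, and the data and coefficients are assumptions.

```python
# A minimal sketch of fitting a risk curve like those described above: a logistic regression
# of a binary outcome on hours of exposure, with predicted probabilities over the exposure
# range. Data and coefficients are simulated assumptions, not the survey analysis.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
hours = rng.gamma(shape=1.5, scale=20.0, size=2000)    # hours/year with BAC >= 0.08 (simulated)
p_true = 1.0 / (1.0 + np.exp(-(-3.0 + 0.02 * hours)))  # assumed true dose-response
dui = rng.binomial(1, p_true)                          # self-reported DUI outcome (simulated)

fit = sm.Logit(dui, sm.add_constant(hours)).fit(disp=False)

grid = np.linspace(0, 200, 11)                         # exposure grid for the risk curve
pred = fit.predict(sm.add_constant(grid))
for h, p in zip(grid, pred):
    print(f"{h:5.0f} h/yr -> predicted probability of DUI {p:.3f}")
```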

