Reactive Biomaterial for the Treatment of Herbicide Contaminated Drinking Water: Atrazine Dechlorination

Author(s):  
Eduardo Reátegui ◽  
Erik Reynolds ◽  
Lisa Kasinkas ◽  
Amit Aggarwal ◽  
Michael J. Sadowsky ◽  
...  

The herbicide atrazine is used for control of broadleaf weeds, principally in corn, sorghum, and sugarcane [1]. Atrazine is currently used in 70 countries at an estimated annual rate of 111,000 tons [2, 3]. Atrazine is typically applied early in the planting season. However, heavy rainfall events shortly after application may lead to detectable atrazine concentrations in waterways and in drinking-water supplies. The United States Environmental Protection Agency has established a 3 ppb limit for atrazine in drinking water. In some instances, municipal water treatment plants use chemicals and other treatment processes, such as activated carbon, to reduce atrazine to below the 3 ppb legal limit for drinking water.

2020 ◽  
Vol 6 (3) ◽  
Author(s):  
William A. Horn ◽  
Joshua D. Beard

The Michigan Department of Environment, Great Lakes, and Energy (“EGLE”), formerly the Michigan Department of Environmental Quality, is in the process of seeking primary enforcement responsibility from the United States Environmental Protection Agency (“EPA”) for its Underground Injection Control (“UIC”) program for Class II wells pursuant to Part C of the Safe Drinking Water Act (“SDWA”).


1985 ◽  
Vol 17 (4-5) ◽  
pp. 689-700 ◽  
Author(s):  
Elmer W. Akin

Health concerns regarding waterborne transmission of enteric viruses began to develop around 1940 in the United States (U.S.) with the isolation of poliovirus from human feces and sewage. The implication of these isolations for the transmission of viral disease through contaminated drinking water stimulated research on methodology for virus detection, recovery, and assessment from water. Although virus methods research is still an important area of study, relatively sensitive procedures became available during the past decade for recovering many enteric virus types from large-volume samples of drinking water. Controversy surrounded many of the early reported isolations of viruses from treated drinking water using these procedures, due to the suspicion of laboratory contamination. The occurrence of viruses in drinking water treated by currently accepted procedures has still not been proven by the U.S. experience, although the likelihood may be gaining support. However, a virus survey of 54 water supplies and extensive studies of two water systems by the U.S. Environmental Protection Agency did not demonstrate viral contamination of treated water derived from surface sources.


2015 ◽  
Vol 15 (4) ◽  
pp. 766-772 ◽  
Author(s):  
Piyawan Leechart ◽  
Duangrat Inthorn ◽  
Paitip Thiravetyan

Nowadays polyethylene terephthalate (PET) bottles are commonly used as food containers as they are lightweight. PET bottles contain antimony (Sb) and phthalate compounds. In contact with food, antimony and phthalate molecules can migrate from the inner surface of a PET bottle into the food. Therefore, we studied the effect of NaCl concentration on the leakage of antimony and phthalates from PET bottles. It was found that the concentration of antimony leached into the solution was about 6 ng l−1 after 5 days of storage at room temperature in the absence of NaCl. Increasing the NaCl concentration to 6% decreased the amount of soluble antimony in the solution to 2 ng l−1 under the same conditions. In addition, the maximum leakage of phthalate compounds, about 130 ng l−1, occurred after 35 days of storage at 60 °C in 0.1% NaCl. The leakage of phthalate compounds decreased at higher NaCl concentrations (0.5%–30% NaCl). Higher NaCl concentrations thus led to a decrease in the migration of antimony and phthalate compounds into the solution, possibly because antimony and phthalate compounds form complexes with NaCl. However, the leakage of these compounds was lower than the standard guidelines of the United States Environmental Protection Agency for drinking water.


Author(s):  
J. R. Millette ◽  
R. S. Brown

The United States Environmental Protection Agency (EPA) has labeled as “friable” those building materials that are likely to readily release fibers. Friable materials, when dry, can easily be crumbled, pulverized, or reduced to powder using hand pressure. Other asbestos-containing building materials (ACBM), in which the asbestos fibers are in a matrix of cement or bituminous or resinous binders, are considered non-friable. However, when subjected to sanding, grinding, cutting, or other forms of abrasion, these non-friable materials are to be treated as friable asbestos material. It has been hypothesized that all raw asbestos fibers are encapsulated in solvents and binders and are not released as individual fibers if the material is cut or abraded. Examination of a number of different types of non-friable materials under the SEM shows that after cutting or abrasion, tufts or bundles of fibers are evident on the surfaces of the materials. When these tufts or bundles are examined, they are shown to contain asbestos fibers that are free from binder material. These free fibers may be released into the air upon further cutting or abrasion.


1989 ◽  
Vol 21 (6-7) ◽  
pp. 685-698
Author(s):  
J. J. Convery ◽  
J. F. Kreissl ◽  
A. D. Venosa ◽  
J. H. Bender ◽  
D. J. Lussier

Technology transfer is an important activity within the U.S. Environmental Protection Agency. Specific technology transfer programs, such as the activities of the Center for Environmental Research Information, the Innovative and Alternative Technology Program, and the Small Community Outreach Program, are used to encourage the utilization of cost-effective municipal pollution control technology. Case studies of three technologies, including a plant operations diagnostic/remediation methodology, alternative sewer technologies, and ultraviolet disinfection, are presented. These case studies are presented retrospectively in the context of a generalized concept of how technology flows from science to utilization, developed in a study by Allen (1977). Additional insights from this study are presented on the information-gathering characteristics of engineers and scientists, which may be useful in designing technology transfer programs. The recognition of the need for a technology, or of a deficiency in current practice, is an important stimulus other than technology transfer for accelerating the utilization of new technology.


2015 ◽  
Vol 14 (2) ◽  
pp. 223-235 ◽  
Author(s):  
Katherine Phetxumphou ◽  
Siddhartha Roy ◽  
Brenda M. Davy ◽  
Paul A. Estabrooks ◽  
Wen You ◽  
...  

The United States Environmental Protection Agency mandates that community water systems (CWSs), or drinking water utilities, provide annual consumer confidence reports (CCRs) reporting on water quality, compliance with regulations, source water, and consumer education. While certain report formats are prescribed, there are no criteria ensuring that consumers understand messages in these reports. To assess clarity of message, trained raters evaluated a national sample of 30 CCRs using the Centers for Disease Control Clear Communication Index (Index) indices: (1) Main Message/Call to Action; (2) Language; (3) Information Design; (4) State of the Science; (5) Behavioral Recommendations; (6) Numbers; and (7) Risk. Communication materials are considered qualifying if they achieve a 90% Index score. Overall mean score across CCRs was 50 ± 14% and none scored 90% or higher. CCRs did not differ significantly by water system size. State of the Science (3 ± 15%) and Behavioral Recommendations (77 ± 36%) indices were the lowest and highest, respectively. Only 63% of CCRs explicitly stated if the water was safe to drink according to federal and state standards and regulations. None of the CCRs had passing Index scores, signaling that CWSs are not effectively communicating with their consumers; thus, the Index can serve as an evaluation tool for CCR effectiveness and a guide to improve water quality communications.
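The Index's pass/fail logic described above can be sketched as a points-earned over points-applicable percentage against the 90% qualifying threshold. The threshold and the failing mean score come from the abstract; the scoring helper itself is a hypothetical illustration, not the CDC's official worksheet:

```python
# Sketch of Clear Communication Index-style scoring: a material earns points
# on the Index items that apply to it and qualifies only if the resulting
# percentage reaches 90%. Hypothetical helper, not the CDC scoring tool.

QUALIFYING_THRESHOLD = 90.0  # percent, per the CDC Index

def index_score(points_earned: int, points_applicable: int) -> float:
    """Return the Index score as a percentage of applicable points."""
    if points_applicable == 0:
        raise ValueError("at least one Index item must apply")
    return 100.0 * points_earned / points_applicable

def qualifies(points_earned: int, points_applicable: int) -> bool:
    """A communication material qualifies at 90% or above."""
    return index_score(points_earned, points_applicable) >= QUALIFYING_THRESHOLD

# The study's mean CCR score of about 50% falls well short of qualifying:
assert not qualifies(10, 20)  # 50% -> fails
assert qualifies(19, 20)      # 95% -> passes
```

Under this scheme, none of the 30 sampled CCRs reached the threshold, which is what the abstract reports.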


2006 ◽  
Vol 4 (S2) ◽  
pp. 201-240 ◽  
Author(s):  
Michael Messner ◽  
Susan Shaw ◽  
Stig Regli ◽  
Ken Rotert ◽  
Valerie Blank ◽  
...  

In this paper, the US Environmental Protection Agency (EPA) presents an approach and a national estimate of drinking water related endemic acute gastrointestinal illness (AGI) that uses information from epidemiologic studies. There have been a limited number of epidemiologic studies that have measured waterborne disease occurrence in the United States. For this analysis, we assume that a certain unknown incidence of AGI in each public drinking water system is due to drinking water and that a statistical distribution of the different incidence rates for the population served by each system can be estimated to inform a mean national estimate of AGI illness due to drinking water. Data from public water systems suggest that the incidence rate of AGI due to drinking water may vary by several orders of magnitude. In addition, data from epidemiologic studies show AGI incidence due to drinking water ranging from essentially none (or less than the study detection level) to a rate of 0.26 cases per person-year. Considering these two perspectives collectively, and associated uncertainties, EPA has developed an analytical approach and model for generating a national estimate of annual AGI illness due to drinking water. EPA developed a national estimate of waterborne disease to address, in part, the 1996 Safe Drinking Water Act Amendments. The national estimate uses the best available science, but also recognizes gaps in the data to support some of the model assumptions and uncertainties in the estimate. Based on the model presented, EPA estimates a mean incidence of AGI attributable to drinking water of 0.06 cases per person-year (with a 95% credible interval of 0.02–0.12). The mean estimate represents approximately 8.5% of cases of AGI due to all causes among the population served by community water systems. The estimated incidence translates to 16.4 million cases/year among the same population.
The estimate illustrates the potential usefulness and challenges of the approach, and provides a focus for discussions of data needs and future study designs. Areas of major uncertainty that currently limit the usefulness of the approach are discussed in the context of the estimate analysis.
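The headline numbers in the abstract are internally consistent, as a back-of-envelope check shows. The served-population figure of roughly 273 million is implied by the abstract's own numbers, not stated in it:

```python
# Back-of-envelope check of the EPA estimate: 0.06 AGI cases per person-year
# across the population served by community water systems should reproduce
# the abstract's 16.4 million cases/year figure.

mean_incidence = 0.06   # cases per person-year attributable to drinking water
total_cases = 16.4e6    # cases/year among the CWS-served population

# Population implied by the two figures above (~273 million; not stated
# in the abstract, derived here for illustration only):
implied_population = total_cases / mean_incidence
assert abs(implied_population - 273e6) < 5e6

# The 8.5% attributable fraction then implies an all-cause AGI rate of
# roughly 0.06 / 0.085, i.e. about 0.7 cases per person-year:
all_cause_rate = mean_incidence / 0.085
assert 0.6 < all_cause_rate < 0.8
```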


Circulation ◽  
2015 ◽  
Vol 132 (suppl_3) ◽  
Author(s):  
James L Crooks ◽  
Wayne Cascio ◽  
Madelyn Percy ◽  
Jeanette Reyes ◽  
Lucas Neas ◽  
...  

Introduction: Extreme weather events such as dust storms are predicted to become more frequent as the global climate warms through the 21st century. Studies of Asian, Saharan, Arabian, and Australian dust storms have found associations with cardiovascular and total non-accidental mortality and hospitalizations for stroke. However, the only population-level epidemiological work on dust storms in the United States was focused on a single small metropolitan area (Spokane, WA), and it is uncertain whether its null results are representative of the country as a whole. Hypothesis: Dust storms in the United States are associated with daily cardiovascular mortality. Methods: Dust storm incidence data (N=141), including date and approximate location, as well as meteorological station observations, were taken from the U.S. National Weather Service. County-level mortality data for the years 1993-2005 were acquired from the National Center for Health Statistics. Ambient particulate matter monitor concentrations were obtained from the U.S. Environmental Protection Agency. Inference was performed using conditional logistic regression models under a case-crossover design while accounting for the nonlinear effect of temperature. Results: We found a 9.5% increase in cardiovascular mortality at a two-day lag (95% CI: [0.31%,19.5%], p = 0.042). The results were robust to adjusting for heat waves and ambient particulate matter concentrations. Analysis of storms occurring only on days with <0.1 inches of precipitation strengthened these results and in addition yielded a mean daily increase of 4.0% across lags 0-5 (95% CI: [0.07%,20.8%], p = 0.046). In Arizona, the U.S. state with the largest number of storms, we observed a 13.0% increase at a three-day lag (CI: [0.40%,27.1%], p = 0.043). Conclusions: Dust storms in the U.S. are associated with increases in lagged cardiovascular mortality.
This has implications for the development of public health advisories and suggests that further public health interventions may be needed. Disclaimer: This work does not represent official U.S. Environmental Protection Agency policy.
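The percent increases reported above are the standard transformation of a conditional-logistic coefficient, percent = (exp(β) − 1) × 100. A minimal sketch of the conversion; the β value below is back-computed from the reported 9.5% figure for illustration, not taken from the paper:

```python
import math

def percent_increase(beta: float) -> float:
    """Convert a (conditional) logistic regression coefficient into the
    percent increase in odds reported in case-crossover studies."""
    return (math.exp(beta) - 1.0) * 100.0

def beta_from_percent(pct: float) -> float:
    """Inverse: recover the log-odds coefficient from a percent increase."""
    return math.log(1.0 + pct / 100.0)

# The reported 9.5% increase at a two-day lag corresponds to a log-odds
# coefficient of about 0.091 (back-computed, illustrative only):
beta = beta_from_percent(9.5)
assert abs(beta - 0.0908) < 1e-3
assert abs(percent_increase(beta) - 9.5) < 1e-9
```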

