Corrosion control using hydroxide and bicarbonate alkalising agents in drinking water processes

2015 · Vol 8 (1) · pp. 53-76
Author(s): P. Torres-Lozada, K. A. Bueno-Zabala, L. G. Delgado-Cabrera, L. E. Barba-Ho, C. H. Cruz-Vélez

Abstract. The water supply industry faces three phenomena of great importance: aggressiveness, corrosion, and incrustation in water distribution systems (WDS). These are primarily due to the low alkalinity of water sources and to the chemicals added during treatment, and they require pH adjustment in the last stage of treatment before water enters the WDS. This article presents the results of using Ca(OH)₂ and NaOH at doses between 2 and 20 mg L⁻¹, and NaHCO₃ and Na₂CO₃ at doses between 10 and 250 mg L⁻¹, to adjust the pH of treated water from the Cauca River in Cali, Colombia, applying the stabilisation indices normally used in water treatment plants to monitor pH and to better predict the behaviour of water in the WDS. The results indicate that, for the low-alkalinity surface water source studied, all of the evaluated alkalising agents except NaHCO₃ can create conditions that lead to the precipitation of a protective calcium carbonate film. Because the pH values that guarantee an adequate adjustment (8.7-9.0) are higher than those specified by the Colombian water code, and because other international rules indicate that these values do not compromise the health of consumers, it is advisable to review and adjust the code in this respect.
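
The abstract does not name the stabilisation indices used; the Langelier Saturation Index (LSI) is the most common such index in treatment-plant practice. Below is a minimal sketch of the LSI using the widely cited Carrier approximation; the input values are hypothetical illustrations, not data from the paper.

```python
import math

def langelier_saturation_index(ph, tds_mg_l, temp_c, ca_mg_l_as_caco3, alk_mg_l_as_caco3):
    """Langelier Saturation Index (Carrier approximation).

    LSI > 0: water tends to precipitate a protective CaCO3 film;
    LSI < 0: water is aggressive and tends to dissolve CaCO3.
    """
    a = (math.log10(tds_mg_l) - 1) / 10                # total dissolved solids term
    b = -13.12 * math.log10(temp_c + 273.15) + 34.55   # temperature term
    c = math.log10(ca_mg_l_as_caco3) - 0.4             # calcium hardness term
    d = math.log10(alk_mg_l_as_caco3)                  # total alkalinity term
    ph_s = (9.3 + a + b) - (c + d)                     # pH of CaCO3 saturation
    return ph - ph_s

# Hypothetical low-alkalinity surface water after pH adjustment to 9.0:
print(round(langelier_saturation_index(9.0, tds_mg_l=120, temp_c=24,
                                       ca_mg_l_as_caco3=40, alk_mg_l_as_caco3=30), 2))
# ~0.17: slightly positive, i.e. conditions favouring a protective film
```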

2021 · Vol 1 (1)
Author(s): Sybil Derrible, Thanh T. M. Truong, Hung T. Pham, Quan H. Nguyen

Abstract. In many countries, water distribution systems consist of large, highly pressurized pipe networks that require an excessive amount of energy and that are vulnerable to large-scale contamination. To imagine the future of water distribution, we can learn from Hanoi, Vietnam, where water is distributed at low pressure and most buildings are equipped with a basement tank, a rooftop tank, and separate water treatment processes, resulting in a system that consumes less energy and is more resilient.
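
The energy claim can be made concrete with a back-of-the-envelope hydraulic calculation. The sketch below compares the pumping energy per cubic metre for a conventional highly pressurized network against a low-pressure network where buildings lift water to rooftop tanks; the heads and the pump efficiency are assumptions for illustration, not figures from the paper.

```python
RHO = 1000.0  # water density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def pumping_energy_kwh_per_m3(head_m, pump_efficiency=0.7):
    """Electrical energy to move 1 m^3 of water against `head_m` of head."""
    joules_per_m3 = RHO * G * head_m / pump_efficiency
    return joules_per_m3 / 3.6e6  # J -> kWh

# Assumed heads (illustrative only):
print(f"{pumping_energy_kwh_per_m3(40):.3f} kWh/m^3")  # ~0.156: high-pressure network at ~40 m head
print(f"{pumping_energy_kwh_per_m3(15):.3f} kWh/m^3")  # ~0.058: low-pressure mains plus rooftop-tank lift
```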


2002 · Vol 2 (4) · pp. 97-104
Author(s): S. Okabe, T. Kokazi, Y. Watanabe

When biodegradable organic matter and other nutrients, such as ammonia and phosphorus, are not sufficiently removed during water treatment, bacteria may proliferate in the water distribution system. Bacterial regrowth deteriorates water quality (taste and odor), accelerates corrosion, and potentially increases the risk of microbial diseases. This research was therefore conducted to evaluate the impact of four different advanced water treatment processes, including biological treatments such as a rotating biofilm membrane reactor (RBMR) and a biological activated carbon (BAC) filter, as well as ultrafiltration (UF), on the reduction of nutrient levels and on the biofilm formation potential of the treated water entering model distribution systems (annular reactors). Our results revealed that the biological treatments significantly improved the "biostability" of water leaving the treatment plant. On average, the RBMR and BAC filter reduced the concentration of easily assimilable organic carbon (AOC) by half compared with conventional treatment (multi-media filtration; MF) and ultrafiltration (from 35-49 to 18-23 µg C L⁻¹). Consequently, the biofilm formation potential was reduced by a factor of 5 to 10 (from 3,200-5,100 to 490-710 pg ATP cm⁻²). With respect to the "biostability" of water, ultrafiltration was less effective in reducing AOC concentrations. In addition, the impact of chlorine disinfection on biofilm accumulation and AOC levels in the distribution system was studied.
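
A quick arithmetic check of the reduction factors implied by the reported ranges; the concentration ranges are from the abstract, while the best-case/worst-case pairing of endpoints is an assumption for illustration.

```python
# Ranges reported in the abstract.
aoc_conventional = (35, 49)      # µg C per L, MF and UF effluent
aoc_biological = (18, 23)        # µg C per L, RBMR and BAC effluent
bfp_conventional = (3200, 5100)  # pg ATP per cm^2
bfp_biological = (490, 710)      # pg ATP per cm^2

aoc_lo = aoc_conventional[0] / aoc_biological[1]   # ~1.5x
aoc_hi = aoc_conventional[1] / aoc_biological[0]   # ~2.7x ("reduced by half")
bfp_lo = bfp_conventional[0] / bfp_biological[1]   # ~4.5x
bfp_hi = bfp_conventional[1] / bfp_biological[0]   # ~10.4x ("factor of 5 to 10")
print(f"AOC reduced {aoc_lo:.1f}-{aoc_hi:.1f}x; biofilm potential reduced {bfp_lo:.1f}-{bfp_hi:.1f}x")
```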


2015 · Vol 20 (24)
Author(s): B Guzman-Herrador, A Carlander, S Ethelberg, B Freiesleben de Blasio, M Kuusi, ...

A total of 175 waterborne outbreaks affecting 85,995 individuals were notified to the national outbreak surveillance systems in Denmark, Finland and Norway from 1998 to 2012, and in Sweden from 1998 to 2011. Between 4 and 18 outbreaks were reported each year during this period. Outbreaks occurred throughout the countries in all seasons but were most common (n = 75/169; 44%) between June and August. Viruses of the family Caliciviridae and Campylobacter were the pathogens most frequently involved, accounting for n = 51 (41%) and n = 36 (29%) of the 123 outbreaks with known aetiology, respectively. Although only a few outbreaks were caused by parasites (Giardia and/or Cryptosporidium), they included the largest outbreaks reported during the study period, affecting up to 53,000 persons. Most of the outbreaks with a known water source (124/163; 76%) were linked to groundwater. A large proportion of the outbreaks (n = 130/170; 76%) affected fewer than 100 people each and were linked to single-household water supplies. However, in 11 (6%) of the outbreaks, more than 1,000 people became ill. Although outbreaks of this size are rare, they highlight the need for increased awareness, particularly of parasites; correct water treatment regimens; and vigilant management and maintenance of the water supply and distribution systems.
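
The denominators differ across the reported proportions (outbreaks with a known month, known aetiology, known source); a short script makes the bookkeeping explicit using only the figures given above.

```python
def pct(n, total):
    """Percentage rounded to the nearest whole number."""
    return round(100 * n / total)

print(pct(75, 169))                # 44: June-August, of outbreaks with a known month
print(pct(51, 123), pct(36, 123))  # 41 and 29: Caliciviridae and Campylobacter, of known aetiology
print(pct(124, 163))               # 76: groundwater-linked, of outbreaks with a known source
print(pct(130, 170))               # 76: fewer than 100 people affected
print(pct(11, 175))                # 6: more than 1,000 people affected, of all 175 outbreaks
```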


2012 · Vol 12 (5) · pp. 580-587
Author(s): Stephen Mounce, John Machell, Joby Boxall

Safe, clean drinking water is a foundation of society, and water quality monitoring can contribute to ensuring it. A case study application of the CANARY software to historic data from a UK drinking water distribution system is described. Sensitivity studies explored appropriate algorithmic parameter settings for a baseline site, performance was evaluated with artificial events, and the system was then transferred to all sites. Results are presented for nine water quality sensors measuring six parameters, deployed in three connected district meter areas (DMAs) fed from a single water source (a service reservoir), over a 1-year period. Evaluated against comprehensive water utility records, 86% of event clusters were successfully correlated to causes (spatially resolved to DMA level). False negatives, defined as temporal clusters of water quality complaints in the pilot area without corresponding detections, were only approximately 25%. It was demonstrated that the software could be configured and applied retrospectively (with potential for future near-real-time application) to detect various types of water quality event (with a wider remit than contamination alone) for further interpretation.
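
CANARY is the open-source event detection tool developed by Sandia National Laboratories with the US EPA; its actual detectors (for example, linear prediction filters combined with a binomial event discriminator) are more elaborate than can be shown here. The sketch below is a toy single-parameter detector in the same spirit, a trailing-window robust outlier test with a consecutive-sample run requirement, and is not CANARY's implementation.

```python
import statistics
from collections import deque

def detect_events(series, window=96, threshold=3.0, min_run=3):
    """Flag the start index of runs where readings depart from a moving baseline.

    A reading further than `threshold` robust standard deviations (MAD-based)
    from the trailing-window median, sustained for `min_run` consecutive
    samples, is reported as an event.
    """
    history = deque(maxlen=window)
    events, run = [], 0
    for i, x in enumerate(series):
        if len(history) == window:
            med = statistics.median(history)
            mad = statistics.median(abs(v - med) for v in history) or 1e-9
            if abs(x - med) / (1.4826 * mad) > threshold:
                run += 1
                if run == min_run:              # report once, at the run's start
                    events.append(i - min_run + 1)
            else:
                run = 0
        history.append(x)
    return events

# Synthetic chlorine-residual trace with a step change at sample 200:
trace = [0.5] * 200 + [0.9] * 10 + [0.5] * 50
print(detect_events(trace))  # -> [200]
```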


Energies · 2020 · Vol 13 (23) · pp. 6221
Author(s): Jedrzej Bylka, Tomasz Mróz

The water supply system is one of the most important elements in a city. Currently, many cities struggle with water deficits. Water is a commonly available resource that covers the majority of the Earth's surface; in many cases, however, its quality makes it impossible to use as drinking water. To treat and distribute water, a certain amount of energy must be supplied to the system. An important goal of water utility operators is to assess the energy efficiency of the processes and components. Energy assessments are usually limited to the calculation of energy dissipation (sometimes called "energy loss"). From a physical point of view, the formulation "energy loss" is incorrect: energy in water transport systems is not consumed but only transformed (dissipated) into other, less usable forms. What the water supply process consumes is the quality of the energy, its exergy (the ability to be converted into other forms); hence, a new evaluation approach is needed. The motivation for this study was the lack of tools for the exergy evaluation of water distribution systems. A model of the exergy balances of a water distribution system was proposed and tested on selected case studies of a water supply system and a water treatment station. The tool developed allows the places with the highest exergy destruction to be identified. In the analysed case studies, the highest exergy destruction resulted from excess pressure (3939 kWh in the water supply system and 1082 kWh in the water treatment plant). Exergy analysis assesses the system more accurately than the commonly used energy-based methods. The results can be used for assessing and planning water supply system modernisation.
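
The headline numbers can be related to the underlying physics. For an incompressible stream, the mechanical exergy carried per unit volume by excess pressure is simply rho·g·h; when that pressure is throttled away (for example, at a pressure-reducing valve), the exergy is destroyed. A minimal sketch follows, with the 20 m excess head purely an assumed figure for illustration, not a value from the paper.

```python
RHO = 1000.0  # water density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def excess_pressure_exergy_kwh(volume_m3, excess_head_m):
    """Mechanical exergy (kWh) carried by `volume_m3` of water held at
    `excess_head_m` of head above what the service actually requires.
    E = rho * g * h * V, converted from joules to kilowatt-hours."""
    return RHO * G * excess_head_m * volume_m3 / 3.6e6

# Volume consistent with the 3939 kWh of destruction reported for the water
# supply system, if the excess head averaged an assumed 20 m:
volume_m3 = 3939 * 3.6e6 / (RHO * G * 20)
print(f"{volume_m3:,.0f} m^3")                                 # ~72,275 m^3
print(f"{excess_pressure_exergy_kwh(volume_m3, 20):.0f} kWh")  # 3939, closing the loop
```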


2016 · Vol 79 (6) · pp. 1021-1025
Author(s): Mary Theresa Callahan, Sasha C. Marine, Kathryne L. Everts, Shirley A. Micallef

Abstract. Irrigation water distribution systems are used to supply water to produce crops, but the system may also provide a protected environment for the growth of human pathogens present in irrigation water. In this study, the effects of drip tape installation depth and sanitization on the microbial quality of irrigation groundwater were evaluated. Drip tape lines were installed on the soil surface or 5 or 10 cm below the soil surface. Water samples were collected from the irrigation source and from the end of each drip line every 2 weeks over an 11-week period, and the levels of Escherichia coli, total coliforms, aerobic mesophilic bacteria, and enterococci were quantified. Half of the lines installed at each depth were flushed with sodium hypochlorite for 1 h during week 6 to achieve a residual of 10 ppm at the end of the line. There was a statistically significant (P = 0.01) effect of drip tape installation depth and sanitizer application on the recovery of E. coli, with increased levels measured at the 5-cm depth and in nonsanitized lines, although the levels were at the limit of detection, potentially confounding the results. There was no significant effect of drip tape depth on total coliforms, aerobic mesophiles, or enterococci. In contrast, a statistically significant increase (P < 0.01) in the recovery of total coliforms was recorded from the ends of lines that received chlorine. This may indicate shedding of cells owing to the degradation of biofilms that had formed on the inner walls of the lines. These findings emphasize the need to better understand the conditions that may lead to corrosion and to increases in bacterial loads inside drip lines during flushing. Recommendations to growers should suggest collecting groundwater samples for testing at the end of drip lines rather than at the source. Guidelines on flushing drip lines with chlorine may need to include water pH monitoring, a parameter that influences the corrosive properties of chlorine.
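
The closing point about pH follows from free-chlorine speciation: hypochlorous acid (HOCl), the stronger oxidant and the more corrosive species, dominates at low pH, while hypochlorite ion (OCl-) dominates at high pH, with pKa of about 7.5 at 25 °C. A minimal sketch of the acid-base fraction, independent of any data in the study:

```python
def hocl_fraction(ph, pka=7.5):
    """Fraction of free chlorine present as hypochlorous acid (HOCl).

    HOCl <-> H+ + OCl-, pKa ~= 7.5 at 25 degC, so the HOCl fraction
    follows directly from the Henderson-Hasselbalch relation.
    """
    return 1.0 / (1.0 + 10 ** (ph - pka))

for ph in (6.0, 7.0, 7.5, 8.0, 9.0):
    print(f"pH {ph:.1f}: {hocl_fraction(ph):.0%} HOCl")
# pH 6.0 -> 97%, pH 7.5 -> 50%, pH 9.0 -> 3%: the lower the pH, the more
# oxidising (and corrosive) a 10 ppm flush becomes, hence the suggestion to
# pair chlorine flushing guidelines with pH monitoring.
```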


Biofilms · 2005 · Vol 2 (3) · pp. 197-227
Author(s): R. T. Bachmann, R. G. J. Edyvean

Biofouling in water distribution systems has, arguably, affected our lives for more than 3500 years. It may be defined as the undesirable accumulation of biotic matter on a surface; it can cause odour and taste problems, deteriorate pipe materials and fittings, and result in the discoloration of water. Early efforts to combat these problems included the use of sedimentation tanks, disinfection by silver ionization, and cleaning of the distribution network. At the turn of the nineteenth century, rapid sand filtration and water disinfection became widely used and helped to reduce the organic and bacterial load in drinking water. A better understanding of the role and causes of biofouling in water distribution systems resulted in various pieces of legislation, which in turn have been a driving factor in improving or developing new water treatment methods, pipe materials, analytical techniques, and so on. However, the increasing water quality requirements of the late twentieth century made water treatment and specific anti-corrosion and/or microbial control regimens insufficient for solving the transportation problem, owing to the heterogeneity of pipe materials and to contamination from outside the distribution system. Furthermore, as drinking water passes through the mains it undergoes a series of quality changes owing to interactions with the pipe walls, bacteria, and the sediment phase. This review emphasizes the extent to which biofouling depends on interactions between microorganisms and (1) nutrients, (2) environmental conditions (temperature), (3) physicochemical processes (sedimentation, corrosion, disinfection), and (4) pipe material. A good knowledge of these complex interactions is necessary for implementing a successful biofouling control strategy.

