Simulating sediment discharge at water treatment plants under different land use scenarios using cascade modelling with an expert-based erosion-runoff model and a deep neural network

2021
Vol 25 (12)
pp. 6223-6238
Author(s):
Edouard Patault
Valentin Landemaine
Jérôme Ledun
Arnaud Soulignac
Matthieu Fournier
...  

Abstract. Excessive sediment discharge in karstic regions can be highly disruptive to water treatment plants. It is essential for catchment stakeholders and drinking water suppliers to limit the impact of high sediment loads on potable water supply, but their strategic choices must be based on simulations integrating surface and groundwater transfers and taking into account possible changes in land use. Karstic environments are particularly challenging as they face a lack of accurate physical descriptions for the modelling process, and they can be particularly complex to predict due to the non-linearity of the processes generating sediment discharge. The aim of the study was to assess the sediment discharge variability at a water treatment plant according to multiple realistic land use scenarios. To reach that goal, we developed a new cascade modelling approach with an erosion-runoff geographic information system (GIS) model (WaterSed) and a deep neural network. The model was used in the Radicatel hydrogeological catchment (106 km2 in Normandy, France), where karstic spring water is extracted to a water treatment plant. The sediment discharge was simulated for five design storms under current land use and compared to four land use scenarios (baseline, ploughing up of grassland, eco-engineering, best farming practices, and coupling of eco-engineering/best farming practices). Daily rainfall time series and WaterSed modelling outputs extracted at connected sinkholes (positive dye tracing) were used as input data for the deep neural network model. The model structure was found by a classical trial-and-error procedure, and the model was trained on 2 significant hydrologic years. Evaluation on a test set showed a good performance of the model (NSE = 0.82), and the application of a monthly backward-chaining nested cross-validation revealed that the model is able to generalize on new datasets. Simulations made for the four land use scenarios suggested that ploughing up 33 % of grasslands would increase sediment discharge at the water treatment plant by 5 % on average. By contrast, eco-engineering and best farming practices will significantly reduce sediment discharge at the water treatment plant (respectively in the ranges of 10 %–44 % and 24 %–61 %). The coupling of these two strategies is the most efficient since it affects the hydro-sedimentary production and transfer processes (decreasing sediment discharge from 40 % to 80 %). The cascade modelling approach developed in this study offers interesting opportunities for sediment discharge prediction at karstic springs or water treatment plants under multiple land use scenarios. It also provides robust decision-making tools for land use planning and drinking water suppliers.
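
As a rough illustration of the cascade idea described in this abstract, the sketch below feeds daily rainfall and WaterSed-derived variables at connected sinkholes into a small dense network that predicts daily sediment discharge, scored with the Nash-Sutcliffe efficiency (NSE). The file name, feature names and network size are assumptions made for illustration; the authors' architecture and preprocessing are not reproduced here.

```python
# Minimal sketch of the cascade idea: WaterSed outputs at connected sinkholes
# plus daily rainfall feed a small dense network that predicts sediment
# discharge (SD) at the treatment plant. Feature names, file layout and the
# network size are illustrative assumptions, not the authors' setup.
import numpy as np
import pandas as pd
from tensorflow import keras

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency, the score reported in the abstract (NSE = 0.82)."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical daily table: rainfall and WaterSed runoff/sediment flux at the
# connected sinkholes as inputs, observed SD at the plant as the target.
df = pd.read_csv("radicatel_daily.csv")                      # assumed file
features = ["rainfall_mm", "watersed_runoff_m3", "watersed_sed_flux_t"]
X, y = df[features].to_numpy(), df["sd_obs_t"].to_numpy()

model = keras.Sequential([
    keras.layers.Input(shape=(len(features),)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1),                                   # daily SD
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=200, batch_size=32, validation_split=0.2, verbose=0)

print("NSE on training data:", nash_sutcliffe(y, model.predict(X).ravel()))
```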

2020
Author(s):
Edouard Patault
Valentin Landemaine
Jérôme Ledun
Arnaud Soulignac
Matthieu Fournier
...  

Abstract. Excessive sediment discharge at karstic springs, and thus at water treatment plants, can be highly disruptive. It is essential for catchment stakeholders and drinking water suppliers to reduce the impact of sediment on potable water supply, but their strategic choices must be based on simulations integrating surface and groundwater transfers and taking into account possible changes in land use. Karstic environments are particularly challenging as they lack an accurate physical description for the modelling process, and they can be seen as a black box due to the non-linearity of the processes generating sediment discharge. The aim of the study was to assess the sediment discharge variability at a water treatment plant according to multiple realistic land use scenarios. To reach that goal, we developed a new coupled modelling approach with an erosion-runoff GIS model (WaterSed) and a deep neural network. The model was used in the Radicatel catchment (106 km2 in Normandy, France), where karstic spring water is extracted to a water treatment plant. The sediment discharge was simulated for five design storms under current land use and compared to three land use scenarios (baseline, ploughing up of grassland, eco-engineering, best farming practices). Daily rainfall time series and WaterSed modelling outputs extracted at connected sinkholes were used as input data for the deep neural network model. The model structure was found by a classical trial-and-error procedure, and the model was trained on two significant hydrologic years. Evaluation on a test set showed a good performance of the model (NSE = 0.82), and the application of a monthly backward-chaining nested cross-validation revealed that the model is able to generalize on new datasets. Simulations made for the three land use scenarios suggested that ploughing up 33 % of grasslands would not significantly increase sediment discharge at the water treatment plant (5 % on average). By contrast, eco-engineering and best farming practices would significantly reduce sediment discharge at the water treatment plant (in the ranges of 10 %–44 % and 24 %–61 %, respectively). The coupling of these two strategies is the most efficient since it affects the hydro-sedimentary production and transfer processes (decreasing sediment discharge from 40 % to 80 %). The coupled modelling approach developed in this study offers interesting opportunities for sediment discharge prediction at karstic springs or water treatment plants under multiple land use scenarios. It also provides robust decision-making tools for land use planning and drinking water suppliers.
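
The monthly backward-chaining nested cross-validation mentioned above can be pictured as repeatedly holding out one month and training only on the months that precede it. The helper below is a minimal sketch of such a splitter, assuming a daily table with a date column; the exact nesting and validation rules used by the authors may differ.

```python
# Illustrative sketch of a monthly backward-chaining split: each month in turn
# is held out for testing, the model being trained only on the months that
# precede it, walking backward through the record.
import numpy as np
import pandas as pd

def monthly_backward_chaining_splits(dates, min_train_months=6):
    """Yield (train_idx, test_idx) pairs, one held-out month at a time, newest first."""
    months = pd.DatetimeIndex(dates).to_period("M")
    ordered = months.unique().sort_values()
    for test_month in ordered[::-1]:
        earlier = ordered[ordered < test_month]
        if len(earlier) < min_train_months:
            break                                  # not enough history left to train on
        train_idx = np.where(months.isin(earlier))[0]
        test_idx = np.where(months == test_month)[0]
        yield train_idx, test_idx

# Assumed file and column names, for illustration only.
df = pd.read_csv("radicatel_daily.csv", parse_dates=["date"])
for train_idx, test_idx in monthly_backward_chaining_splits(df["date"]):
    # Fit the neural network on rows train_idx, score it on rows test_idx
    # (e.g. with NSE), then aggregate the per-month scores.
    pass
```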


2020
Author(s):
Edouard Patault
Valentin Landemaine
Jérôme Ledun
Arnaud Soulignac
Matthieu Fournier
...  

Sediment discharge (SD) at karstic springs can be treated as a black box, owing to the non-linearity of the processes generating SD and the lack of an accurate physical description of karstic environments. Recent research in hydrology has emphasized the use of data-driven techniques such as deep learning (DL) for black-box models, considering their good predictive power rather than their explanatory abilities, and their integration into traditional hydrology-related workflows is particularly promising. In this study, a deep neural network was built and coupled to an erosion-runoff GIS model (WATERSED, Landemaine et al., 2015) to predict SD at a karstic spring. The study site is located in the Radicatel catchment (88 km² in Normandy, France), where spring water is extracted to a water treatment plant (WTP). SD was predicted for several design storm projects (DSP0.5, DSP2, DSP10, DSP50 and DSP100) under different land-use scenarios by 2050 (baseline, ploughing up 33 % of grassland, eco-engineering (181 fascines + 13 ha of grass strips), and best farming practices (+20 % infiltration)). Rainfall time series retrieved from the French SAFRAN database and WATERSED modelling outputs extracted at connected sinkholes were used as input data for the DL model. The model structure was found by a classical trial-and-error procedure, and the model was trained on two significant hydrologic years (731 events). Evaluation on a test set suggested good performance of the model (NSE = 0.82). Additional evaluation was performed by comparing predictions with the generalized extreme value (GEV) distribution for the five DSP under the baseline scenario. The SD predicted by the DL model was in excellent agreement with the GEV distribution (R² = 0.99). Application of the model to the other scenarios suggests that ploughing up 33 % of grasslands will increase SD at the WTP by 5 % on average. Eco-engineering and best farming practices will reduce SD in the ranges of 10 %–44 % and 63 %–80 %, respectively. This novel approach offers good opportunities for SD prediction at karstic springs or WTPs under multiple land use scenarios. It also provides robust decision-making tools for land-use planning and drinking water suppliers.
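
The GEV cross-check reported here (R² = 0.99 for the design storms under the baseline scenario) can be sketched as fitting a generalized extreme value distribution to baseline sediment-discharge maxima, reading off the return levels for the design-storm return periods, and correlating them with the DL predictions. The helper below is an illustrative SciPy sketch using an annual-maxima formulation and the 2-year and longer storms only (the 0.5-year event would need a partial-duration series); it is not the authors' exact procedure.

```python
# Sketch of the GEV comparison: fit a GEV to baseline sediment-discharge
# maxima, compute return levels for the given return periods, and return the
# R² against the deep-learning predictions for the same design storms.
import numpy as np
from scipy.stats import genextreme

def gev_vs_dl_r2(baseline_sd_maxima, dl_predicted_sd, return_periods=(2, 10, 50, 100)):
    """R² between GEV return levels (years) and DL-predicted SD for the same storms."""
    shape, loc, scale = genextreme.fit(np.asarray(baseline_sd_maxima, dtype=float))
    T = np.asarray(return_periods, dtype=float)
    gev_levels = genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
    r = np.corrcoef(gev_levels, np.asarray(dl_predicted_sd, dtype=float))[0, 1]
    return r ** 2

# Usage (hypothetical inputs): gev_vs_dl_r2(baseline_maxima, dl_predictions)
# where baseline_maxima are observed/simulated annual SD maxima and
# dl_predictions are the network's SD values for the 2-100 year design storms.
```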


2013
Vol 3 (4)
pp. 549-556
Author(s):
Kaveh Sookhak Lari
Morteza Kargar

High-rate lamella settlers in the clarifiers and triple-media filters have been implemented at the Isfahan water treatment plant (known as 'Baba-Sheikh-Ali') in Iran in recent years to upgrade the existing clarification/filtration processes. The applied technologies are mainly used to reduce finished-water turbidity, the primary regional criterion for water quality. However, both technologies have faced operational limitations since they were commissioned, owing to the existing layout of the process units and the available materials. The current study focuses on the performance of this restricted application of the two technologies with respect to turbidity removal. Online turbidity measurements from a two-year field observation (starting in March 2010) are used. In particular, the results show a more promising and longer-lasting effect on turbidity removal from tripling the filter media than from applying the lamella settlers in the clarifiers. The reasons for these observations are discussed.
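
For readers who want to reproduce this kind of comparison from online turbidity logs, a minimal sketch is given below: stage-wise removal efficiencies are computed for the clarifiers and the filters and summarised as monthly medians over the record. The file and column names are assumptions; the study's own analysis is not reproduced.

```python
# Illustrative only: compare turbidity removal across the clarifier (lamella
# settlers) and the triple-media filters from online turbidity logs.
import pandas as pd

# Assumed columns: timestamp, raw_ntu (plant inlet), settled_ntu (clarifier
# outlet) and filtered_ntu (filter outlet).
df = pd.read_csv("babasheikhali_turbidity.csv", parse_dates=["timestamp"])

df["clarifier_removal_pct"] = 100 * (df["raw_ntu"] - df["settled_ntu"]) / df["raw_ntu"]
df["filter_removal_pct"] = 100 * (df["settled_ntu"] - df["filtered_ntu"]) / df["settled_ntu"]

# Monthly medians make the long-term effect over the two-year record visible.
monthly = df.set_index("timestamp")[["clarifier_removal_pct", "filter_removal_pct"]]
print(monthly.resample("M").median())
```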


2000
Vol 42 (3-4)
pp. 403-408
Author(s):
R.-F. Yu
S.-F. Kang
S.-L. Liaw
M.-C. Chen

Coagulant dosing is one of the major operating costs in a water treatment plant, and conventional control of this process in most plants is generally determined by the jar test. However, this method only provides periodic information and is difficult to apply to automatic control. This paper examines the feasibility of applying an artificial neural network (ANN) to automatically control coagulant dosing in a water treatment plant. Five on-line monitoring variables were used to build the coagulant dosing prediction model: raw-water turbidity (NTUin), pH (pHin) and conductivity (Conin), the effluent turbidity of the settling tank (NTUout), and the alum dosage (Dos). Three approaches, namely a regression model, a time series model and ANN models, were used to predict the alum dosage. According to the results of this study, the regression model predicted coagulant dosage poorly, whereas both the time series and ANN models gave precise dosage predictions. The ANN model with the ahead coagulant dosage gave the best prediction of alum dosage, with an R² of 0.97 (RMS = 0.016); a very low average prediction error of 0.75 mg/L of alum was also found for the ANN model. Consequently, the application of an ANN model to control coagulant dosing is feasible in water treatment.
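
A minimal sketch of this kind of ANN dosing model is given below, assuming a log file with the five monitoring variables and using the preceding alum dose as the 'ahead' input; the network size, the lagging choice and the column names are illustrative assumptions, not the authors' setup.

```python
# Sketch of an ANN coagulant-dosing model: raw-water turbidity, pH and
# conductivity, settled-water turbidity and the preceding alum dose as inputs,
# the current alum dose as the target.
import pandas as pd
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

df = pd.read_csv("coagulation_log.csv")          # assumed file and columns
df["dos_prev"] = df["dos"].shift(1)              # assumed "ahead" dosage input
df = df.dropna()

X = df[["ntu_in", "ph_in", "con_in", "ntu_out", "dos_prev"]]
y = df["dos"]
X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)

ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
ann.fit(X_train, y_train)
pred = ann.predict(X_test)
print("R2:", r2_score(y_test, pred))
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
```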


2019
Vol 100
pp. 00019
Author(s):  
Renata Gmurkowska

During water treatment a large amount of sludge is created, in the form of both sewage and sediments. The largest amounts of sludge are produced during coagulation, ozonation and the backwashing of rapid filters. The quality and quantity of the treated water and the type and dose of the coagulants used are the factors affecting the quantity, composition and properties of the sludge. Sludge produced during the treatment of drinking water is an important problem, and its quantity has been increasing. The study focuses on the characteristics of water treatment sludge from four water treatment plants in Cracow and includes a theoretical and an experimental part. The first part is based on an analysis of the literature and of information obtained from MPWiK [3]. The second, experimental part concerns the analysis of dry matter, organic dry matter, capillary suction time and visual parameters: the color and consistency of the sludge. The results show that every sludge contains organic matter. The highest concentrations of organic compounds and the largest diversity were observed in the sludge collected at the Water Treatment Plant Raba, reaching up to 70 % of organic compounds in the dry mass of the sludge.
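
The dry-matter and organic-dry-matter figures reported here follow standard gravimetric bookkeeping (drying and loss on ignition). The short sketch below shows the calculation only; the masses in the usage example are hypothetical placeholders, not measured data.

```python
# Gravimetric bookkeeping behind dry matter and organic dry matter.
def dry_matter_pct(wet_mass_g: float, dry_mass_g: float) -> float:
    """Dry matter as a percentage of the wet sludge mass (after drying at 105 degC)."""
    return 100.0 * dry_mass_g / wet_mass_g

def organic_dry_matter_pct(dry_mass_g: float, ash_mass_g: float) -> float:
    """Organic fraction of the dry matter, from loss on ignition (550 degC)."""
    return 100.0 * (dry_mass_g - ash_mass_g) / dry_mass_g

# Hypothetical example values, only to show the calculation:
print(dry_matter_pct(wet_mass_g=50.0, dry_mass_g=1.5))         # % dry matter
print(organic_dry_matter_pct(dry_mass_g=1.5, ash_mass_g=0.6))  # % organic in dry matter
```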


2019
Vol 24 (1)
pp. 135-163
Author(s):
Jader Martínez Girón
Jenny Vanessa Marín-Rivera
Mauricio Quintero-Angel

Population growth and urbanization place greater pressure on drinking water treatment. In addition, treatment units such as decanters and filters accumulate high concentrations of iron (Fe) and manganese (Mn), which in many cases can be discharged into the environment without any treatment when maintenance is performed. This paper therefore evaluates the effectiveness of vertical subsurface wetlands for Fe and Mn removal from the wastewater of drinking water treatment plants, taking as an example a pilot-scale wetland with an ascending gravel bed planted with two species, C. esculenta and P. australis, in El Hormiguero (Cali, Colombia). The pilot system had three upflow vertical wetlands, two of them planted and the third left unplanted as a control. The wetlands were arranged in parallel, each formed by three gravel beds of different diameters. The results showed no significant difference between the three wetlands in the removal percentages for turbidity (98 %), Fe (90 %), dissolved Fe (97 %) and Mn (98 %). Dissolved oxygen showed a significant difference between the planted wetlands and the control. C. esculenta had the highest Fe concentration in the roots, (103.5 ± 20.8) µg/g, while P. australis had the highest average Fe concentrations in the leaves and stem, (45.7 ± 24) µg/g and (41.4 ± 9.1) µg/g, respectively. It is concluded that subsurface wetlands can be an interesting alternative for treating the wastewater generated during the maintenance of drinking water treatment plants. However, more research is needed on the use of vegetation or other technologies to remove or reduce the pollutant load in the wetlands, since each drinking water treatment plant will require a treatment system for the wastewater it generates.
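
A minimal sketch of the removal-efficiency comparison behind these results is given below: per-sample removal percentages for each wetland and a Kruskal-Wallis test for differences between the planted wetlands and the control. The file layout, column names and the choice of test are assumptions, not necessarily the authors' protocol.

```python
# Illustrative comparison of removal efficiency across the three wetlands.
import pandas as pd
from scipy.stats import kruskal

# Assumed columns: wetland ("C_esculenta", "P_australis", "control"),
# parameter ("Fe", "dissolved_Fe", "Mn", "turbidity"), inflow, outflow.
df = pd.read_csv("wetland_monitoring.csv")
df["removal_pct"] = 100 * (df["inflow"] - df["outflow"]) / df["inflow"]

for param, sub in df.groupby("parameter"):
    groups = [g["removal_pct"].to_numpy() for _, g in sub.groupby("wetland")]
    stat, p = kruskal(*groups)   # non-parametric test for a difference between wetlands
    print(f"{param}: median removal {sub['removal_pct'].median():.1f} %, p = {p:.3f}")
```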

