Potential of historical meteorological and hydrological data for the reconstruction of historical flood events – the example of the 1882 flood in southwest Germany

2009
Vol 9 (1)
pp. 175-183
Author(s):
J. Seidel
F. Imbery
P. Dostal
D. Sudhaus
K. Bürger

Abstract. This paper presents a hydrometeorological reconstruction of the flood-triggering meteorological situation and a simulation of the discharges of the flood event of December 1882 in the Neckar catchment in Baden-Württemberg (southwest Germany). The course of the flood event and the weather conditions that led to it were reconstructed by evaluating information from various historical sources. From these historical data, daily input data sets were derived for run-off modeling. To determine the precipitation pattern at the end of December 1882, the sparse historical data were supplemented using a similar modern-day precipitation pattern with a higher station density. The results of this run-off simulation are compared with contemporary historical data and with 1-D hydraulic simulations using the HEC-RAS model.
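As an illustration of the pattern-transfer step described above, the following sketch (hypothetical station values and indices, not the authors' code) rescales a dense modern precipitation field so that it reproduces the totals observed at the sparse historical gauges, preserving the modern spatial pattern while honouring the 1882 observations:

```python
import numpy as np

def transfer_pattern(modern_field, historical_obs, gauge_idx):
    """Rescale a dense modern precipitation field (mm) so that it matches
    the sparse historical gauge totals at the shared station locations."""
    # One bulk scaling factor keeps the modern spatial pattern intact
    scale = historical_obs.sum() / modern_field[gauge_idx].sum()
    return modern_field * scale

# Hypothetical example: 10 modern stations, 3 historical gauges
modern = np.array([12., 15., 9., 20., 18., 7., 11., 14., 16., 10.])
hist = np.array([18., 30., 24.])            # historical daily totals (mm)
idx = np.array([0, 3, 8])                   # gauge positions in the modern set
print(transfer_pattern(modern, hist, idx))  # daily input field for run-off model
```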

2019
Author(s):
Anouk Bomers
Ralph Schielen
Suzanne Hulscher

Abstract. Flood frequency curves are usually highly uncertain since they are based on short data sets of measured discharges or weather conditions. To decrease the confidence intervals, an efficient bootstrap method is developed in this study. The Rhine river delta is considered as a case study. A hydraulic model is used to normalize historic flood events for anthropogenic and natural changes in the river system. As a result, the data set of measured discharges could be extended by approximately 600 years. The study shows that historic flood events decrease the confidence interval of the flood frequency curve significantly, specifically in the range of large floods. This even applies if the maximum discharges of these historic flood events are highly uncertain themselves.


2019
Vol 19 (8)
pp. 1895-1908
Author(s):
Anouk Bomers
Ralph M. J. Schielen
Suzanne J. M. H. Hulscher

Abstract. Flood frequency curves are usually highly uncertain since they are based on short data sets of measured discharges or weather conditions. To decrease the confidence intervals, an efficient bootstrap method is developed in this study. The Rhine river delta is considered as a case study. We use a hydraulic model to normalize historic flood events for anthropogenic and natural changes in the river system. As a result, the data set of measured discharges could be extended by approximately 600 years. The study shows that historic flood events decrease the confidence interval of the flood frequency curve significantly, specifically in the range of large floods. This even applies if the maximum discharges of these historic flood events are highly uncertain themselves.
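The bootstrap idea can be illustrated with a short sketch (synthetic discharges and a Gumbel fit stand in for the paper's data and method): the extended series of annual maximum discharges is resampled with replacement, a distribution is refitted to each resample, and the spread of the fitted return levels gives the confidence interval:

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(42)

# Synthetic annual maximum discharges (m^3/s); in the paper the measured
# series is extended by roughly 600 years of normalized historic floods.
ann_max = rng.gumbel(loc=8000, scale=1500, size=650)

def return_level(sample, t=1250):
    """Discharge exceeded on average once every t years under a Gumbel fit."""
    loc, scale = gumbel_r.fit(sample)
    return gumbel_r.ppf(1 - 1 / t, loc=loc, scale=scale)

# Bootstrap: resample with replacement, refit, collect return levels
levels = [return_level(rng.choice(ann_max, size=ann_max.size, replace=True))
          for _ in range(1000)]
lo, hi = np.percentile(levels, [2.5, 97.5])
print(f"1250-year return level, 95% CI: [{lo:.0f}, {hi:.0f}] m^3/s")
```

A longer effective record tightens the percentile interval roughly in proportion to the added sample size, which is why the normalized historic floods help even when individually uncertain.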


Author(s):  
Daniel Green
Dapeng Yu
Ian Pattison
Robert Wilby
Lee Bosher
...  

Abstract. Emergency responders often have to operate and respond to emergency situations during dynamic weather conditions, including floods. This paper demonstrates a novel method using existing tools and datasets to evaluate emergency responder accessibility during flood events within the City of Leicester, UK. Accessibility was quantified using the 8- and 10-minute legislative targets for emergency provision for the Ambulance and Fire & Rescue services respectively under "normal" no-flood conditions, as well as flood scenarios of various magnitudes (1 in 20-year, 1 in 100-year and 1 in 1,000-year recurrence intervals), with both surface water and fluvial flood conditions considered. Flood restrictions were derived from previous hydrodynamic inundation modelling and input into a Network Analysis framework as restrictions for surface water and fluvial flood events. Surface water flooding was shown to cause more disruption to emergency responders operating within the city due to its widespread and spatially distributed footprint when compared to fluvial flood events of comparable magnitude. Fire & Rescue 10-minute accessibility was shown to decrease from 100 % under no-flood conditions to 66.5 %, 39.8 % and 26.2 % under the 1 in 20-year, 1 in 100-year and 1 in 1,000-year surface water flood scenarios respectively. Furthermore, total inaccessibility was shown to increase with flood magnitude, from 6.0 % under the 1 in 20-year scenario to 31.0 % under the 1 in 100-year surface water flooding scenario. Further, the evolution of emergency service accessibility through a surface water flood event is outlined, demonstrating the rapid onset of impacts on emergency service accessibility within the first 15 minutes of the surface water flood event, with a reduction in service coverage and overlap being witnessed for the Ambulance service under a 1 in 100-year flood event. The study provides evidence to guide strategic planning for decision makers prior to and during emergency response to flood events at the city scale. It also provides a readily transferable method to explore the impacts of natural hazards or disruptions on additional cities or regions based on historic, scenario-based events or real-time forecasting, if such data are available.


2017
Vol 17 (1)
pp. 1-16
Author(s):
Daniel Green
Dapeng Yu
Ian Pattison
Robert Wilby
Lee Bosher
...  

Abstract. Emergency responders often have to operate and respond to emergency situations during dynamic weather conditions, including floods. This paper demonstrates a novel method using existing tools and datasets to evaluate emergency responder accessibility during flood events within the city of Leicester, UK. Accessibility was quantified using the 8 and 10 min legislative targets for emergency provision for the ambulance and fire and rescue services respectively under "normal" no-flood conditions, as well as flood scenarios of various magnitudes (1 in 20-year, 1 in 100-year and 1 in 1000-year recurrence intervals), with both surface water and fluvial flood conditions considered. Flood restrictions were processed based on previous hydrodynamic inundation modelling undertaken and inputted into a Network Analysis framework as restrictions for surface water and fluvial flood events. Surface water flooding was shown to cause more disruption to emergency responders operating within the city due to its widespread and spatially distributed footprint when compared to fluvial flood events of comparable magnitude. Fire and rescue 10 min accessibility was shown to decrease from 100 % under no-flood conditions to 66.5, 39.8 and 26.2 % under the 1 in 20-year, 1 in 100-year and 1 in 1000-year surface water flood scenarios respectively. Furthermore, total inaccessibility was shown to increase with flood magnitude from 6.0 % under the 1 in 20-year scenario to 31.0 % under the 1 in 100-year flood scenario. Additionally, the evolution of emergency service accessibility throughout a surface water flood event is outlined, demonstrating the rapid impact on emergency service accessibility within the first 15 min of the surface water flood event, with a reduction in service coverage and overlap being observed for the ambulance service during a 1 in 100-year flood event. The study provides evidence to guide strategic planning for decision makers prior to and during emergency response to flood events at the city scale. It also provides a readily transferable method for exploring the impacts of natural hazards or disruptions in other cities or regions based on historic, scenario-based events or real-time forecasting, if such data are available.
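The core of the accessibility computation can be sketched generically (the study uses a GIS Network Analysis framework; a toy graph and networkx stand in here): edges inside the flood footprint are removed, and each junction is tested against the legislative response-time target:

```python
import networkx as nx

# Toy road network: nodes are junctions, 'time' is travel time in minutes
G = nx.Graph()
G.add_weighted_edges_from([
    (0, 1, 3.0), (1, 2, 4.0), (2, 3, 5.0),
    (0, 4, 2.0), (4, 3, 6.0), (1, 4, 1.5),
], weight="time")

def coverage(G, station, target=10.0, blocked=()):
    """Fraction of junctions reachable from the station within the target."""
    H = G.copy()
    H.remove_edges_from(blocked)  # flood restriction from inundation modelling
    times = nx.single_source_dijkstra_path_length(H, station, weight="time")
    return sum(1 for t in times.values() if t <= target) / G.number_of_nodes()

station = 0                  # fire and rescue station node
flooded = [(1, 2), (4, 3)]   # edges inside the modelled flood footprint
print("no flood:", coverage(G, station))   # -> 1.0
print("flooded :", coverage(G, station, blocked=flooded))  # -> 0.6
```

Repeating this for inundation extents at successive time steps gives the kind of within-event accessibility evolution the paper reports.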


Energies
2018
Vol 11 (8)
pp. 2007
Author(s):
Hassan Nemati
A. Laso
M. Manana
Anita Sant'Anna
Sławomir Nowaczyk

The maximum current that an overhead transmission line can continuously carry depends on external weather conditions, most commonly obtained from real-time streaming weather sensors. The accuracy of the sensor data is very important in order to avoid problems such as overheating. Furthermore, faulty sensor readings may cause operators to limit or even stop the energy production from renewable sources in radial networks. This paper presents a method for detecting and replacing sequences of consecutive faulty data originating from streaming weather sensors. The method is based on a combination of (a) a set of constraints obtained from derivatives of consecutive data, and (b) association rules that are automatically generated from historical data. In smart grids, a large amount of historical data from different weather stations is available but rarely used. In this work, we show that mining and analyzing these historical data provides valuable information that can be used for detecting and replacing faulty sensor readings. We compare the results of the proposed method against exponentially weighted moving average and vector autoregression models. Experiments on data sets with real and synthetic errors demonstrate the good performance of the proposed method for monitoring weather sensors.
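A minimal sketch of the derivative-constraint part of such a scheme (the mined association rules are omitted, and the threshold is illustrative): a reading whose step change exceeds a plausibility bound is flagged as faulty and replaced with an exponentially weighted moving average, the simplest of the baselines the paper compares against:

```python
import numpy as np

def clean_series(x, max_step=5.0, alpha=0.3):
    """Flag readings whose first derivative violates a plausibility bound
    and replace them with an exponentially weighted moving average (EWMA).

    x        : 1-D array of raw sensor readings (e.g. ambient temperature)
    max_step : largest physically plausible change between samples
    alpha    : EWMA smoothing factor
    """
    x = np.asarray(x, dtype=float)
    cleaned, ewma = x.copy(), x[0]
    for i in range(1, len(x)):
        if abs(x[i] - cleaned[i - 1]) > max_step:  # derivative constraint
            cleaned[i] = ewma                      # replace faulty reading
        ewma = alpha * cleaned[i] + (1 - alpha) * ewma
    return cleaned

raw = np.array([20.1, 20.3, 20.2, 35.0, 34.8, 20.5, 20.6])  # spike sequence
print(clean_series(raw))  # the two-sample spike is replaced by smoothed values
```

Because the check compares against the already-cleaned previous sample, a whole run of consecutive faulty readings is caught, not just the first one.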


Author(s):  
Rémi Cura
Bertrand Duménieu
Nathalie Abadie
Benoît Costes
Julien Perret
...  

The latest developments in digital humanities have increasingly enabled the construction of large data sets which can easily be accessed and used. These data sets often contain indirect localisation information, such as historical addresses. Historical geocoding is the process of transforming this indirect localisation information into direct localisation that can be placed on a map, which enables spatial analysis and cross-referencing. Many efficient geocoders exist for current addresses, but they do not deal with temporal information and are usually based on a strict hierarchy (country, city, street, house number, etc.) that is hard, if not impossible, to use with historical data. Indeed, historical data are full of uncertainties (temporal, textual, positional accuracy, confidence in historical sources) that cannot be ignored or entirely resolved. We propose an open source, open data, extensible solution for geocoding that is based on gazetteers composed of geohistorical objects extracted from historical topographical maps. Once the gazetteers are available, geocoding a historical address is a matter of finding the geohistorical object in the gazetteers that best matches the historical address searched by the user. The matching criteria are customisable and include several dimensions (fuzzy string, fuzzy temporal, level of detail, positional accuracy). As the goal is to facilitate historical work, we also propose web-based user interfaces that help geocode addresses (individually or in batch mode) and display the results over current or historical topographical maps, so that geocoding results can be checked and collaboratively edited. The system has been tested on the city of Paris, France, for the 19th and the 20th centuries. It shows high response rates and is fast enough to be used interactively.
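The multi-dimensional matching might be scored roughly as in the sketch below (function names, weights and the overlap measure are illustrative, not the project's actual API): each gazetteer candidate receives a combined fuzzy-string and fuzzy-temporal score, and the best-scoring geohistorical object is returned:

```python
from difflib import SequenceMatcher

def match_score(query, query_dates, candidate, cand_dates,
                w_text=0.7, w_time=0.3):
    """Combined fuzzy score between a historical address query and a
    gazetteer entry; weights and the overlap measure are illustrative."""
    text = SequenceMatcher(None, query.lower(), candidate.lower()).ratio()
    # Fuzzy temporal overlap: shared years / span of the query interval
    q0, q1 = query_dates
    c0, c1 = cand_dates
    overlap = max(0, min(q1, c1) - max(q0, c0)) / max(1, q1 - q0)
    return w_text * text + w_time * overlap

gazetteer = [("Rue de Rivoli", (1802, 1900)),
             ("Rue de Rivolli", (1850, 1870)),     # misspelled variant
             ("Rue Saint-Antoine", (1700, 1900))]
query, dates = "rue de rivoli", (1860, 1865)
best = max(gazetteer, key=lambda g: match_score(query, dates, *g))
print(best)  # -> ('Rue de Rivoli', (1802, 1900))
```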


Author(s):  
Douglas L. Dorset

The quantitative use of electron diffraction intensity data for the determination of crystal structures represents the pioneering achievement in the electron crystallography of organic molecules, an effort largely begun by B. K. Vainshtein and his co-workers. However, despite numerous representative structure analyses yielding results consistent with X-ray determinations, this entire effort was viewed with considerable mistrust by many crystallographers. This was no doubt due to the rather high crystallographic R-factors reported for some structures and, more importantly, the failure to convince many skeptics that the measured intensity data were adequate for ab initio structure determinations.

We have recently demonstrated the utility of these data sets for structure analyses by direct phase determination based on the probabilistic estimate of three- and four-phase structure invariant sums. Examples include the structure of diketopiperazine using Vainshtein's 3D data, a similar 3D analysis of the room-temperature structure of thiourea, and a zonal determination of the urea structure, the latter also based on data collected by the Moscow group.
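For readers unfamiliar with direct methods, the three-phase structure invariant mentioned above and its probabilistic estimate are conventionally written as follows (the standard Cochran formula from the general crystallographic literature, not quoted from this paper):

```latex
% Triplet invariant and Cochran's probability distribution
\Phi_3 = \varphi_{\mathbf{h}} + \varphi_{\mathbf{k}} + \varphi_{-\mathbf{h}-\mathbf{k}},
\qquad
P(\Phi_3) = \frac{\exp\!\left(\kappa \cos \Phi_3\right)}{2\pi\, I_0(\kappa)},
\qquad
\kappa = \frac{2}{\sqrt{N}}\,\left| E_{\mathbf{h}} E_{\mathbf{k}} E_{-\mathbf{h}-\mathbf{k}} \right|
```

where the E's are normalized structure factors and N is the number of (equal) atoms in the unit cell; large κ concentrates the distribution near Φ₃ = 0, which is what makes strong triplets usable for ab initio phasing.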


2021
Vol 13 (13)
pp. 2433
Author(s):
Shu Yang
Fengchao Peng
Sibylle von Löwis
Guðrún Nína Petersen
David Christian Finger

Doppler lidars are used worldwide for wind monitoring and, more recently, also for the detection of aerosols. Automatic algorithms that classify the signals retrieved from lidar measurements are very useful for end-users. In this study, we explore the value of machine learning for classifying backscattered signals from Doppler lidars using data from Iceland. We combined supervised and unsupervised machine learning algorithms with conventional lidar data processing methods and trained two models to filter noise signals and classify Doppler lidar observations into different classes, including clouds, aerosols and rain. The results reveal a high accuracy for noise identification and for aerosol and cloud classification; precipitation, however, is underestimated. The method was tested on data sets from two instruments during different weather conditions, including three dust storms during the summer of 2019. Our results reveal that this method can provide an efficient, accurate and real-time classification of lidar measurements. Accordingly, we conclude that machine learning can open new opportunities for lidar data end-users, such as aviation safety operators, to monitor dust in the vicinity of airports.
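A schematic of such a two-stage pipeline, with synthetic features and labels (the paper's actual features, models and classes are not reproduced here): an unsupervised step separates noise gates from real returns, then a supervised classifier labels the remaining signals:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical per-range-gate features: SNR (dB), log10(backscatter), spectral width
noise = rng.normal([-5.0, -7.0, 2.0], 0.5, size=(500, 3))
signal = rng.normal([10.0, -5.0, 0.5], 0.5, size=(500, 3))
X = np.vstack([noise, signal])

# Stage 1 (unsupervised): separate noise gates from real returns
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
signal_cluster = km.labels_[-1]          # cluster containing a known signal gate
is_signal = km.labels_ == signal_cluster

# Stage 2 (supervised): classify remaining gates; labels here are synthetic
y = rng.choice(["cloud", "aerosol", "rain"], size=is_signal.sum())
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[is_signal], y)
print(clf.predict(X[is_signal][:5]))     # e.g. ['aerosol' 'cloud' ...]
```

In practice the supervised labels would come from manually annotated cases (for example, the 2019 dust storms), not from random assignment as in this toy example.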


2021
Vol 7 (s2)
Author(s):  
Alexander Bergs

Abstract This paper focuses on the micro-analysis of historical data, which allows us to investigate language use across the lifetime of individual speakers. Certain concepts, such as social network analysis or communities of practice, put individual speakers and their social embeddedness and dynamicity at the center of attention. This means that intra-speaker variation can be described and analyzed in quite some detail in certain historical data sets. The paper presents some exemplary empirical analyses of the diachronic linguistic behavior of individual speakers/writers in fifteenth- to seventeenth-century England. It discusses the social factors that influence this behavior, with an emphasis on the methodological and theoretical challenges and opportunities when investigating intra-speaker variation and change.


2021
Vol 2021 (2)
Author(s):
DianYu Liu
ChuanLe Sun
Jun Gao

Abstract The possible non-standard interactions (NSIs) of neutrinos with matter play an important role in the global determination of neutrino properties. In our study we select various data sets from LHC measurements at 13 TeV with integrated luminosities of 35–139 fb−1, including production of a single jet, photon, W/Z boson, or charged lepton accompanied by large missing transverse momentum. We derive constraints on neutral-current NSIs with quarks imposed by different data sets in a framework of either effective operators or simplified Z′ models. We use theoretical predictions for NSI-induced production at next-to-leading order in QCD matched with parton showering, which stabilizes the theory predictions and results in more robust constraints. In a simplified Z′ model we obtain 95% CLs upper limits on the conventional NSI strength ϵ of 0.042 and 0.0028 for Z′ masses of 0.2 and 2 TeV respectively. We also discuss possible improvements from future runs of the LHC with higher luminosities.
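For context, the NSI strength ϵ quoted above is conventionally defined through an effective four-fermion operator normalized to the Fermi constant (standard notation in the NSI literature; the paper's exact conventions may differ):

```latex
% Effective neutral-current NSI Lagrangian (standard convention)
\mathcal{L}_{\text{NSI}} =
  -2\sqrt{2}\, G_F\, \epsilon_{\alpha\beta}^{qC}
  \left( \bar{\nu}_{\alpha} \gamma^{\mu} P_L \nu_{\beta} \right)
  \left( \bar{q}\, \gamma_{\mu} P_C\, q \right)
```

where α and β label neutrino flavours, q is a quark field, and P_C ∈ {P_L, P_R} is a chirality projector; ϵ thus measures the new interaction relative to the weak-interaction strength, which is what allows limits on a heavy mediator to be translated into bounds on ϵ.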

