AN EFFICIENT ND-POINT DATA STRUCTURE FOR QUERYING FLOOD RISK

Author(s):  
H. Liu ◽  
P. Van Oosterom ◽  
B. Mao ◽  
M. Meijers ◽  
R. Thompson

Abstract. Governments use flood maps for city planning and disaster management to protect people and assets. Flood risk mapping projects carried out for these purposes generate a huge amount of modelling results. Previously, the data submitted were highly condensed products such as typical flood inundation maps and tables for loss analysis; the original modelling results, which record critical flood evolution processes, were overlooked because their management and analysis are cumbersome. This has clear drawbacks: the ‘static’ maps impart few details about the flood, and the data fail to address new requirements, which significantly confines the use of flood maps. The recent development of point cloud databases provides an opportunity to manage the whole set of modelling results, and such databases can efficiently support all kinds of flood risk queries at finer scales. Using a case study from China, this paper demonstrates how a novel nD-PointCloud structure, HistSFC, improves flood risk querying. The results indicate that, compared with conventional database solutions, HistSFC offers superior performance and better scalability. In addition, specific optimizations of HistSFC can accelerate querying further. All this points to a promising solution for the next generation of flood maps.
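HistSFC clusters nD points along a space-filling curve so that attribute range queries map to contiguous key ranges. The sketch below is only a generic illustration of that idea, using a simple Morton (Z-order) encoding of hypothetical (x, y, time, depth) records and an in-memory sorted list in place of HistSFC's actual curve, histogram optimizations, and database backend.

```python
# Generic illustration of space-filling-curve indexing for nD flood points.
# The Morton (Z-order) keys and the in-memory "index" are stand-ins for
# HistSFC's actual encoding and storage (assumptions, not its API).

def interleave_bits(coords, bits=16):
    """Interleave the bits of non-negative integer coordinates into one key."""
    key = 0
    for b in range(bits):
        for d, c in enumerate(coords):
            key |= ((c >> b) & 1) << (b * len(coords) + d)
    return key

def encode_point(x, y, t, depth, scale=100):
    """Discretize an (x, y, time, water-depth) record and encode it."""
    return interleave_bits([int(x * scale), int(y * scale),
                            int(t), int(depth * scale)])

# Hypothetical modelling results: (x, y, time step, water depth in metres).
points = [(10.0, 20.0, 5, 0.8), (10.5, 20.2, 6, 1.2), (50.0, 60.0, 5, 0.3)]
index = sorted((encode_point(*p), p) for p in points)  # stand-in for a B-tree

def range_query(index, xmin, xmax, dmin, dmax):
    """Naive decode-and-filter query; a real system prunes SFC key ranges."""
    return [p for _, p in index
            if xmin <= p[0] <= xmax and dmin <= p[3] <= dmax]

print(range_query(index, 10.0, 11.0, 0.5, 2.0))  # -> the two points near x=10
```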

10.29007/l6jd ◽  
2018 ◽  
Author(s):  
Laurent Guillaume Courty ◽  
Jose Agustín Breña-Naranjo ◽  
Adrián Pedrozo-Acuña

We present a flood risk mapping framework created in the context of the update of the Mexican flood risk atlas. This framework is based on a nation-wide GIS database of map time-series. Those maps are used as forcing for a deterministic, raster-based numerical model. For each catchment of interest, the model retrieves the data from the GIS and performs the computation on the specified area. The results are written directly back into the GIS database, which facilitates their post-processing. This methodology allows 1) the generation of flood risk maps for cities located across the national territory without excessive effort in the pre- and post-processing of information, and 2) a very efficient process for creating new flood maps for urban areas that were not included in the original batch.
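As a rough illustration of the retrieve-compute-write-back loop described above, the sketch below iterates over catchments with placeholder functions; the function names and catchment codes are hypothetical, not the atlas tooling's actual API, and a trivial cumulative-rainfall array stands in for the deterministic raster model.

```python
# Hedged sketch of the catchment-by-catchment workflow described above.
# load_forcing, run_inundation_model and save_results are hypothetical
# placeholders, not the framework's real interface.

import numpy as np

def load_forcing(catchment_id):
    """Placeholder: pull rainfall time-series rasters for one catchment
    from the nation-wide GIS database."""
    return np.random.rand(24, 100, 100)   # hours x rows x cols (synthetic)

def run_inundation_model(rain):
    """Placeholder raster 'model': cumulative rainfall standing in for a
    deterministic 2D inundation computation."""
    return rain.sum(axis=0)

def save_results(catchment_id, depth):
    """Placeholder: write the resulting depth raster back to the GIS
    database so post-processing and map production can reuse it."""
    print(f"catchment {catchment_id}: max depth {depth.max():.2f}")

for catchment_id in ["RH26-A", "RH26-B"]:  # hypothetical catchment codes
    save_results(catchment_id, run_inundation_model(load_forcing(catchment_id)))
```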


Water ◽  
2021 ◽  
Vol 13 (5) ◽  
pp. 666
Author(s):  
Mahkameh Zarekarizi ◽  
K. Joel Roop-Eckart ◽  
Sanjib Sharma ◽  
Klaus Keller

Understanding flood probabilities is essential to making sound decisions about flood-risk management. Many people rely on flood probability maps to inform decisions about purchasing flood insurance, buying or selling real estate, flood-proofing a house, or managing floodplain development. Current flood probability maps typically use flood zones (for example, the 1-in-100 or 1-in-500-year flood zones) to communicate flooding probabilities. However, this choice of communication format can miss important details and lead to biased risk assessments. Here we develop, test, and demonstrate the FLOod Probability Interpolation Tool (FLOPIT). FLOPIT interpolates flood probabilities between water surface elevations to produce continuous flood-probability maps. FLOPIT uses water surface elevation inundation maps for at least two return periods and creates Annual Exceedance Probability (AEP) maps as well as inundation maps for new return levels. Potential advantages of FLOPIT include being open-source, relatively easy to implement, capable of creating inundation maps from agencies other than FEMA, and applicable to locations where FEMA has published flood inundation maps but not flood probabilities. Using publicly available data from the Federal Emergency Management Agency (FEMA) flood risk databases as well as state and national datasets, we produce continuous flood-probability maps at three example locations in the United States: Houston (TX), Muncy (PA), and Selinsgrove (PA). We find that the discrete flood zones generally communicate substantially lower flood probabilities than the continuous estimates.
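The core interpolation step can be illustrated with a small NumPy example: given water surface elevation (WSE) maps for two return periods and the ground elevation, estimate the AEP at which each cell starts to flood. The log-linear relation between WSE and AEP assumed here is one plausible choice, not necessarily FLOPIT's exact scheme, and the elevation values are synthetic.

```python
# Simplified AEP interpolation between two water-surface-elevation maps.
# Assumes WSE varies linearly with log(AEP); values are synthetic and the
# scheme is illustrative rather than FLOPIT's actual implementation.

import numpy as np

ground  = np.array([[11.3, 12.4], [13.0, 14.0]])   # terrain elevation (m)
wse_100 = np.array([[11.0, 12.0], [12.5, 13.0]])   # 1% AEP (100-yr) WSE (m)
wse_500 = np.array([[11.8, 12.9], [13.4, 13.8]])   # 0.2% AEP (500-yr) WSE (m)

log_aep_100, log_aep_500 = np.log(0.01), np.log(0.002)

# Fractional position of the ground between the two known surfaces.
frac = (ground - wse_100) / (wse_500 - wse_100)
log_aep = log_aep_100 + frac * (log_aep_500 - log_aep_100)

# Only interpolate where the ground lies between the two surfaces;
# cells outside that range need extrapolation or masking.
inside = (ground >= wse_100) & (ground <= wse_500)
aep = np.where(inside, np.exp(log_aep), np.nan)
print(np.round(aep, 4))   # ~0.0055 where ground sits just above the 100-yr WSE
```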


2011 ◽  
Vol 121-126 ◽  
pp. 1220-1225
Author(s):  
Guo Guang Wang ◽  
Qiao Lun Huang ◽  
Jing Ya Yuan

China is currently undergoing an unprecedented urbanization process which is accompanied by severe damage to the environment. The Cradle to Cradle Design approach has been gaining increasing interest among industries, authorities and consumers in recent years. Its compelling design principles make people believe that adopting it in village planning or city planning is a very promising solution to China's urbanization. This paper not only illustrates the features of the ecovillage and the design principles of Cradle to Cradle Design but also investigates the first high-profile Cradle to Cradle planning project in China.


Author(s):  
Nathalie Saint-Geours ◽  
Jean-Stéphane Bailly ◽  
Frédéric Grelot ◽  
Christian Lavergne

2021 ◽  
Vol 2021 ◽  
pp. 1-6
Author(s):  
Xue Chen ◽  
Yuanyuan Shi ◽  
Yanjun Wang ◽  
Yuanjuan Cheng

This paper mainly introduces the relevant content of the automatic assessment of upper limb mobility after stroke, including the relevant knowledge of the clinical assessment of upper limb mobility, the use of a Kinect sensor to track the spatial locations of upper limb skeletal points, and the construction of the GCRNN model. Through a detailed analysis of all FMA evaluation items, a dedicated experimental data acquisition environment and set of evaluation tasks were designed, and FMA prediction results using the skeletal-point data of each evaluation task were obtained. Testing different numbers and combinations of tasks, the best coefficient of determination was achieved when task 1, task 2, and task 5 were used together as input for FMA prediction. At the same time, in order to verify the superior performance of the proposed method, a comparative experiment was set up against LSTM, CNN, and other widely used deep learning algorithms. Conclusion. GCRNN was able to extract the motion features of the upper limb in both the spatial and temporal dimensions and achieved the best prediction performance, with a coefficient of determination of 0.89.
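To make the model construction concrete, the following PyTorch sketch shows one way to combine a per-frame graph convolution over skeleton joints with a recurrent layer over time and a regression head for the FMA score. The joint graph, layer sizes, and data shapes are illustrative assumptions, not the architecture reported in the paper.

```python
# Illustrative GCRNN: graph convolution over skeleton joints per frame,
# GRU over time, linear head regressing an FMA score.  Layer sizes, the
# chain-shaped joint graph, and the toy data are assumptions for this sketch.

import torch
import torch.nn as nn

class GCRNN(nn.Module):
    def __init__(self, adj, in_feats=3, hidden=32, gru_hidden=64):
        super().__init__()
        a = adj + torch.eye(adj.size(0))              # add self-loops
        d = a.sum(1).pow(-0.5)
        self.register_buffer("a_hat", d[:, None] * a * d[None, :])
        self.gcn = nn.Linear(in_feats, hidden)        # shared per-joint weights
        self.gru = nn.GRU(hidden * adj.size(0), gru_hidden, batch_first=True)
        self.head = nn.Linear(gru_hidden, 1)          # FMA score regression

    def forward(self, x):                             # x: (batch, time, joints, xyz)
        b, t, j, _ = x.shape
        h = torch.relu(self.a_hat @ self.gcn(x))      # spatial graph convolution
        h, _ = self.gru(h.reshape(b, t, -1))          # temporal modelling
        return self.head(h[:, -1]).squeeze(-1)        # predict from the last step

# Toy example: 8 upper-limb joints connected in a chain, 3D positions over time.
adj = torch.zeros(8, 8)
for i in range(7):
    adj[i, i + 1] = adj[i + 1, i] = 1.0
model = GCRNN(adj)
clips = torch.randn(4, 120, 8, 3)                     # 4 clips, 120 frames each
print(model(clips).shape)                             # -> torch.Size([4])
```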


2021 ◽  
Author(s):  
Enes Yildirim ◽  
Ibrahim Demir

Flood risk assessment contributes to identifying at-risk communities and supports mitigation decisions to maximize benefits from the investments. Large-scale risk assessments generate invaluable inputs for prioritizing regions for the distribution of limited resources. High-resolution flood maps and accurate parcel information are critical for flood risk analysis to generate reliable outcomes for planning, preparedness, and decision-making applications. Large-scale damage assessment studies in the United States often utilize the National Structure Inventory (NSI) or the HAZUS default dataset, which results in inaccurate risk estimates due to the low geospatial accuracy of these datasets. On the other hand, some studies utilize higher-resolution datasets; however, they are limited to small scales, for example a city or a Hydrologic Unit Code (HUC)-12 watershed. In this study, we collected extensive, detailed flood maps and parcel datasets for many communities in Iowa to carry out a large-scale flood risk assessment. High-resolution flood maps and the most recent parcel information were collected to ensure the accuracy of the risk products. The results indicate that Eastern Iowa communities are prone to a higher risk of direct flood losses. Our model estimates nearly $10 million in average annualized losses, particularly in large communities in the study region. The study highlights that existing risk products based on FEMA's flood risk output underestimate flood losses, specifically in highly populated urban communities such as Bettendorf, Cedar Falls, Davenport, Dubuque, and Waterloo. Additionally, we propose a flood risk score methodology for two spatial scales (e.g., HUC-12 watershed, property) to prioritize regions and properties for mitigation purposes. Lastly, the watershed-scale results are shared through a web-based platform to inform decision-makers and the public.
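For context on the kind of per-parcel calculation that produces an average annualized loss (AAL) figure, the sketch below integrates losses from an illustrative depth-damage curve over annual exceedance probability using the trapezoidal rule. The depths, curve, and structure value are made up; the study's actual damage functions, parcel data, and aggregation differ.

```python
# Hedged sketch of an average annualized loss (AAL) estimate for one parcel:
# an illustrative depth-damage curve plus trapezoidal integration of damage
# over annual exceedance probability.  All numbers are synthetic.

import numpy as np

def depth_damage(depth_m, structure_value):
    """Illustrative damage fraction vs. flood depth (piecewise linear)."""
    depths = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
    frac   = np.array([0.0, 0.15, 0.30, 0.55, 0.80])
    return structure_value * np.interp(depth_m, depths, frac)

# Flood depths at the parcel for several return periods (synthetic).
return_periods   = np.array([10, 25, 50, 100, 500])
depths_at_parcel = np.array([0.0, 0.2, 0.6, 1.1, 2.3])   # metres
structure_value  = 250_000                                # USD

aep = 1.0 / return_periods
losses = depth_damage(depths_at_parcel, structure_value)

# Integrate loss over exceedance probability (sorted by increasing AEP).
order = np.argsort(aep)
aal = np.trapz(losses[order], aep[order])
print(f"Average annualized loss: ${aal:,.0f}")
```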


2019 ◽  
Vol 20 (11) ◽  
pp. 2203-2214 ◽  
Author(s):  
Hoang Tran ◽  
Phu Nguyen ◽  
Mohammed Ombadi ◽  
Kuolin Hsu ◽  
Soroosh Sorooshian ◽  
...  

Abstract Flood mapping from satellites provides large-scale observations of flood events, but cloud obstruction in satellite optical sensors limits its practical usability. In this study, we implemented the Variational Interpolation (VI) algorithm to remove clouds from NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) Snow-Covered Area (SCA) products. The VI algorithm estimated states of cloud-hindered pixels by constructing three-dimensional space–time surfaces based on assumptions of snow persistence. The resulting cloud-free flood maps, while maintaining the temporal resolution of the original MODIS product, showed an improvement of nearly 70% in average probability of detection (POD) (from 0.29 to 0.49) when validated with flood maps derived from Landsat-8 imagery. The second part of this study utilized the cloud-free flood maps for calibration of a hydrologic model to improve simulation of flood inundation maps. The results demonstrated the utility of the cloud-free maps, as simulated inundation maps had average POD, false alarm ratio (FAR), and Hanssen–Kuipers (HK) skill score of 0.87, 0.49, and 0.84, respectively, compared to POD, FAR, and HK of 0.70, 0.61, and 0.67 when original maps were used for calibration.
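The validation metrics cited above (POD, FAR, and the Hanssen-Kuipers score) follow standard contingency-table definitions; the short sketch below computes them for a pair of synthetic binary flood maps.

```python
# Standard contingency-table metrics for binary flood maps: probability of
# detection (POD), false alarm ratio (FAR) and Hanssen-Kuipers score (HK).
# The example maps are synthetic.

import numpy as np

def contingency_scores(observed, simulated):
    """Return (POD, FAR, HK) for boolean flood/no-flood maps."""
    obs, sim = observed.astype(bool), simulated.astype(bool)
    hits = np.sum(obs & sim)
    misses = np.sum(obs & ~sim)
    false_alarms = np.sum(~obs & sim)
    correct_neg = np.sum(~obs & ~sim)
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    pofd = false_alarms / (false_alarms + correct_neg)
    return pod, far, pod - pofd   # HK = POD - probability of false detection

observed  = np.array([[1, 1, 0], [0, 1, 0]])
simulated = np.array([[1, 0, 0], [1, 1, 0]])
print(contingency_scores(observed, simulated))   # -> (0.667, 0.333, 0.333)
```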

