Using open building data in the development of exposure datasets for catastrophe risk modelling

2016 ◽  
Vol 16 (2) ◽  
pp. 417-429 ◽  
Author(s):  
R. Figueiredo ◽  
M. Martina

Abstract. One of the necessary components to perform catastrophe risk modelling is information on the buildings at risk, such as their spatial location, geometry, height, occupancy type and other characteristics. This is commonly referred to as the exposure model or data set. When modelling large areas, developing exposure data sets with the relevant information about every individual building is not practicable. Thus, census data at coarse spatial resolutions are often used as the starting point for the creation of such data sets, after which disaggregation to finer resolutions is carried out using different methods, based on proxies such as the population distribution. While these methods can produce acceptable results, they cannot be considered ideal. Nowadays, the availability of open data is increasing and it is possible to obtain information about buildings for some regions. Although this type of information is usually limited and therefore insufficient to generate an exposure data set, it can still be very useful in its development. In this paper, we focus on how open building data can be used to develop a gridded exposure model by disaggregating existing census data at coarser resolutions. Furthermore, we analyse how the choice of spatial resolution affects the accuracy and precision of the model, and compare, across the different models, the residential building area affected by a flood event.
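The disaggregation step described above can be illustrated with a minimal sketch: census-level residential building area is spread over grid cells in proportion to the open building footprint area (e.g. from OpenStreetMap) found in each cell, falling back to a uniform split where no footprints are mapped. The data structures and function names below are illustrative assumptions, not the paper's implementation.

```python
# Sketch: disaggregate census-level residential building area to a grid,
# weighting each cell by the open-data footprint area it contains.
# Data structures are hypothetical placeholders.
from collections import defaultdict

def disaggregate(census_units, cell_footprint_area):
    """census_units: {unit_id: (total_building_area, [cell_ids])}
    cell_footprint_area: {cell_id: open-data footprint area in that cell}
    Returns {cell_id: estimated building area}."""
    grid = defaultdict(float)
    for unit_id, (total_area, cells) in census_units.items():
        weights = [cell_footprint_area.get(c, 0.0) for c in cells]
        w_sum = sum(weights)
        for c, w in zip(cells, weights):
            if w_sum > 0:
                grid[c] += total_area * w / w_sum   # proportional allocation
            else:
                grid[c] += total_area / len(cells)  # fall back to uniform split
    return dict(grid)
```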


Author(s):  
Cecilia I. Nievas ◽  
Marco Pilz ◽  
Karsten Prehn ◽  
Danijel Schorlemmer ◽  
Graeme Weatherill ◽  
...  

Abstract. The creation of building exposure models for seismic risk assessment is frequently challenging due to the lack of availability of detailed information on building structures. Different strategies have been developed in recent years to overcome this, including the use of census data, remote sensing imagery and volunteered geographic information (VGI). This paper presents the development of a building-by-building exposure model based exclusively on openly available datasets, including both VGI and census statistics, which are defined at different levels of spatial resolution and for different moments in time. The initial model, stemming purely from building-level data, is enriched with statistics aggregated at the neighbourhood and city level by means of a Monte Carlo simulation that enables the generation of full realisations of damage estimates when using the exposure model in the context of an earthquake scenario calculation. Though applicable to any other region of interest where analogous datasets are available, the workflow and approach followed are explained by focusing on the case of the German city of Cologne, for which a scenario earthquake is defined and the potential damage is calculated. The resulting exposure model and damage estimates are presented, and it is shown that the latter are broadly consistent with damage data from the 1978 Albstadt earthquake, notwithstanding the differences in the scenario. Through this real-world application we demonstrate the potential of VGI and open data to be used for exposure modelling for natural risk assessment, when combined with suitable knowledge on building fragility and accounting for the inherent uncertainties.
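As a rough illustration of the Monte Carlo enrichment described above, the sketch below assigns a structural class to each building whose class is unknown at the building level by sampling from neighbourhood-level class proportions, so that each draw yields one complete exposure realisation. All names and the data layout are illustrative assumptions, not the authors' code.

```python
# Sketch: Monte Carlo completion of a building-by-building exposure model.
# Buildings whose structural class is unknown from building-level (VGI) data
# are assigned a class sampled from aggregated neighbourhood statistics.
import random

def sample_realisation(buildings, class_probs, rng=random.Random(42)):
    """buildings: list of dicts with 'neighbourhood' and optional 'class'.
    class_probs: {neighbourhood: {class_name: probability}}."""
    realisation = []
    for b in buildings:
        if b.get("class") is None:  # attribute missing at building level
            probs = class_probs[b["neighbourhood"]]
            classes, weights = zip(*probs.items())
            b = {**b, "class": rng.choices(classes, weights=weights, k=1)[0]}
        realisation.append(b)
    return realisation

# Repeating sample_realisation many times and running the damage calculation
# on each draw propagates the exposure uncertainty into the damage estimates.
```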


2020 ◽  
Author(s):  
Raquel Zafrir ◽  
Massimiliano Pittore ◽  
Juan Camilo Gomez-Zapata ◽  
Patrick Aravena ◽  
Christian Geiß

Residential building exposure models for risk and loss estimations related to natural hazards are usually defined in terms of specific schemas describing mutually exclusive, collectively exhaustive (MECE) classes of buildings. These models are derived either from (1) the analysis of census data or (2) individual observations in the field. In the first case, expert elicitation has conventionally been used to classify the building inventory into particular schemas, usually aggregated over geographical administrative units whose size, area and shape are country-specific. In the second case, especially for large urban areas, visually inspecting every building in order to assign a class according to the specific schema used is a highly time- and resource-intensive task, often simply unfeasible.

Remote sensing data based on the analysis of satellite imagery have proved successful in integrating large-scale information on the built environment and as such can provide valuable vulnerability-related information, although often lacking the level of spatial and thematic resolution required by multi-hazard applications. Volunteered Geographic Information (VGI) data can also prove useful in this context, although in most cases only geometric attributes (the shape of the building footprint) and some occupancy information are recorded, thus leaving out most of the building attributes controlling the vulnerability of the structures to the different hazards. An additional drawback of VGI is the incompleteness of the information, which depends on the unstructured efforts of voluntary mappers.

Previous efforts have proposed a top-down/bottom-up approach moving from the regional scale to the neighbourhood and per-building scales, based on the analysis and integration of different data sources at increasing spatial resolutions and levels of thematic detail. Following the same principle, this work focuses on the downscaling of already existing building exposure models based on census data, making use of a probabilistic approach based on Bayesian updating. Different aggregation models can be taken into account to increase the spatial resolution of the building exposure model, including variable-resolution models based on geostatistical approaches. Land-use masks are first generated from a supervised classification of Sentinel-2 images, in order to better relate the built-up area to meaningful geographical entities. Two independent statistical models are then created based on prior input information, and maximum likelihood estimates are obtained for each model. Two types of auxiliary data are employed to constrain the downscaling via a specific likelihood term in the Bayesian updating: (1) building footprint areas from the open, volunteered geoinformation of OpenStreetMap, and (2) built-up height and density estimators based on remote sensing, developed by DLR (the German Aerospace Center).

This approach, developed within the scope of the RIESGOS project, was tested in Valparaíso and Viña del Mar (Chile), where the residential building exposure model proposed by the GEM-SARA project has been downscaled. The performance of the different auxiliary data was tested separately and compared. An independent building survey was also carried out by experts from CIGIDEN (Chile) using a Rapid Remote Visual Screening Survey, and used for a preliminary validation of the approach.
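A highly simplified stand-in for the Bayesian updating step described above: the census-based prior over grid cells is expressed as Dirichlet pseudo-counts and updated with auxiliary evidence (e.g. OpenStreetMap footprint areas or built-up density), after which the aggregated building count is redistributed according to the posterior shares. The pseudo-count weighting and all names are assumptions for illustration only.

```python
# Sketch: Bayesian downscaling of an aggregated building count to grid cells.
# The census-based prior over cells is treated as a Dirichlet distribution and
# updated with auxiliary evidence expressed as pseudo-counts.
import numpy as np

def downscale(total_buildings, prior_shares, aux_evidence, strength=50.0):
    """prior_shares: per-cell allocation shares from the census prior (sums to 1).
    aux_evidence: per-cell auxiliary measure (e.g. footprint area), >= 0."""
    prior_shares = np.asarray(prior_shares, dtype=float)
    aux = np.asarray(aux_evidence, dtype=float)
    alpha = strength * prior_shares                  # Dirichlet prior pseudo-counts
    alpha_post = alpha + strength * aux / aux.sum()  # add evidence pseudo-counts
    posterior_shares = alpha_post / alpha_post.sum()
    return total_buildings * posterior_shares        # expected buildings per cell

# Example: 1200 buildings, a uniform census prior over 4 cells, and footprint
# evidence concentrated in the first cell shift the allocation accordingly.
cells = downscale(1200, [0.25, 0.25, 0.25, 0.25], [900.0, 300.0, 150.0, 50.0])
```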


Buildings ◽  
2021 ◽  
Vol 11 (6) ◽  
pp. 242
Author(s):  
Christoph Schünemann ◽  
David Schiela ◽  
Regine Ortlepp

Can building performance simulation reproduce measured summertime indoor conditions of a multi-residential building with good conformity? We answer this question by calibrating simulated against monitored room temperatures for several rooms of a multi-residential building over an entire summer, in two process steps. First, we calibrated for several days without the residents being present, to validate the building physics of the 3D simulation model. Second, the simulations were calibrated for the entire summer period, including the residents’ impact on the evolving room temperatures and overheating. As a result, a high degree of conformity between simulation and measurement could be achieved for all monitored rooms. The credibility of our results was secured by a detailed sensitivity analysis under varying meteorological conditions, shading situations, and window ventilation or room use in the simulation model. For top-floor dwellings, a high overheating intensity was evoked by a combination of insufficient use of night-time window ventilation and non-heat-adapted residential behavior, together with high solar gains and low heat storage capacities. Finally, the overall findings were merged into a process guideline describing how a step-by-step calibration of residential building simulation models can be done. This guideline is intended as a starting point for future discussions about the validity of the simplified boundary conditions that are often used in present-day standard overheating assessment.
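A minimal sketch of the kind of goodness-of-fit check used when calibrating simulated against measured room temperatures; the RMSE/bias metrics and the acceptance thresholds are illustrative assumptions, not the criteria from the study.

```python
# Sketch: a simple goodness-of-fit check for calibrating simulated against
# measured room temperatures, accepting a calibration step once RMSE and
# mean bias fall below chosen thresholds.
import math

def calibration_metrics(simulated, measured):
    """Return (RMSE, bias) of simulated vs. measured temperatures (degrees C)."""
    errors = [s - m for s, m in zip(simulated, measured)]
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    bias = sum(errors) / len(errors)  # mean signed deviation
    return rmse, bias

rmse, bias = calibration_metrics([24.1, 25.3, 26.0], [24.4, 25.1, 26.5])
acceptable = rmse < 1.0 and abs(bias) < 0.5  # example acceptance thresholds (K)
```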


2017 ◽  
Vol 46 (2) ◽  
pp. 207-224
Author(s):  
Ge Zhang ◽  
Wenwen Zhang ◽  
Subhrajit Guhathakurta ◽  
Nisha Botchwey

Open data have come of age, with many cities, states, and other jurisdictions joining the open data movement by offering relevant information about their communities to the public for free and easy access. Despite the growing volume of open data, their use has been limited in planning scholarship and practice. The bottleneck is often the format in which the data are available and the organization of such data, which may be difficult to incorporate into existing analytical tools. The overall goal of this research is to develop an open-data-based community planning support system that can collect related open data, analyze the data for specific objectives, and visualize the results to improve usability. To accomplish this goal, this study undertakes three research tasks. First, it describes the current state of open data analysis efforts in the community planning field. Second, it examines the challenges analysts experience when using open data in planning analysis. Third, it develops a new flow-based planning support system for examining neighborhood quality of life and health for the City of Atlanta as a prototype, which addresses many of these open data challenges.
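The collect-analyze-visualize pattern of such a planning support system can be reduced to a minimal pipeline sketch; the dataset path, column names, and indicator below are hypothetical placeholders.

```python
# Sketch: the collect-analyze-visualize pattern of an open-data planning
# support system, reduced to a minimal pipeline.
import pandas as pd
import matplotlib.pyplot as plt

def load_open_data(url_or_path):
    """Collect: read a published open dataset (CSV is the common baseline)."""
    return pd.read_csv(url_or_path)

def neighborhood_indicator(df):
    """Analyze: aggregate a quality-of-life proxy per neighborhood."""
    return df.groupby("neighborhood")["indicator_value"].mean().sort_values()

def visualize(series):
    """Visualize: a simple ranked bar chart for non-expert audiences."""
    series.plot(kind="barh")
    plt.xlabel("mean indicator value")
    plt.tight_layout()
    plt.show()

# visualize(neighborhood_indicator(load_open_data("atlanta_open_data.csv")))
```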


Author(s):  
Xavier Franch-Auladell ◽  
Mateu Morillas-Torné ◽  
Jordi Martí-Henneberg

Abstract. This paper proposes a methodology for quantifying the territorial impact of the railway on population distribution. The central hypothesis is that access to railway services provides the best-connected areas with a long-term comparative advantage over others that are less accessible. Carrying out a historical analysis and providing comparable data at the municipal level allows us to determine the extent to which the railway has fostered the concentration of population within its immediate surroundings. The case study presented here is that of Spain between 1900 and 2001, but the same methodology could equally be applied to any other country for which the required data are available. In this case, the key data comprised a Geographic Information System containing both the development of the railway network and census data on total population at the municipal level. The results obtained suggest the relevance of this methodology, which makes it possible to identify the periods and areas in which this influence was most significant.
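A plain sketch of the underlying accessibility measure: for each census year, the share of total population living in municipalities within a given distance of a railway station. The data layout, distance threshold, and use of projected coordinates are illustrative assumptions standing in for the GIS overlay described above.

```python
# Sketch: population share near the railway per census year.
import math

def within(p, q, km):
    return math.dist(p, q) <= km  # projected coordinates, units of km

def railway_population_share(municipalities, stations, year, radius_km=10.0):
    """municipalities: list of dicts with 'centroid' (x, y) and 'pop' {year: n}.
    stations: list of (x, y) station coordinates open in the given year."""
    near = total = 0
    for m in municipalities:
        pop = m["pop"].get(year, 0)
        total += pop
        if any(within(m["centroid"], s, radius_km) for s in stations):
            near += pop
    return near / total if total else 0.0

# Evaluating this share for successive census years (1900, 1910, ...) traces
# how population concentrated around the network as it expanded.
```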


2021 ◽  
Author(s):  
Oliver Benning ◽  
Jonathan Calles ◽  
Burak Kantarci ◽  
Shahzad Khan

This article presents a practical method for assessing the risk profiles of communities by tracking/acquiring, fusing and analyzing data on public transportation, district population distribution, passenger interactions and cross-locality travel. The proposed framework fuses these data sources into a realistic simulation of a transit network over a given time span. By providing credible insights into the impact of public transit on pandemic spread, the research findings help set the groundwork for tools that could give pandemic response teams and municipalities a robust framework for evaluating the city districts most at risk, and for adjusting municipal services accordingly.
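A toy sketch in the spirit of the proposed framework: riders who share a vehicle during overlapping trip segments accumulate contact time, which is aggregated per home district into a relative exposure score. The trip representation is an illustrative assumption, far simpler than the article's transit simulation.

```python
# Sketch: per-district contact-time scores from shared vehicle occupancy.
from collections import defaultdict
from itertools import combinations

def district_contact_scores(trips):
    """trips: list of (rider_district, vehicle_id, board_t, alight_t)."""
    scores = defaultdict(float)
    by_vehicle = defaultdict(list)
    for trip in trips:
        by_vehicle[trip[1]].append(trip)
    for vehicle_trips in by_vehicle.values():
        for a, b in combinations(vehicle_trips, 2):
            overlap = min(a[3], b[3]) - max(a[2], b[2])  # shared riding time
            if overlap > 0:
                scores[a[0]] += overlap
                scores[b[0]] += overlap
    return dict(scores)
```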


2019 ◽  
Vol 141 (9) ◽  
Author(s):  
J. M. Fernández Oro ◽  
J. González ◽  
R. Barrio Perotti ◽  
M. Galdo Vega

In this paper, a deterministic stress decomposition is applied to the numerical three-dimensional flow solution available for a single-volute centrifugal pump. The numerical model proved its robustness in previous publications, where it captured the impeller-volute tongue flow interaction, and it is now used as the starting point for the current research. The main objective is a detailed analysis of the lack of uniformity that the volute tongue promotes in the blade-to-blade axisymmetric flow pattern. Through this analysis, the fluctuation field may be retrieved and the main interaction sources pinpointed. The results obtained with the deterministic analysis are of paramount interest for understanding the different flow features found in a typical centrifugal pump as a function of the flow rate. Moreover, this postprocessing tool provides an economical and easy procedure for designers to compare the different deterministic terms, while also giving relevant information on the unresolved turbulence intensity scales. Complementarily, a way to model the turbulent effects in a systematic fashion is also presented, comparing their impact on the performance with respect to the deterministic sources in a useful framework that may be applied to similar kinds of pumps.
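The decomposition itself can be sketched compactly: the unsteady variable is split as u = ū + ũ + u′, where ū is the time average, the phase (ensemble) average over blade-passing periods yields the deterministic fluctuation ũ, and the residue is the unresolved turbulent part. Array shapes and names below are illustrative, not the authors' implementation.

```python
# Sketch: triple decomposition u = u_mean + u_det + u' via phase averaging
# of snapshots locked to the blade-passing period.
import numpy as np

def deterministic_decomposition(u, snapshots_per_period):
    """u: array (n_snapshots, n_points) of an unsteady flow variable sampled
    uniformly in time over an integer number of blade-passing periods."""
    n, _ = u.shape
    periods = n // snapshots_per_period
    u = u[: periods * snapshots_per_period]
    u_mean = u.mean(axis=0)                        # time average
    phases = u.reshape(periods, snapshots_per_period, -1)
    u_phase = phases.mean(axis=0)                  # phase (ensemble) average
    u_det = u_phase - u_mean                       # deterministic fluctuation
    u_turb = u - np.tile(u_phase, (periods, 1))    # unresolved turbulent residue
    return u_mean, u_det, u_turb

# Deterministic stresses then follow as phase averages of fluctuation products,
# e.g. np.mean(u_det * v_det, axis=0) for a shear component.
```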


2021 ◽  
Author(s):  
Tamara Kalandadze ◽  
Sara Ann Hart

The increasing adoption of open science practices in the last decade has been changing the scientific landscape across fields. However, developmental science has been relatively slow in adopting open science practices. To address this issue, we followed the format of Crüwell et al. (2019) and created summaries and an annotated list of informative and actionable resources discussing ten topics in developmental science: Open science; Reproducibility and replication; Open data, materials and code; Open access; Preregistration; Registered reports; Replication; Incentives; Collaborative developmental science. This article offers researchers and students in developmental science a starting point for understanding how open science intersects with developmental science. After becoming familiar with this article, the developmental scientist should understand the core tenets of open and reproducible developmental science, and feel motivated to start applying open science practices in their workflow.


2021 ◽  
Author(s):  
Lupeng Wang ◽  
James P. Herman ◽  
Richard J. Krauzlis

Abstract. Covert visual attention is accomplished by a cascade of mechanisms distributed across multiple brain regions. Recent studies in primates suggest a parcellation in which visual cortex is associated with enhanced representations of relevant stimuli, whereas subcortical circuits are associated with selection of visual targets and suppression of distractors. Here we identified how neuronal activity in the superior colliculus (SC) of head-fixed mice is modulated during covert visual attention. We found that spatial cues modulated both firing rate and spike-count correlations, and that the cue-related modulation in firing rate was due to enhancement of activity at the cued spatial location rather than suppression at the uncued location. This modulation improved the neuronal discriminability of visual-change-evoked activity between contralateral and ipsilateral SC neurons. Together, our findings indicate that neurons in the mouse SC contribute to covert visual selective attention by biasing processing in favor of locations expected to contain relevant information.
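The two measures reported above can be sketched from hypothetical trial data: a per-neuron cue-modulation index computed from mean firing rates, and pairwise spike-count correlations as the Pearson correlation of trial-by-trial counts. The Poisson test data are placeholders.

```python
# Sketch: cue-related firing-rate modulation and spike-count correlation.
import numpy as np

def attention_modulation(rates_cued, rates_uncued):
    """Per-neuron modulation index in [-1, 1] from mean firing rates (Hz)."""
    c, u = np.mean(rates_cued), np.mean(rates_uncued)
    return (c - u) / (c + u)

def spike_count_correlation(counts_a, counts_b):
    """Pearson correlation of two neurons' spike counts over the same trials."""
    return np.corrcoef(counts_a, counts_b)[0, 1]

rng = np.random.default_rng(0)
mod = attention_modulation(rng.poisson(12, 100), rng.poisson(9, 100))
r_sc = spike_count_correlation(rng.poisson(10, 100), rng.poisson(10, 100))
```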

