A Field Data Acquisition Method and Tools for Hazard Evaluation of Earthquake-Induced Landslides with Open Source Mobile GIS

2019
Vol 8 (2)
pp. 91
Author(s):
Mauro De Donatis
Giulio Pappafico
Roberto Romeo

The PARSIFAL (Probabilistic Approach to pRovide Scenarios of earthquake Induced slope FAiLures) method was applied to the survey of post-earthquake landslides in central Italy for seismic microzonation purposes. In order to optimize time and resources, while also reducing errors, the paper-based survey data sheets were translated into digital formats using tools such as Tablet PCs, GPS and open source software (QGIS). Layers carrying relevant thematic information, such as the Inventory of Landslide Phenomena in Italy (Inventario dei Fenomeni Franosi in Italia—IFFI), were added to the base mapping, which consists of the Regional Technical Map (Carta Tecnica Regionale—CTR) at the scale of 1:10,000. A database was designed and implemented in the SQLite/SpatiaLite Relational DataBase Management System (RDBMS) to store data on the elements handled by PARSIFAL: landslides, rock masses, discontinuities and covers. To facilitate data capture in the field, data entry forms were created with Qt Designer. In addition, several QGIS plug-ins developed for digital surveying, which allow quick annotations on the map and the import of images from external cameras, proved to be of considerable use.
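The kind of SQLite/SpatiaLite storage described above can be reproduced in a few lines of Python. The sketch below is only illustrative: the table name, fields, SRID and example values are assumptions, not the schema actually used in the PARSIFAL database.

```python
import sqlite3

# Minimal sketch of a SpatiaLite layer for landslide records (illustrative
# schema only; the actual PARSIFAL database design is richer).
conn = sqlite3.connect("parsifal_survey.sqlite")
conn.enable_load_extension(True)          # requires extension support in Python
conn.load_extension("mod_spatialite")     # SpatiaLite must be installed

conn.execute("SELECT InitSpatialMetaData(1);")   # create spatial metadata tables
conn.execute("""
    CREATE TABLE IF NOT EXISTS landslides (
        id INTEGER PRIMARY KEY,
        survey_date TEXT,
        landslide_type TEXT,      -- e.g. rock fall, earth slide
        activity TEXT,            -- e.g. active, dormant
        notes TEXT
    );
""")
# Polygon geometry in a UTM zone covering central Italy (EPSG:32633, assumed)
conn.execute("SELECT AddGeometryColumn('landslides', 'geom', 32633, 'POLYGON', 'XY');")

# Example insert: a surveyed landslide boundary stored as WKT
conn.execute(
    "INSERT INTO landslides (survey_date, landslide_type, activity, geom) "
    "VALUES (?, ?, ?, GeomFromText(?, 32633));",
    ("2017-06-12", "earth slide", "active",
     "POLYGON((350000 4730000, 350120 4730000, 350120 4730150, "
     "350000 4730150, 350000 4730000))"),
)
conn.commit()
conn.close()
```

A layer created this way can be opened directly in QGIS and bound to a Qt Designer form for field data entry.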

Author(s):
D Spallarossa
M Cattaneo
D Scafidi
M Michele
L Chiaraluce
...

Summary The 2016–17 central Italy earthquake sequence began with the first mainshock near the town of Amatrice on August 24 (MW 6.0), and was followed by two subsequent large events near Visso on October 26 (MW 5.9) and Norcia on October 30 (MW 6.5), plus a cluster of four events with MW > 5.0 within a few hours on January 18, 2017. The affected area had been monitored before the sequence started by the permanent Italian National Seismic Network (RSNC), and was enhanced during the sequence by temporary stations deployed by the National Institute of Geophysics and Volcanology and the British Geological Survey. By the middle of September, there was a dense network of 155 stations, with a mean separation in the epicentral area of 6–10 km, comparable to the most likely earthquake depth range in the region. This network configuration was kept stable for an entire year, producing 2.5 TB of continuous waveform recordings. Here we describe how these data were used to develop a large and comprehensive earthquake catalogue using the Complete Automatic Seismic Processor (CASP) procedure. This procedure detected more than 450,000 events in the year following the first mainshock, and determined their phase arrival times through an advanced picker engine (RSNI-Picker2), producing a set of about 7 million P- and 10 million S-wave arrival times. These were then used to locate the events using a non-linear location (NLL) algorithm, a 1D velocity model calibrated for the area, and station corrections, and then to compute their local magnitudes (ML). The procedure was validated by comparing the derived phase picks and earthquake parameters with a handpicked reference catalogue (hereinafter referred to as 'RefCat'). The automated procedure takes less than 12 hours on an Intel Core-i7 workstation to analyse the primary waveform data and to detect and locate 3000 events on the most seismically active day of the sequence. This proves the concept that the CASP algorithm can effectively provide real-time data for input into daily operational earthquake forecasts. The results show significant improvements compared to RefCat, which was obtained for the same period using manual phase picks. The number of detected and located events is higher (from 84,401 to 450,000), the magnitude of completeness is lower (from ML 1.4 to 0.6), and the number of phase picks is greater, with an average of 72 picked arrivals for an ML = 1.4 event compared with 30 phases in RefCat using manual phase picking. These propagate into formal uncertainties of ±0.9 km in epicentral location and ±1.5 km in depth for the vast majority of events in the enhanced catalogue. Together, these provide a significant improvement in the resolution of fine structures such as local planar structures and clusters, in particular the identification of shallow events occurring in parts of the crust previously thought to be inactive. The lower completeness magnitude provides a rich data set for developing and testing techniques for analysing the evolution of seismic sequences, including real-time, operational monitoring of the b-value, time-dependent hazard evaluation and aftershock forecasting.
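To illustrate how a lower completeness magnitude feeds into sequence analysis, the sketch below estimates the magnitude of completeness with the maximum-curvature method and the b-value with the Aki maximum-likelihood estimator from a list of local magnitudes. This is a generic textbook calculation, not part of the CASP procedure; the 0.1 bin width and the synthetic catalogue are assumptions for demonstration only.

```python
import numpy as np

def completeness_and_b_value(magnitudes, bin_width=0.1):
    """Estimate Mc (maximum-curvature method) and b-value (Aki 1965 MLE)."""
    mags = np.asarray(magnitudes)
    # Histogram of magnitudes; Mc is taken at the most populated bin
    bins = np.arange(mags.min(), mags.max() + bin_width, bin_width)
    counts, edges = np.histogram(mags, bins=bins)
    mc = edges[np.argmax(counts)]
    # Maximum-likelihood b-value from events above Mc,
    # with the half-bin-width correction
    above = mags[mags >= mc]
    b = np.log10(np.e) / (above.mean() - (mc - bin_width / 2.0))
    return mc, b

# Usage with a synthetic Gutenberg-Richter catalogue (replace with real ML values)
rng = np.random.default_rng(0)
synthetic_ml = rng.exponential(scale=1.0 / np.log(10), size=50_000) + 0.6
mc, b = completeness_and_b_value(synthetic_ml)
print(f"Mc ~ {mc:.1f}, b ~ {b:.2f}")
```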


2006
Vol 25 (S1)
pp. 70-71
Author(s):
Franco Tassi
Orlando Vaselli
Elena Lognoli
Fabrizio Cuccoli
Barbara Nisi
...

Geomorphology
1999
Vol 31 (1-4)
pp. 181-216
Author(s):
Fausto Guzzetti
Alberto Carrara
Mauro Cardinali
Paola Reichenbach

2021
Vol 21 (8)
pp. 2299-2311
Author(s):
Andrea Antonucci
Andrea Rovida
Vera D'Amico
Dario Albarello

Abstract. The geographic distribution of earthquake effects quantified in terms of macroseismic intensities, the so-called macroseismic field, provides basic information for several applications including source characterization of pre-instrumental earthquakes and risk analysis. Macroseismic fields of past earthquakes as inferred from historical documentation may present spatial gaps, due to the incompleteness of the available information. We present a probabilistic approach aimed at integrating incomplete intensity distributions by considering the Bayesian combination of estimates provided by intensity prediction equations (IPEs) and data documented at nearby localities, accounting for the relevant uncertainties and the discrete and ordinal nature of intensity values. The performance of the proposed methodology is tested at 28 Italian localities with long and rich seismic histories and for two well-known strong earthquakes (the 1980 southern Italy and 2009 central Italy events). A possible application of the approach is also illustrated for a 16th-century earthquake in the northern Apennines.
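The core idea of combining an IPE-based estimate with nearby observations can be pictured as a discrete Bayesian update over intensity classes. The sketch below is a strongly simplified illustration with made-up numbers and a Gaussian-shaped likelihood; it is not the authors' formulation, which treats the ordinal nature of intensity and its uncertainties in much more detail.

```python
import numpy as np

# Discrete intensity classes (MCS degrees IV ... X, as an example)
classes = np.arange(4, 11)

def discrete_gaussian(center, sigma, support):
    """Probability mass over discrete classes from a Gaussian shape."""
    w = np.exp(-0.5 * ((support - center) / sigma) ** 2)
    return w / w.sum()

# Prior from an intensity prediction equation: expected intensity 6.3 +/- 0.8
# at the target locality (values are purely illustrative)
prior = discrete_gaussian(6.3, 0.8, classes)

# Evidence from nearby documented localities: (observed intensity, sigma
# reflecting distance and reliability) -- again, illustrative numbers
observations = [(7, 0.6), (6, 1.0)]

posterior = prior.copy()
for obs_intensity, sigma in observations:
    likelihood = discrete_gaussian(obs_intensity, sigma, classes)
    posterior *= likelihood
posterior /= posterior.sum()

print("Posterior over intensity classes:", np.round(posterior, 3))
print("Most probable intensity:", classes[np.argmax(posterior)])
```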


Author(s):  
Melanie Platz

The project 'Prim-E-Proof' aims to create learning environments with digital media (open source applets on Tablet PCs) to support argumentation and proving skills at primary school level. The focus of the project is on supporting classical teaching and learning processes with learning environments in which digital media are used where this is pedagogically sensible. In this contribution, conclusions for the further development of the first version of the learning environment are drawn from the results of an empirical study.


2020
pp. 1222-1253
Author(s):
Bo Yu
Duminda Wijesekera
Paulo Cesar G. Costa

Informed consent, whether for treatment or for the use and disclosure of sensitive information, is mandatory practice in healthcare: it protects the privacy of patient and participant information, subject to laws that in certain circumstances may override patient wishes. Similarly, informed consent is a prerequisite for human subjects research, protecting and respecting research participants. Although the healthcare industry has widely adopted Electronic Medical Record (EMR) systems, consents are still obtained and stored primarily on paper or as scanned electronic documents. Integrating a consent management system for different purposes into an EMR system involves various implementation challenges. A case study, informed consent for genetic services, is used to show how genetic informed consents place new challenges on the traditional ethical standards of informed consent, and how appropriate consents can be obtained electronically and enforced automatically using a system that combines medical workflows with hierarchically and ontologically motivated rule enforcement. Finally, this chapter describes an implementation that adds these components, based on open-source software, to an open-source EMR system, so that existing systems do not need to be scrapped or otherwise rendered obsolete.
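A minimal sketch of the kind of machine-enforceable consent check described above is given below. The consent record fields, purposes and recipient names are hypothetical and do not correspond to any specific EMR system or to the chapter's actual rule hierarchy; they only illustrate the idea of evaluating a stored consent before a disclosure step in a workflow.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Electronic consent captured for a patient (illustrative fields only)."""
    patient_id: str
    permitted_purposes: set = field(default_factory=set)   # e.g. {"treatment"}
    genetic_results_to_relatives: bool = False

def may_disclose(consent: ConsentRecord, purpose: str, recipient: str) -> bool:
    """Enforce the stored consent before a disclosure step in the EMR workflow."""
    if recipient == "relative" and purpose == "genetic_results":
        return consent.genetic_results_to_relatives
    return purpose in consent.permitted_purposes

# Usage: allow or block a disclosure request raised by a workflow step
consent = ConsentRecord("patient-001",
                        permitted_purposes={"treatment", "genetic_testing"},
                        genetic_results_to_relatives=False)
print(may_disclose(consent, "genetic_testing", "ordering_physician"))  # True
print(may_disclose(consent, "genetic_results", "relative"))            # False
```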


Geosciences
2020
Vol 10 (4)
pp. 130
Author(s):
Diana Salciarini
Evelina Volpe
Ludovica Di Pietro
Elisabetta Cattoni

Traditional technical solutions for slope stabilization are generally costly and have a strong impact on the natural environment and landscape. A possible alternative for improving slope stability is the use of naturalistic engineering techniques, which have a low impact on the natural environment and are able to preserve the identity and peculiarities of the landscape. In this work, we present an application of such techniques for slope stabilization along a greenway located in central Italy, characterized by an extraordinary natural environment. First, 22 potentially unstable slopes have been identified and examined; then, among these, two representative slopes have been selected. For both of them, an appropriate naturalistic engineering work has been proposed and stability analyses have been carried out. These have been performed by considering different piezometric conditions and using two different approaches: (a) a classical deterministic approach, which adopts deterministic values for the mechanical properties of the soils and neglects any uncertainty, and (b) a probabilistic approach, which takes into account the statistical variability of the soil property values by means of their probability density functions (PDFs). The geometry of each slope derives from a digital terrain model with 1 m resolution, obtained through a Light Detection and Ranging (LiDAR) survey provided by the Italian Ministry of the Environment. The soil mechanical characteristics and their PDFs are derived from the geotechnical soil property database of the Perugia Province. Results show an increase in slope stability produced by the adopted countermeasures, measured in terms of Factor of Safety (Fs), Probability of Failure (PoF) and efficiency.
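The probabilistic approach (b) can be illustrated with a simple Monte Carlo analysis of an infinite-slope model: the soil strength parameters are drawn from assumed probability density functions and the Probability of Failure is the fraction of realizations with Fs < 1. The slope model, distributions and parameter values below are illustrative assumptions, not those used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                      # Monte Carlo realizations

# Slope geometry (deterministic, illustrative values)
beta = np.radians(28.0)          # slope angle
z = 3.0                          # depth of the sliding surface (m)
gamma = 19.0                     # soil unit weight (kN/m^3)

# Soil strength parameters drawn from assumed PDFs
c = rng.lognormal(mean=np.log(8.0), sigma=0.3, size=n)     # cohesion (kPa)
phi = np.radians(rng.normal(loc=27.0, scale=3.0, size=n))  # friction angle

# Infinite-slope factor of safety (dry conditions, simplified)
fs = c / (gamma * z * np.sin(beta) * np.cos(beta)) + np.tan(phi) / np.tan(beta)

print(f"Mean Fs: {fs.mean():.2f}")
print(f"Probability of Failure (Fs < 1): {np.mean(fs < 1.0):.3%}")
```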


2014
Vol 53 (03)
pp. 202-207
Author(s):
M. Haag
L. R. Pilz
D. Schrimpf

Summary
Background: Clinical trials (CT) are, in a wider sense, experiments to prove and establish the clinical benefit of treatments. Nowadays, electronic data capture systems (EDCS) are used more often, bringing better data management and higher data quality into clinical practice. Electronic randomization systems are also used to assign patients to treatments.
Objectives: If both a randomization system (RS) and an EDCS are used, potentially identical data are collected in both, especially with stratified randomization. This separate data storage may lead to data inconsistency, and in general the data sets have to be aligned. The article discusses solutions for combining RS and EDCS; one approach is realized and introduced in detail.
Methods: Different possible ways of combining EDCS and RS are determined, and the pros and cons of each solution are worked out. For the combination of two independent applications, the necessary communication interfaces are defined, taking existing standards into account. An example realization is implemented with the help of open-source applications and state-of-the-art software development procedures.
Results: Three possibilities of separate usage or combination of EDCS and RS are presented and assessed: i) the completely independent usage of both systems; ii) realization of one system with both functions; and iii) two separate systems, which communicate via defined interfaces. In addition, a realization of our preferred approach, the combination of both systems, is introduced using the open source tools RANDI2 and OpenClinica.
Conclusion: The advantage of flexible, independent development of EDCS and RS is shown, based on the fact that these tools have very different features. In our opinion, the combination of both systems via defined interfaces fulfills the requirements of randomization and electronic data capture and is feasible in practice. In addition, the use of such a setting can reduce training costs and avoid error-prone duplicate data entry.
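Option iii), two separate systems communicating via defined interfaces, can be pictured as the EDCS posting a randomization request containing the stratification factors it has already captured and storing the returned allocation. The endpoint, payload and response fields below are purely hypothetical and do not describe the actual RANDI2 or OpenClinica interfaces; they only sketch the pattern.

```python
import requests

def request_randomization(edcs_subject_id: str, strata: dict) -> dict:
    """EDCS-side call to an external randomization service (illustrative only)."""
    payload = {
        "subject_id": edcs_subject_id,   # shared identifier, entered once
        "strata": strata,                # stratification factors from the EDCS
    }
    # Hypothetical endpoint of the randomization service
    response = requests.post(
        "https://rs.example.org/api/randomize",
        json=payload,
        timeout=10,
    )
    response.raise_for_status()
    return response.json()   # e.g. {"subject_id": ..., "arm": "treatment A"}

# Usage: the EDCS reuses data it already captured, avoiding duplicate entry
allocation = request_randomization(
    "SUBJ-0042", {"center": "Mannheim", "tumor_stage": "II"}
)
print(allocation["arm"])
```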


2013
Vol 17 (8)
pp. 3159-3169
Author(s):
L. Brocca
S. Liersch
F. Melone
T. Moramarco
M. Volk

Abstract. A framework for a comprehensive synthetic rainfall-runoff database was developed to study catchment response to a variety of rainfall events. The framework supports effective flood risk assessment and management and is built on simple approaches. It consists of three flexible components: a rainfall generator, a continuous rainfall-runoff model, and a database management system. The system was developed and tested at two gauged river sections along the upper Tiber River (central Italy). One of the main questions was how simple such approaches can be kept without impairing the quality of the results. The rainfall-runoff model was used to simulate runoff for a large number of rainfall events. The resulting rainfall-runoff database stores pre-simulated events classified on the basis of the rainfall amount, initial wetness conditions and initial discharge. The real-time operational forecasts follow an analogue method that does not require new model simulations; instead, the forecasts are based on the simulation results already available in the rainfall-runoff database for the specific class to which the forecast belongs. Therefore, the database can be used as an effective tool to assess possible streamflow scenarios assuming different rainfall volumes for the following days. The application to the study site shows that the magnitudes of real flood events were appropriately captured by the database. Further work should be dedicated to introducing a component that accounts for the actual temporal distribution of rainfall events in the stochastic rainfall generator, and to the use of different rainfall-runoff models, in order to enhance the usability of the proposed procedure.
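The analogue-based forecasting step can be sketched as a lookup in the database of pre-simulated events: an incoming forecast is assigned to a class by rainfall amount, initial wetness and initial discharge, and the stored results of that class are returned as possible scenarios. The class boundaries, keys and stored values below are assumptions for illustration only, not the classification actually used in the paper.

```python
# Illustrative classification and lookup of pre-simulated rainfall-runoff events.

RAIN_EDGES = [20, 50, 100]        # mm over the forecast window
WETNESS_EDGES = [0.3, 0.6]        # initial soil wetness index (0-1)
DISCHARGE_EDGES = [10, 50]        # initial discharge (m^3/s)

def _class(value, edges):
    """Return the index of the class the value falls into."""
    return sum(value > e for e in edges)

def event_class(rain_mm, wetness, discharge_m3s):
    return (_class(rain_mm, RAIN_EDGES),
            _class(wetness, WETNESS_EDGES),
            _class(discharge_m3s, DISCHARGE_EDGES))

# Pre-simulated database: class -> list of simulated peak discharges (m^3/s)
# (tiny made-up sample; the real database stores full hydrographs)
database = {
    (1, 1, 0): [85.0, 92.4, 110.2],
    (2, 1, 1): [240.0, 275.8, 310.5],
}

def streamflow_scenarios(rain_mm, wetness, discharge_m3s):
    """Return the pre-simulated scenarios for the class of the forecast."""
    return database.get(event_class(rain_mm, wetness, discharge_m3s), [])

print(streamflow_scenarios(rain_mm=35.0, wetness=0.45, discharge_m3s=8.0))
```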

