The Relevance of Geotechnical-Unit Characterization for Landslide-Susceptibility Mapping with SHALSTAB

GeoHazards ◽  
2021 ◽  
Vol 2 (4) ◽  
pp. 383-397
Author(s):  
Carla Moreira Melo ◽  
Masato Kobiyama ◽  
Gean Paulo Michel ◽  
Mariana Madruga de Brito

Given the increasing occurrence of landslides worldwide, the improvement of predictive models for landslide mapping is needed. Despite the influence of geotechnical parameters on SHALSTAB model outputs, there is a lack of research on the model's performance when different variables are considered. In particular, the role of geotechnical units (i.e., areas with common soil and lithology) is understudied. Indeed, the original SHALSTAB model assumes that the whole basin has homogeneous soil, which can lead to the under- or overestimation of landslide hazards. Therefore, in this study, we aimed to investigate the advantages of incorporating geotechnical units as a variable, in contrast to the original model. Using locally sampled geotechnical data, 13 slope-instability scenarios were simulated for the Jaguar creek basin, Brazil. This allowed us to verify the sensitivity of the model to different input variables and assumptions. To evaluate model performance, we used the Success Index, the Error Index, the ROC curve, and a new performance index: the Detective Performance Index of Unstable Areas. The best model performance was obtained in the scenario with discretized geotechnical-unit values and the largest sample size. The results indicate the importance of properly characterizing the geotechnical units when using SHALSTAB. Hence, future applications should consider this to improve the model's predictive capacity.
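As a rough illustration of where geotechnical-unit values enter the calculation, the sketch below evaluates the standard SHALSTAB critical-recharge criterion pixel by pixel with NumPy, so that cohesion, friction angle, soil density, and soil depth can vary by geotechnical unit. The variable names, parameter values, and the two example units are hypothetical and do not reproduce the authors' Jaguar creek setup.

```python
import numpy as np

def shalstab_log_qt(slope_rad, area_per_width, cohesion, phi_deg,
                    soil_density, soil_depth,
                    water_density=1000.0, g=9.81):
    """Critical steady-state recharge/transmissivity ratio (q/T) per pixel.

    All inputs are NumPy arrays (or scalars) on the model grid, so spatially
    variable geotechnical-unit values can be passed directly.
    """
    tan_phi = np.tan(np.radians(phi_deg))
    sin_t, cos_t, tan_t = np.sin(slope_rad), np.cos(slope_rad), np.tan(slope_rad)
    density_term = (soil_density / water_density) * (1.0 - tan_t / tan_phi)
    cohesion_term = cohesion / (water_density * g * soil_depth * cos_t**2 * tan_phi)
    q_over_t = (sin_t / area_per_width) * (density_term + cohesion_term)
    # SHALSTAB outputs are usually mapped as log10(q/T) stability classes
    return np.log10(np.clip(q_over_t, 1e-12, None))

# Hypothetical example: two geotechnical units with different parameter values
slope = np.radians(np.array([30.0, 35.0]))
a_b = np.array([120.0, 80.0])          # contributing area per unit contour length (m)
cohesion = np.array([2000.0, 6000.0])  # Pa, unit-specific
phi = np.array([28.0, 34.0])           # deg, unit-specific
rho_s = np.array([1700.0, 1900.0])     # kg/m3, unit-specific
depth = np.array([1.5, 2.0])           # m, unit-specific
print(shalstab_log_qt(slope, a_b, cohesion, phi, rho_s, depth))
```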

Geosciences ◽  
2021 ◽  
Vol 11 (8) ◽  
pp. 322
Author(s):  
Evelina Volpe ◽  
Luca Ciabatta ◽  
Diana Salciarini ◽  
Stefania Camici ◽  
Elisabetta Cattoni ◽  
...  

The development of forecasting models for the evaluation of potential slope instability after rainfall events is an important issue for the scientific community. This topic has received considerable impetus from the effects of climate change on territories, as several studies demonstrate that increasing global warming can significantly influence landslide activity and the stability conditions of natural and artificial slopes. A consolidated approach to evaluating rainfall-induced landslide hazard is based on the integration of rainfall forecasts and physically based (PB) predictive models through deterministic laws. However, considering the complex nature of the processes and the high variability of the random quantities involved, probabilistic approaches are recommended in order to obtain reliable predictions. A crucial aspect of the stochastic approach is the definition of appropriate probability density functions (pdfs) to model the uncertainty of the input variables, as this may have an important effect on the evaluation of the probability of failure (PoF). The role of the pdf definition in reliability analysis is discussed through a comparison of PoF maps generated using Monte Carlo (MC) simulations performed over a study area located in the Umbria region of central Italy. The study revealed that the use of uniform pdfs for the random input variables, often adopted when a detailed geotechnical characterization of the soil is not available, could be inappropriate.
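To make the role of the input pdfs concrete, the following minimal sketch estimates a probability of failure by Monte Carlo for a simple infinite-slope factor of safety, once with uniform pdfs and once with clipped normal pdfs of comparable mean and spread. The limit-state function, parameter ranges, and sample size are illustrative assumptions, not the PB model or soil data used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

def factor_of_safety(c, phi_deg, gamma=18.0, depth=2.0, slope_deg=38.0):
    """Illustrative infinite-slope factor of safety (dry, no pore pressure)."""
    theta = np.radians(slope_deg)
    phi = np.radians(phi_deg)
    return (c + gamma * depth * np.cos(theta)**2 * np.tan(phi)) / \
           (gamma * depth * np.sin(theta) * np.cos(theta))

# Same mean and spread for cohesion (kPa) and friction angle (deg),
# modelled once with uniform pdfs and once with clipped normals.
c_uni = rng.uniform(5.0, 15.0, N)
phi_uni = rng.uniform(24.0, 36.0, N)
c_norm = np.clip(rng.normal(10.0, 2.9, N), 0.1, None)
phi_norm = np.clip(rng.normal(30.0, 3.5, N), 1.0, 45.0)

pof_uniform = np.mean(factor_of_safety(c_uni, phi_uni) < 1.0)
pof_normal = np.mean(factor_of_safety(c_norm, phi_norm) < 1.0)
print(f"PoF (uniform pdfs): {pof_uniform:.3f}")
print(f"PoF (normal pdfs):  {pof_normal:.3f}")
```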


Mathematics ◽  
2021 ◽  
Vol 9 (5) ◽  
pp. 513
Author(s):  
Olga Fullana ◽  
Mariano González ◽  
David Toscano

In this paper, we test whether the short-run econometric conditions for the basic assumptions of the Ohlson valuation model hold, and we then relate these results to the fulfillment of the short-run econometric conditions required for the model to be effective. The prospect of better future modeling motivated us to analyze the extent to which the assumptions involved in this seminal model are approximations that are not good enough to solve the firm valuation problem, causing poor model performance. The model is based on the well-known dividend discount model and the residual income valuation model, and it adds a linear information model, which is a time series model by nature. Therefore, we adopt a time series approach. Given the presence of non-stationary variables, we focus our research on US-listed firms for which more than forty years of data are available with the cointegration properties required to use error correction models. The results show that the clean surplus relation assumption has no impact on model performance, while the unbiased accounting property assumption has an important effect on it. The results also emphasize the uselessness of forcing valuation models to match the value displacement property of dividends.
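As a minimal illustration of the time-series pre-tests this approach implies, the sketch below runs an ADF stationarity check and an Engle-Granger cointegration test with statsmodels on a synthetic market-value/book-value pair. The series are simulated placeholders, not the US firm data analyzed in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, coint

# Hypothetical annual series for one firm: market value and book value per share.
# In practice these would come from >40 years of accounting and price data.
rng = np.random.default_rng(0)
common_trend = np.cumsum(rng.normal(0.05, 0.2, 45))
book_value = pd.Series(10 + common_trend + rng.normal(0, 0.1, 45))
market_value = pd.Series(12 + 1.1 * common_trend + rng.normal(0, 0.15, 45))

# Step 1: check non-stationarity of each series (ADF test).
for name, series in [("book value", book_value), ("market value", market_value)]:
    p_value = adfuller(series)[1]
    print(f"ADF p-value, {name}: {p_value:.3f}")

# Step 2: Engle-Granger cointegration test; a small p-value supports
# estimating an error-correction model for the pair.
t_stat, p_value, _ = coint(market_value, book_value)
print(f"Cointegration test: t={t_stat:.2f}, p={p_value:.3f}")
```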


Author(s):  
Tony Badrick ◽  
Mohamed Saleem ◽  
Wesley Wong

Background: Reporting critical results in a timely manner is a crucial role of clinical laboratories. Traditionally, these results were reported by phone or fax; however, other modes of communication are now available for this reporting. Quality improvement in any organization is driven by the detection of errors and by benchmarking against peers. In the case of critical result reporting, few widely used benchmarking schemes currently exist. Methods: The Roche Clinical Chemistry Benchmarking Survey in 2019 added questions about critical result reporting, including the mode of communication and the turnaround-time key performance index. The survey includes over 1100 laboratories from 20 countries. Results: The survey revealed a range of communication strategies, with phone calls still the most common, followed by email. The turnaround-time key performance index for most laboratories was less than 10 min. Conclusion: Benchmarking can provide key information for quality improvement activities, particularly in the pre- and postanalytical phases.
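A minimal, hypothetical example of how such a turnaround-time key performance index might be computed from a critical-result notification log with pandas; the column names, timestamps, and communication modes are invented for illustration and are not from the survey data.

```python
import pandas as pd

# Hypothetical critical-result log: one row per critical result notification.
log = pd.DataFrame({
    "result_verified": pd.to_datetime([
        "2019-03-01 08:02", "2019-03-01 09:15", "2019-03-01 11:40"]),
    "clinician_notified": pd.to_datetime([
        "2019-03-01 08:07", "2019-03-01 09:31", "2019-03-01 11:48"]),
    "mode": ["phone", "email", "phone"],
})

# Turnaround time in minutes and the share notified within 10 minutes
tat_min = (log["clinician_notified"] - log["result_verified"]).dt.total_seconds() / 60
kpi_within_10_min = (tat_min <= 10).mean()
print(f"Median TAT: {tat_min.median():.0f} min; within 10 min: {kpi_within_10_min:.0%}")
print(log["mode"].value_counts(normalize=True))
```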


Water ◽  
2020 ◽  
Vol 12 (5) ◽  
pp. 1279
Author(s):  
Tyler Madsen ◽  
Kristie Franz ◽  
Terri Hogue

Demand for reliable estimates of streamflow has increased as society becomes more susceptible to climatic extremes such as droughts and flooding, especially at small scales, where local population centers and infrastructure can be affected by rapidly occurring events. In the current study, the Hydrology Laboratory-Research Distributed Hydrologic Model (HL-RDHM) (NOAA/NWS, Silver Spring, MD, USA) was used to explore the accuracy of a distributed hydrologic model in simulating discharge at watershed scales ranging from 20 to 2500 km2. The model was calibrated and validated using observed discharge data at the basin outlets, and discharge at uncalibrated sub-basin locations was evaluated. Two precipitation products with nominal spatial resolutions of 12.5 km and 4 km were tested to characterize the role of input resolution in the discharge simulations. In general, model performance decreased as basin size decreased. When sub-basin area was less than 250 km2, or 20–40% of the total watershed area, model performance dropped below the defined acceptable levels. Simulations forced with the lower-resolution precipitation product had better model evaluation statistics; for example, Nash–Sutcliffe efficiency (NSE) scores for the verification period at basin outlets ranged from 0.50 to 0.67, compared to 0.33 to 0.52 for the higher-spatial-resolution forcing.
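For reference, the Nash–Sutcliffe efficiency used to evaluate these simulations can be computed as below. The discharge values in the example are invented and only mirror the direction of the coarse-versus-fine forcing comparison, not the study's actual scores.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / \
                 np.sum((observed - observed.mean()) ** 2)

# Hypothetical daily discharge (m3/s) at a basin outlet for a short window.
obs = [12.0, 15.5, 30.2, 22.1, 18.0, 14.3]
sim_coarse = [11.0, 16.0, 27.5, 23.0, 17.2, 13.8]   # coarser-resolution forcing
sim_fine = [10.0, 13.0, 24.0, 26.5, 20.0, 12.0]     # finer-resolution forcing
print(f"NSE (coarse forcing): {nash_sutcliffe(obs, sim_coarse):.2f}")
print(f"NSE (fine forcing):   {nash_sutcliffe(obs, sim_fine):.2f}")
```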


Water ◽  
2021 ◽  
Vol 13 (4) ◽  
pp. 438
Author(s):  
Jose Luis Diaz-Hernandez ◽  
Antonio Jose Herrera-Martinez

At present, there is a lack of detailed understanding of how the factors converging on water variables in mountain areas modify the quantity and quality of their watercourses, features that determine these areas' hydrological contribution to downstream regions. In order to remedy this situation to some extent, we studied the water-bodies of the western sector of the Sierra Nevada massif (Spain). Since thaw is a necessary but not sufficient contributor to the formation of these fragile water-bodies, we carried out field visits to identify their number, size and spatial distribution, as well as the different processes that shaped them. The best-defined water-bodies are the result of glacial processes, such as overdeepening and moraine dams. These water-bodies are the highest in the massif (2918 m mean altitude), the largest and the deepest, making up 72% of the total. Another group was formed by hillside instability phenomena, which are very dynamic and related to a variety of processes. The resulting water-bodies are irregular and located at lower altitudes (2842 m mean altitude), representing 25% of the total. The third group is the smallest (3%), with one subgroup formed by anthropic causes and another of unknown origin. It has recently been found that the Mediterranean and Atlantic watersheds of this massif behave somewhat paradoxically: despite its higher xericity, the Mediterranean watershed generally holds more water than the Atlantic one. The overall cause of these discrepancies between watersheds is not connected to their formation processes. However, we found that classifying water volumes by the mode of formation of their water-bodies is not coherent with the associated green fringes, because of the anomalous behaviour of the water-bodies formed by moraine dams. This discrepancy is largely due to the passive role of the water retained in this type of water-body, which depends on the characteristics of its hollow. The water-bodies of the Sierra Nevada close to the peak line (2918 m mean altitude) are therefore highly dependent on the glacial processes that created the hollows in which they are located. Slope instability created water-bodies mainly located at lower altitudes (2842 m mean altitude), in tectonically weak zones or debris accumulations influenced by intense slope dynamics. These water-bodies are therefore more fragile, and their existence is probably more short-lived than that of the bodies created under glacial conditions.


2021 ◽  
Author(s):  
Gustavo Otranto-da-Silva

A city's response to a rainfall event depends not only on the spatial-temporal variability of rainfall, but also on the spatial distribution and initial state of its Blue Green Solutions (BGS), such as green roofs. They hold back runoff and may prove to be critically important elements of the blue-green built environment.

The aim of this study was first to adapt an existing hydrological model to the urban area of Melun (France) and validate it, and then to numerically assess an optimal configuration of green roofs to mitigate pluvial floods in particularly vulnerable areas. The main focus was the investigation of interactions between rainfall space-time scales and the resulting hydrological response over fine scales, all controlled through the performance assessment of BGS.

This presentation will particularly illustrate how fractal tools were used to:

- highlight the scale dependency of the input variables and its effects on gridded model performance;
- explore, analyse and represent the influence of BGS location and configuration on the mitigation of runoff associated with short-duration, high-intensity rainfall at the neighborhood scale;
- identify the urban design options that maximize the potential for runoff reduction.

Overall, these results may serve as a reference for upscaling the optimized implementation of BGS in urban areas, by considering other urban infrastructures and their interactions.
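As one example of the kind of fractal tool referred to above, the sketch below estimates a box-counting dimension for a binary raster, such as a gridded map of green-roof pixels, to characterize how occupancy varies across scales. The grid, occupancy fraction, and box sizes are hypothetical assumptions, and this is not the author's code.

```python
import numpy as np

def box_counting_dimension(mask, box_sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the box-counting (fractal) dimension of a binary raster."""
    counts = []
    rows, cols = mask.shape
    for size in box_sizes:
        # count boxes of side `size` that contain at least one occupied cell
        trimmed = mask[:rows // size * size, :cols // size * size]
        boxes = trimmed.reshape(rows // size, size, cols // size, size)
        counts.append(max(boxes.any(axis=(1, 3)).sum(), 1))
    # slope of log(count) vs log(1/size) gives the dimension estimate
    coeffs = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return coeffs[0]

# Hypothetical 128x128 grid with scattered green-roof pixels
rng = np.random.default_rng(1)
grid = rng.random((128, 128)) < 0.05
print(f"Estimated box-counting dimension: {box_counting_dimension(grid):.2f}")
```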


2021 ◽  
Author(s):  
Elzbieta Wisniewski ◽  
Wit Wisniewski

The presented research examines what minimum combination of input variables is required to obtain state-of-the-art fractional snow cover (FSC) estimates for heterogeneous alpine-forested terrains. Currently, one of the most accurate FSC estimators for alpine regions is based on training an Artificial Neural Network (ANN) that can deconvolve the relationships among the numerous compounded and possibly non-linear bio-geophysical relations encountered in alpine terrain. Under the assumption that the ANN optimally extracts the available information from its input data, we can exploit the ANN as a tool to assess the contributions of each of the data sources, and combinations thereof, toward FSC estimation. By assessing the quality of the modeled FSC estimates against ground-equivalent data, suitable combinations of input variables can be identified. High-spatial-resolution IKONOS images are used to estimate snow cover for ANN training and validation, and also for error assessment of the ANN FSC results. Input variables are initially chosen to represent information already incorporated into leading snow cover estimators (e.g., two multispectral bands for NDSI). Additional variables such as topographic slope, aspect, and shadow distribution are evaluated to observe how the ANN accounts for illumination incidence and the directional reflectance of surfaces affecting the viewed radiance in complex terrain. Snow usually covers vegetation and underlying geology only partially, so the ANN also has to resolve spectral mixtures of unobscured surfaces surrounded by snow. Multispectral imagery is therefore acquired in the fall, prior to the first snow of the season, and is included in the ANN analyses to assess the baseline reflectance values of the environment that later become modified by the snow. In this study, nine representative scenarios of input data are selected to analyze FSC performance. Numerous selections of input data combinations produced good results, attesting to the powerful ability of ANNs to extract information and utilize redundancy. The best ANN FSC model performance was achieved when all 15 pre-selected inputs were used. The need for non-linear modeling to estimate FSC was verified by forcing the ANN to behave linearly. The linear ANN model exhibited profoundly decreased FSC performance, indicating that non-linear processing estimates FSC in alpine-forested environments more optimally.
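A minimal sketch of the linear-versus-non-linear comparison described above, using a small scikit-learn MLP with ReLU activations against the same network forced to identity (linear) activations. The 15 input features and the synthetic FSC target are placeholders for the IKONOS-derived training data, and the architecture is an assumption rather than the study's network.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Hypothetical training table: rows are pixels, columns are candidate inputs
# (multispectral bands, NDSI, slope, aspect, shadow flag, fall baseline bands, ...).
rng = np.random.default_rng(7)
X = rng.random((5000, 15))
# Synthetic fractional snow cover with a mildly non-linear dependence on inputs
fsc = np.clip(0.6 * X[:, 0] - 0.3 * X[:, 1] + 0.4 * X[:, 2] * X[:, 3] +
              0.05 * rng.normal(size=5000), 0, 1)

X_tr, X_te, y_tr, y_te = train_test_split(X, fsc, random_state=0)

nonlinear = MLPRegressor(hidden_layer_sizes=(16, 8), activation="relu",
                         max_iter=2000, random_state=0).fit(X_tr, y_tr)
# Forcing the network to behave linearly: identity activations only
linear = MLPRegressor(hidden_layer_sizes=(16, 8), activation="identity",
                      max_iter=2000, random_state=0).fit(X_tr, y_tr)

print(f"R2, non-linear ANN: {nonlinear.score(X_te, y_te):.3f}")
print(f"R2, linear ANN:     {linear.score(X_te, y_te):.3f}")
```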


2021 ◽  
Author(s):  
Sophia Eugeni ◽  
Eric Vaags ◽  
Steven V. Weijs

Accurate hydrologic modelling is critical to effective water resource management. As catchment attributes strongly influence the hydrologic behaviors in an area, they can be used to inform hydrologic models to better predict the discharge in a basin. Some basins may be more difficult to predict accurately than others. The difficulty in predicting discharge may also be related to the complexity of the discharge signal. This study establishes the relationship between a catchment's static attributes and hydrologic model performance in those catchments, and also investigates the link to complexity, which we quantify with measures of compressibility based on information theory.

The project analyzes a large national dataset, comprising catchment attributes for basins across the United States, paired with established performance metrics for corresponding hydrologic models. Principal Component Analysis (PCA) was completed on the catchment attribute data to determine the strongest modes in the input. The basins were clustered according to their catchment attributes, and the performance within the clusters was compared.

Significant differences in model performance emerged between the clusters of basins. For the complexity analysis, details of the implementation and technical challenges will be discussed, as well as preliminary results.
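A minimal sketch of the attribute-PCA-plus-clustering workflow described above, assuming a hypothetical table of catchment attributes and NSE scores; the attribute names, cluster count, and data are illustrative stand-ins for the large national dataset used in the study.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical table: one row per basin with static attributes and an NSE score
rng = np.random.default_rng(3)
n_basins = 300
attributes = pd.DataFrame({
    "mean_elev_m": rng.normal(800, 400, n_basins),
    "mean_precip_mm": rng.normal(900, 300, n_basins),
    "aridity_index": rng.normal(1.0, 0.4, n_basins),
    "forest_frac": rng.uniform(0, 1, n_basins),
    "soil_depth_m": rng.normal(1.5, 0.5, n_basins),
})
nse = pd.Series(rng.uniform(-0.2, 0.9, n_basins), name="nse")

# Strongest modes of the attribute data
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(attributes))

# Cluster basins on their attribute signature, then compare performance per cluster
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print(nse.groupby(clusters).agg(["mean", "median", "count"]))
```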


2020 ◽  
Author(s):  
Sandip Som ◽  
Saibal Ghosh ◽  
Soumitra Dasgupta ◽  
Thrideep Kumar ◽  
J. N. Hindayar ◽  
...  

Modeling landslide susceptibility is one of the important aspects of land-use planning and risk management. Several modeling methods are available, based either on highly specialized knowledge of causative attributes or on good landslide inventory data to use for training and testing during model development. Understandably, these two criteria are rarely available to local land regulators. This paper presents a new model methodology that requires minimum knowledge of causative attributes and does not depend on a landslide inventory. As landslides occur due to the combined effect of causative attributes, this model utilizes the communality (common variance) of the attributes, extracted by exploratory factor analysis and used to calculate a landslide susceptibility index. The model can capture the inter-relationships of the different geo-environmental attributes responsible for landslides, along with the identification and prioritization of attributes by their contribution to model performance, in order to delineate non-performing attributes. Finally, the model performance is compared with the well-established AHP method (knowledge-driven) and the FRM method (data-driven) using cut-off-independent ROC curves, along with cost-effectiveness. The model performs almost at par with the established models while requiring minimal modeling expertise. The findings and results of the present work will be helpful to town planners and engineers at a regional scale for generalized planning and assessment.
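A minimal sketch of how communalities can be extracted from an exploratory factor analysis and combined into a susceptibility index, using scikit-learn. The attribute set, the synthetic data, and the communality-weighted aggregation are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

# Hypothetical stack of geo-environmental attributes sampled per pixel
rng = np.random.default_rng(11)
n_pixels = 2000
base = rng.normal(size=(n_pixels, 2))                   # two latent controls
attrs = pd.DataFrame({
    "slope":         1.0 * base[:, 0] + 0.3 * rng.normal(size=n_pixels),
    "relief":        0.9 * base[:, 0] + 0.4 * rng.normal(size=n_pixels),
    "drainage_den":  0.8 * base[:, 1] + 0.5 * rng.normal(size=n_pixels),
    "lineament_den": 0.7 * base[:, 1] + 0.6 * rng.normal(size=n_pixels),
    "rainfall":      rng.normal(size=n_pixels),          # mostly unique variance
})

Z = StandardScaler().fit_transform(attrs)
fa = FactorAnalysis(n_components=2, random_state=0).fit(Z)

# Communality of each attribute = variance it shares with the common factors
loadings = fa.components_.T                              # (n_attributes, n_factors)
communality = pd.Series((loadings ** 2).sum(axis=1), index=attrs.columns)
print(communality.sort_values(ascending=False))

# Illustrative susceptibility index: communality-weighted sum of standardized attributes
lsi = Z @ (communality / communality.sum()).values
print("LSI range:", lsi.min().round(2), "to", lsi.max().round(2))
```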

