When Enough Is Really Enough? On the Minimum Number of Landslides to Build Reliable Susceptibility Models

Geosciences, 2021, Vol 11 (11), pp. 469
Author(s): Giacomo Titti, Cees van Westen, Lisa Borgatti, Alessandro Pasuto, Luigi Lombardo

Mapping existing landslides is a fundamental prerequisite for building any reliable susceptibility model. From a series of landslide presence/absence conditions and the associated landscape characteristics, a binary classifier learns to distinguish potentially stable from unstable slopes. Even in data-rich areas where landslide inventories are available, compiling them can be a challenging task. In data-scarce contexts, however, where geoscientists have no access to pre-existing inventories, the only solution is to map landslides from scratch. This operation is extremely time-consuming if performed manually, and prone to type I errors if done automatically; the problem is exacerbated over large geographic regions. In this manuscript we examine the mapping requirements for west Tajikistan, where no complete landslide inventory is available. The key question is: how many landslides are required to develop a reliable landslide susceptibility model based on statistical modeling? For such a wide and extremely complex territory, collecting a sufficiently detailed inventory requires a large investment of time and human resources. At which point in the mapping procedure, then, does the resulting susceptibility model produce significantly better results than a model built with less information? We addressed this question by implementing a binomial Generalized Additive Model trained and validated with different landslide proportions, and measured the induced variability in the resulting susceptibility model. The results of this study are site-specific, but we propose a functional protocol to investigate a problem that is underestimated in the literature.
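The protocol described in the abstract, training the same classifier on progressively larger shares of the inventory and checking when its predictive skill stabilises, can be sketched with synthetic data. Everything below is hypothetical: the two terrain covariates are made up, and a plain logistic regression stands in for the paper's binomial Generalized Additive Model.

```python
import math
import random

random.seed(42)

# Hypothetical inventory: two scaled terrain covariates (a slope proxy
# and a wetness proxy) drive instability through a logistic model.
def make_data(n):
    X, y = [], []
    for _ in range(n):
        s = random.uniform(0, 6)   # scaled slope proxy
        w = random.uniform(0, 6)   # scaled wetness proxy
        p = 1 / (1 + math.exp(-(1.5 * s + 0.5 * w - 6)))
        X.append((s, w))
        y.append(1 if random.random() < p else 0)
    return X, y

def fit_logistic(X, y, lr=0.3, epochs=2000):
    """Full-batch gradient descent on the logistic log-likelihood."""
    w1 = w2 = b = 0.0
    n = len(X)
    for _ in range(epochs):
        g1 = g2 = gb = 0.0
        for (s, wn), yi in zip(X, y):
            p = 1 / (1 + math.exp(-(w1 * s + w2 * wn + b)))
            g1 += (p - yi) * s
            g2 += (p - yi) * wn
            gb += (p - yi)
        w1 -= lr * g1 / n
        w2 -= lr * g2 / n
        b -= lr * gb / n
    return w1, w2, b

def auc(scores, labels):
    """Rank-based AUC: probability a landslide cell outranks a stable one."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 for p in pos for q in neg if p > q)
    ties = sum(1.0 for p in pos for q in neg if p == q)
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

X_train, y_train = make_data(400)
X_test, y_test = make_data(400)

# Train with growing fractions of the inventory and track how the
# test-set skill of the susceptibility scores stabilises.
aucs = {}
for frac in (0.1, 0.25, 0.5, 1.0):
    k = int(len(X_train) * frac)
    w1, w2, b = fit_logistic(X_train[:k], y_train[:k])
    scores = [1 / (1 + math.exp(-(w1 * s + w2 * wn + b))) for s, wn in X_test]
    aucs[frac] = auc(scores, y_test)
    print(f"{frac:>4.0%} of inventory -> test AUC {aucs[frac]:.3f}")
```

Once the AUC curve flattens as the mapped fraction grows, additional mapping effort yields diminishing returns for the susceptibility model, which is the kind of stopping criterion the abstract asks about.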

Methodology, 2015, Vol 11 (3), pp. 110-115
Author(s): Rand R. Wilcox, Jinxia Ma

Abstract. The paper compares methods that allow both within-group and between-group heteroscedasticity when performing all pairwise comparisons of the least squares regression lines associated with J independent groups. The methods are based on a simple extension of results derived by Johansen (1980) and Welch (1938), in conjunction with the HC3 and HC4 estimators. The probability of one or more Type I errors is controlled using Hochberg's (1988) improvement on the Bonferroni method. Results are illustrated using data from the Well Elderly 2 study, which motivated this paper.
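Hochberg's (1988) improvement on the Bonferroni method, referenced in the abstract, is a step-up procedure over the sorted p-values. A minimal sketch, with hypothetical p-values in place of the paper's pairwise-comparison results:

```python
def hochberg(pvalues, alpha=0.05):
    """Hochberg (1988) step-up procedure.

    Sort the p-values in ascending order, find the largest k such that
    p_(k) <= alpha / (m - k + 1), and reject hypotheses 1..k.  This
    controls the familywise error rate at alpha (under independence or
    positive dependence) and is uniformly more powerful than the
    classical Bonferroni correction.
    """
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    reject = [False] * m
    # Scan from the largest p-value downwards (step-up).
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        if pvalues[i] <= alpha / (m - rank + 1):
            for j in order[:rank]:
                reject[j] = True
            break
    return reject

# Hypothetical p-values from four pairwise comparisons of regression lines.
pvals = [0.009, 0.013, 0.040, 0.320]
print(hochberg(pvals))  # -> [True, True, False, False]
```

For these four p-values, plain Bonferroni at 0.05/4 = 0.0125 would reject only the first hypothesis; the step-up also rejects the second (0.013 ≤ 0.05/3), illustrating the power gain over Bonferroni.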


2020, Vol 39 (3), pp. 185-208
Author(s): Qiao Xu, Rachana Kalelkar

SUMMARY: This paper examines whether inaccurate going-concern opinions negatively affect an audit office's reputation. Assuming that clients perceive going-concern opinion errors as a systematic audit-quality concern within the entire audit office, we expect these inaccuracies to affect the office's market share and dismissal rate. We find that going-concern opinion inaccuracy is negatively associated with audit office market share and positively associated with the audit office dismissal rate. Furthermore, we find that the decline in market share and the increase in dismissal rate are primarily associated with Type I errors. Additional analyses reveal that the negative consequences of going-concern opinion inaccuracy are lower for Big 4 audit offices. Finally, we find that the decrease in audit office market share is explained by distressed clients' reactions to Type I errors and by the audit offices' reduced ability to attract new clients.


Risks, 2021, Vol 9 (3), pp. 53
Author(s): Yves Staudt, Joël Wagner

For calculating non-life insurance premiums, actuaries traditionally rely on separate severity and frequency models that use covariates to explain the claims loss exposure. In this paper, we focus on the claim severity. First, we build two reference models, a generalized linear model and a generalized additive model, relying on a log-normal distribution of the severity and including the most significant factors; the continuous variables are related to the response in a nonlinear way. In the second step, we tune two random forest models, one for the claim severity and one for the log-transformed claim severity, where the latter requires a back-transformation of the predicted results. We compare the prediction performance of the different models using the relative error, the root mean squared error and goodness-of-lift statistics in combination with goodness-of-fit statistics. In our application, we rely on a dataset of a Swiss collision insurance portfolio covering the loss exposure of the period from 2011 to 2015 and including observations from 81 309 settled claims with a total amount of CHF 184 million. In the analysis, we use the data from 2011 to 2014 for training and the data from 2015 for testing. Our results indicate that the log-normal transformation of the severity does not lead to performance gains with random forests; however, random forests with a log-normal transformation are the preferred choice for explaining right-skewed claims. Finally, when considering all indicators, we conclude that the generalized additive model has the best overall performance.
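The back-transformation required for the log-transformed severity model hides a subtle bias: if log-severity is approximately normal with mean mu and variance sigma², the mean severity is exp(mu + sigma²/2), not exp(mu), so naively exponentiating a log-scale prediction recovers the median rather than the mean. A minimal sketch with synthetic claim data (hypothetical parameters, not the paper's Swiss portfolio):

```python
import math
import random

random.seed(7)

# Hypothetical claim severities: log-normal with mu=8, sigma=1.2.
mu, sigma = 8.0, 1.2
claims = [math.exp(random.gauss(mu, sigma)) for _ in range(50_000)]

# Fit on the log scale, as for the log-transformed severity model.
logs = [math.log(c) for c in claims]
mu_hat = sum(logs) / len(logs)
var_hat = sum((l - mu_hat) ** 2 for l in logs) / (len(logs) - 1)

naive = math.exp(mu_hat)                    # plain back-transform (median)
corrected = math.exp(mu_hat + var_hat / 2)  # log-normal mean correction
empirical = sum(claims) / len(claims)       # observed mean severity

print(f"naive back-transform : {naive:12.0f}")
print(f"bias-corrected       : {corrected:12.0f}")
print(f"empirical mean       : {empirical:12.0f}")
```

With sigma = 1.2 the naive back-transform understates the mean severity by a factor of exp(sigma²/2) ≈ 2, which for premium calculation is a material error; the corrected estimate lands close to the empirical mean.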


2019, Vol 7 (1), pp. 1597956
Author(s): Carlos Valencia, Sergio Cabrales, Laura Garcia, Juan Ramirez, Diego Calderona, ...

AMBIO, 2021
Author(s): Alessandro Orio, Yvette Heimbrand, Karin Limburg

Abstract: The intensified expansion of the Baltic Sea's hypoxic zone has been proposed as one reason for the current poor status of cod (Gadus morhua) in the Baltic Sea, with repercussions throughout the food web and on ecosystem services. We examined the links between increased hypoxic areas and the decline in maximum length of Baltic cod, a demographic proxy for services generation. We analysed the effect of different predictors on the maximum length of Baltic cod during 1978–2014 using a generalized additive model. The extent of minimally suitable areas for cod (oxygen concentration ≥ 1 ml l⁻¹) is the most important predictor of decreased cod maximum length. We also show, with simulations, the potential for Baltic cod to increase its maximum length if the hypoxic areal extent is reduced to levels comparable to the beginning of the 1990s. We discuss our findings in relation to the ecosystem services affected by the decrease in cod maximum length.

