The influence of grid resolution on the prediction of natural and road-related shallow landslides

2014 ◽  
Vol 18 (6) ◽  
pp. 2127-2139 ◽  
Author(s):  
D. Penna ◽  
M. Borga ◽  
G. T. Aronica ◽  
G. Brigandì ◽  
P. Tarolli

Abstract. This work evaluates the predictive power of the quasi-dynamic shallow landslide model QD-SLaM to simulate shallow landslide locations in a small-scale Mediterranean landscape, namely, the lower portion (2.6 km²) of the Giampilieri catchment, located in Sicily (Italy). The catchment was impacted by a sequence of high-intensity storms over the years 2007–2009, resulting in widespread landsliding, with a total landslide initiation area amounting to 2.6% of the basin area. The effect of high-resolution digital terrain models (DTMs) on the quality of model predictions is tested by considering four DTM resolutions: 2, 4, 10 and 20 m. Moreover, the impact of the dense forest road network on model performance is evaluated by considering road-related landslides and natural landslides separately. The landslide model does not incorporate a description of road-related failures and is applied without calibration of the model parameters. The model's predictive power is shown to be DTM-resolution dependent. Use of a coarser resolution has a smoothing effect on terrain attributes, with local slope angles decreasing and contributing areas becoming larger. The percentage of watershed area represented by the model as unconditionally unstable (i.e. failing even without the addition of water from precipitation) ranges between 6.3% at 20 m resolution and 13.8% at 2 m resolution, an overestimation of the mapped landslide area. We consider this prediction an indication of sites likely to fail in future storms rather than of areas proven stable during previous storms. When assessed over the sample of mapped non-road-related landslides, better model performance is reported for the 4 and 10 m DTM resolutions, highlighting that higher DTM resolution does not necessarily mean better model performance. Model performance for road-related failures is lower than for the natural cases and increases slightly with decreasing DTM resolution. These findings indicate that, to realize the full potential of high-resolution topography, more extensive work is needed, aimed specifically at identifying the extent of artificial structures and their impact on shallow landsliding processes.
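As an editorial illustration (not part of the study), the sketch below shows the smoothing effect described above: block-averaging a fine DTM to a coarser grid lowers local slope angles. The synthetic terrain, cell sizes, and roughness amplitude are hypothetical.

```python
import numpy as np

def slope_deg(dtm, cell):
    """Slope angle in degrees from a square-grid DTM with spacing `cell` (m)."""
    dz_dy, dz_dx = np.gradient(dtm, cell)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

def coarsen(dtm, factor):
    """Block-average a fine DTM over factor x factor cells (simple coarsening)."""
    n = (dtm.shape[0] // factor) * factor
    d = dtm[:n, :n]
    return d.reshape(n // factor, factor, n // factor, factor).mean(axis=(1, 3))

# Synthetic rough hillslope sampled at 2 m resolution (hypothetical values)
rng = np.random.default_rng(0)
x = np.arange(0, 512, 2.0)
fine = 0.4 * x[None, :] + 5.0 * rng.standard_normal((256, 256))  # planar slope + roughness

coarse = coarsen(fine, 10)  # 2 m cells block-averaged to 20 m cells
print("mean slope at  2 m:", round(slope_deg(fine, 2.0).mean(), 1), "deg")
print("mean slope at 20 m:", round(slope_deg(coarse, 20.0).mean(), 1), "deg")
# Coarsening averages out steep local facets, so the mean slope angle drops.
```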

2013 ◽  
Vol 10 (7) ◽  
pp. 9761-9798 ◽  
Author(s):  
D. Penna ◽  
M. Borga ◽  
G. T. Aronica ◽  
G. Brigandì ◽  
P. Tarolli

Abstract. This work evaluates the predictive power of the quasi-dynamic shallow landslide model QD-SLaM to simulate shallow landslide locations in a small-scale Mediterranean landscape: the Giampilieri catchment, located in Sicily (Italy). The catchment was impacted by a sequence of high-intensity storms over the years 2007–2009. The effect of high-resolution digital terrain models (DTMs) on the quality of model predictions is tested by considering four DTM resolutions: 2 m, 4 m, 10 m and 20 m. Moreover, the impact of the dense forest road network on model performance is evaluated by considering road-related and natural landslides separately. The landslide model does not incorporate a description of road-related failures. The model's predictive power is shown to be DTM-resolution dependent. When assessed over the sample of mapped natural landslides, better model performance is reported for the 4 m and 10 m DTM resolutions, highlighting that higher DTM resolution does not necessarily mean better model performance. Model performance for road-related failures is, as expected, lower than for the other cases. These findings show that shallow landslide predictive power can benefit from increasing DTM resolution only when the model is able to describe the physical processes emerging at the smaller spatial scales resolved by the digital topography. Model results also show that the combined use of high DTM resolution and a model capable of dealing with road-related processes may lead to substantially better performance in landscapes where forest roads are a significant factor in slope stability.


Hydrology ◽  
2021 ◽  
Vol 8 (3) ◽  
pp. 102
Author(s):  
Frauke Kachholz ◽  
Jens Tränckner

Land use changes influence the water balance and often increase surface runoff. The resulting impacts on river flow, water levels, and flooding should be identified beforehand, in the spatial planning phase. In two consecutive papers, we develop a model-based decision support system for quantifying the hydrological and stream-hydraulic impacts of land use changes. Part 1 presents the semi-automatic set-up of physically based hydrological and hydraulic models on the basis of geodata analysis for the current state. Appropriate hydrological model parameters for ungauged catchments are derived by transfer from a calibrated model. In the lowland river basins considered, parameters of surface and groundwater inflow turned out to be particularly important. While the calibration delivers very good to good model results for flow (Evol = 2.4%, R = 0.84, NSE = 0.84), the model performance is good to satisfactory (Evol = −9.6%, R = 0.88, NSE = 0.59) in a different river system parametrized with the transfer procedure. After transferring the concept to a larger area with various small rivers, the current state is analyzed by running simulations based on statistical rainfall scenarios. Results include watercourse-section-specific capacities and excess volumes in case of flooding. The developed approach can relatively quickly generate physically reliable and spatially high-resolution results. Part 2 builds on the data generated in Part 1 and presents the subsequent approach to assess the hydrologic/hydrodynamic impacts of potential land use changes.
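For readers who want to reproduce the reported goodness-of-fit measures, a minimal sketch of the three statistics (volume error Evol, Pearson correlation R, Nash–Sutcliffe efficiency NSE) is given below; the discharge series are placeholders, not data from the study, and the exact definitions used by the authors may differ slightly.

```python
import numpy as np

def volume_error_percent(obs, sim):
    """Relative volume (water balance) error in percent."""
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

def pearson_r(obs, sim):
    """Pearson correlation coefficient between observed and simulated series."""
    return np.corrcoef(obs, sim)[0, 1]

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 = perfect fit, 0 = no better than the observed mean."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Placeholder discharge series in m^3/s (not the study's data)
obs = np.array([1.2, 1.5, 2.3, 3.1, 2.2, 1.8, 1.4])
sim = np.array([1.1, 1.6, 2.1, 3.3, 2.4, 1.7, 1.3])
print(f"Evol = {volume_error_percent(obs, sim):+.1f}%, "
      f"R = {pearson_r(obs, sim):.2f}, NSE = {nse(obs, sim):.2f}")
```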


2017 ◽  
Vol 10 (3) ◽  
pp. 1383-1402 ◽  
Author(s):  
Paolo Davini ◽  
Jost von Hardenberg ◽  
Susanna Corti ◽  
Hannah M. Christensen ◽  
Stephan Juricke ◽  
...  

Abstract. The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 km to 16 km). The project includes more than 120 simulations in both a historical scenario (1979–2008) and a climate change projection (2039–2068), together with coupled transient runs (1850–2100). A total of 20.4 million core hours have been used, made available through a single-year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored on the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including details on the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking with increasing resolution is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate – specifically the Madden–Julian Oscillation and tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).
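As a toy illustration of what "stochastic physics" means in this context (not the EC-Earth scheme itself), the sketch below multiplies a parameterised tendency by a smoothly evolving random field, in the spirit of stochastically perturbed parameterisation tendencies; the grid size, amplitude, and AR(1) memory are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def ar1_step(prev, std=0.3, autocorr=0.95):
    """Advance a random perturbation field with AR(1) memory in time."""
    noise = rng.standard_normal(prev.shape) * std * np.sqrt(1.0 - autocorr ** 2)
    return autocorr * prev + noise

# Hypothetical grid and deterministic temperature tendency (K/s) from a parameterisation
shape = (64, 128)
tendency = 1e-5 * np.ones(shape)
perturbation = np.zeros(shape)

for step in range(10):
    perturbation = ar1_step(perturbation)
    # Multiplicative perturbation, clipped so the tendency cannot reverse sign
    stochastic_tendency = tendency * np.clip(1.0 + perturbation, 0.0, 2.0)

print("perturbation std:", round(float(perturbation.std()), 3))
print("tendency range (K/s):", stochastic_tendency.min(), stochastic_tendency.max())
```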


2016 ◽  
Author(s):  
Paolo Davini ◽  
Jost von Hardenberg ◽  
Susanna Corti ◽  
Hannah M. Christensen ◽  
Stephan Juricke ◽  
...  

Abstract. The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth-System Model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 km to 16 km). The project includes more than 120 simulations in both a historical scenario (1979–2008) and a climate change projection (2039–2068), together with coupled transient runs (1850–2100). A total of 20.4 million core hours have been used, made available through a single-year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PBytes of output data have been produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Center (LRZ) in Garching, Germany. About 140 TBytes of post-processed data are stored on the CINECA supercomputing center archives and are freely accessible to the community thanks to an EUDAT Data Pilot project. This paper presents the technical and scientific setup of the experiments, including details on the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given: an improvement in the simulation of Euro-Atlantic atmospheric blocking with increasing resolution is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate – specifically the Madden–Julian Oscillation and tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).


2016 ◽  
Author(s):  
R. J. Haarsma ◽  
M. Roberts ◽  
P. L. Vidale ◽  
C. A. Senior ◽  
A. Bellucci ◽  
...  

Abstract. Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate time scales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centers and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other MIPs. Increases in High Performance Computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal resolution simulation in the atmosphere and ocean. The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950–2050, with the possibility to extend to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6 endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, "what are the origins and consequences of systematic model biases?", but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.


2019 ◽  
Vol 491 (2) ◽  
pp. 1600-1621
Author(s):  
Yi Mao ◽  
Jun Koda ◽  
Paul R Shapiro ◽  
Ilian T Iliev ◽  
Garrelt Mellema ◽  
...  

ABSTRACT Cosmic reionization was driven by the imbalance between early sources and sinks of ionizing radiation, both of which were dominated by small-scale structure and are thus usually treated in cosmological reionization simulations by subgrid modelling. The recombination rate of intergalactic hydrogen is customarily boosted by a subgrid clumping factor, ⟨n²⟩/⟨n⟩², which corrects for unresolved fluctuations in gas density n on scales below the grid spacing of coarse-grained simulations. We investigate in detail the impact of this inhomogeneous subgrid clumping on reionization and its observables, as follows: (1) Previous attempts generally underestimated the clumping factor because of insufficient mass resolution. We perform a high-resolution N-body simulation that resolves haloes down to the pre-reionization Jeans mass to derive the time-dependent, spatially varying local clumping factor and a fitting formula for its correlation with local overdensity. (2) We then perform a large-scale N-body and radiative transfer simulation that accounts for this inhomogeneous subgrid clumping by applying this clumping factor–overdensity correlation. Boosting recombination significantly slows the expansion of ionized regions, which delays completion of reionization and suppresses 21 cm power spectra on large scales in the later stages of reionization. (3) We also consider a simplified prescription in which the globally averaged, time-evolving clumping factor from the same high-resolution N-body simulation is applied uniformly to all cells in the reionization simulation, instead. Observables computed with this model agree fairly well with those from the inhomogeneous clumping model, e.g. predicting 21 cm power spectra to within 20 per cent error, suggesting it may be a useful approximation.
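A minimal sketch (hypothetical field and numbers, not the simulations described above) of how a per-cell subgrid clumping factor C = ⟨n²⟩/⟨n⟩² can be obtained by coarse-graining a high-resolution density field onto the cells of a coarser reionization grid.

```python
import numpy as np

def clumping_factor(density, factor):
    """Per coarse cell clumping factor <n^2>/<n>^2 over factor^3 fine cells."""
    s = (density.shape[0] // factor) * factor
    d = density[:s, :s, :s]
    blocks = d.reshape(s // factor, factor, s // factor, factor, s // factor, factor)
    mean_n = blocks.mean(axis=(1, 3, 5))
    mean_n2 = (blocks ** 2).mean(axis=(1, 3, 5))
    return mean_n2 / mean_n ** 2

# Hypothetical lognormal field standing in for high-resolution gas density
rng = np.random.default_rng(1)
n = np.exp(rng.normal(0.0, 0.8, size=(64, 64, 64)))
C = clumping_factor(n, 8)  # 8^3 coarse cells
print("C mean/min/max:", C.mean(), C.min(), C.max())  # C >= 1; larger where density fluctuates more
```

In a reionization code, the recombination rate in each coarse cell would then be boosted by its local C, which is the role the clumping factor plays in the abstract above.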


2018 ◽  
Vol 15 (8) ◽  
pp. 2525-2549 ◽  
Author(s):  
Anne Peukert ◽  
Timm Schoening ◽  
Evangelos Alevizos ◽  
Kevin Köser ◽  
Tom Kwasnitschka ◽  
...  

Abstract. In this study, ship- and autonomous underwater vehicle (AUV)-based multibeam data from the German ferromanganese-nodule (Mn-nodule) license area in the Clarion–Clipperton Zone (CCZ; eastern Pacific) are linked to ground-truth data from optical imaging. Photographs obtained by an AUV enable semi-quantitative assessments of nodule coverage at a spatial resolution in the range of meters. Together with high-resolution AUV bathymetry, this revealed a correlation of small-scale terrain variations (< 5 m horizontally, < 1 m vertically) with nodule coverage. In the presented data set, increased nodule coverage could be correlated with slopes > 1.8° and concave terrain. On a more regional scale, factors such as the geological setting (existence of horst and graben structures, sediment thickness, outcropping basement) and influence of bottom currents seem to play an essential role for the spatial variation of nodule coverage and the related hard substrate habitat. AUV imagery was also successfully employed to map the distribution of resettled sediment following a disturbance and sediment cloud generation during a sampling deployment of an epibenthic sledge. Data from before and after the "disturbance" allow a direct assessment of the impact. Automated image processing analyzed the nodule coverage at the seafloor, revealing nodule blanketing by resettling of suspended sediment within 16 h after the disturbance. The visually detectable impact was spatially limited to a maximum of 100 m distance from the disturbance track, downstream of the bottom water current. A correlation with high-resolution AUV bathymetry reveals that the blanketing pattern varies in extent by tens of meters, strictly following the bathymetry, even in areas of only slightly undulating seafloor (< 1 m vertical change). These results highlight the importance of detailed terrain knowledge when engaging in resource assessment studies for nodule abundance estimates and defining mineable areas. At the same time, it shows the importance of high-resolution mapping for detailed benthic habitat studies that show a heterogeneity at scales of 10 to 100 m. Terrain knowledge is also needed to determine the scale of the impact by seafloor sediment blanketing during mining operations.
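The "automated image processing" is described only at a high level; as a hedged illustration of the idea, per-photo nodule coverage could be estimated by thresholding a grayscale image, since nodules appear darker than the surrounding sediment. The threshold value and the synthetic image below are assumptions, not the authors' pipeline.

```python
import numpy as np

def nodule_coverage(gray_image, threshold=90):
    """Fraction of pixels classified as nodule (dark) in an 8-bit grayscale seafloor photo."""
    return (gray_image < threshold).mean()   # nodules are darker than sediment

# Hypothetical image: bright sediment background with darker circular "nodules"
rng = np.random.default_rng(7)
img = rng.normal(170, 10, size=(512, 512))
yy, xx = np.mgrid[:512, :512]
for cx, cy, r in rng.integers(20, 490, size=(40, 3)):
    img[(xx - cx) ** 2 + (yy - cy) ** 2 < (r % 15 + 5) ** 2] = 60

print(f"estimated nodule coverage: {nodule_coverage(img):.1%}")
```

Applying the same estimator to photos taken before and after a disturbance would quantify blanketing as a drop in detectable coverage, in the spirit of the before/after comparison reported above.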


2015 ◽  
Vol 52 (9) ◽  
pp. 1360-1373 ◽  
Author(s):  
Valentin S. Gischig ◽  
Oldrich Hungr ◽  
Andrew Mitchell ◽  
Franck Bourrier

The use of dynamic computational methods has become indispensable for addressing problems related to rockfall hazard. Although a number of models with various degrees of complexity are available, model parameters are rarely calibrated against observations from rockfall experiments. A major difficulty lies in reproducing the apparent randomness of the impact process related to both ground and block irregularities. Calibration of rigorous methods capable of explicitly modeling trajectories and impact physics of irregular blocks is difficult, as parameter spaces become too vast and the quality of model input and observation data is insufficient. The model presented here returns to the simple "lumped-mass" approach and simulates the characteristic randomness of rockfall impact as a stochastic process. Despite similarities to existing approaches, the model presented here incorporates several novel concepts: (i) ground roughness and particle roughness are represented as a random change of slope angle at impact; (ii) lateral deviations of rebound direction from the trajectory plane at impact are similarly accounted for by perturbing the ground orientation laterally, thus inducing scatter of runout directions; and (iii) a hyperbolic relationship connects restitution factors to impact deformation energy. With these features, the model is capable of realistically accounting for the influence of particle mass on dynamic behaviour. The model requires only four input parameters, rendering it flexible for calibration against observed datasets. In this study, we calibrate the model against observations from the rockfall test site at Vaujany in France. The model is able to reproduce observed distributions of velocity, jump heights, and runout at observation points. In addition, the spatial distribution of the trajectories and landing points has been successfully simulated. Different parameter sets have been used for different ground materials such as an avalanche channel, a forest road, and a talus cone. Further calibration of the new model against a range of field datasets is essential. This study is part of an extensive calibration program that is still in progress at this first presentation of the method, and it focuses on fine-tuning the details of the stochastic process implemented in both two-dimensional (2D) and three-dimensional (3D) versions of the model.
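A minimal 2D lumped-mass sketch of the stochastic idea summarized above: at each impact the contact-plane angle is perturbed by a random roughness term and the rebound is scaled by restitution factors. The slope geometry, parameter values, constant restitution, and stopping criterion are illustrative assumptions, not the calibrated model (which, per the abstract, ties restitution to impact deformation energy via a hyperbolic relationship).

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_rockfall(slope_deg=35.0, roughness_deg=4.0, rn=0.35, rt=0.80,
                      t_max=20.0, dt=0.002, g=9.81):
    """2D lumped-mass rockfall on a planar slope y = -tan(slope)*x.
    At impact, the contact-plane angle is perturbed randomly (ground + block
    roughness); rn, rt are normal/tangential restitution factors."""
    beta = np.radians(slope_deg)
    x, y = 0.0, 2.0                       # release point 2 m above the slope
    vx, vy = 0.0, 0.0
    impacts = []
    for _ in range(int(t_max / dt)):
        vy -= g * dt                      # free flight under gravity
        x += vx * dt
        y += vy * dt
        if y <= -np.tan(beta) * x:        # impact with the slope surface
            y = -np.tan(beta) * x
            theta = beta + np.radians(roughness_deg) * rng.standard_normal()
            t_hat = np.array([np.cos(theta), -np.sin(theta)])   # downslope tangent
            n_hat = np.array([np.sin(theta), np.cos(theta)])    # outward normal
            v = np.array([vx, vy])
            vx, vy = rt * (v @ t_hat) * t_hat - rn * (v @ n_hat) * n_hat
            impacts.append(x)
            if np.hypot(vx, vy) < 0.5:    # block has effectively stopped
                break
    return impacts

runouts = [simulate_rockfall()[-1] for _ in range(100)]
print("horizontal runout (m): mean", round(np.mean(runouts), 1),
      "std", round(np.std(runouts), 1))
```

The run-to-run scatter in runout comes entirely from the random slope-angle perturbation at impact, which is the mechanism the model uses to represent ground and block irregularity.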


2016 ◽  
Vol 43 (3) ◽  
pp. 342-355 ◽  
Author(s):  
Liyuan Sun ◽  
Yadong Zhou ◽  
Xiaohong Guan

Understanding information propagation in online social networks is important in many practical applications and is of great interest to many researchers. The challenge with the existing propagation models lies in their requirements of a complete network structure, topic-dependent model parameters, an assumption of isolated topic spread, etc. In this paper, we study the characteristics of multi-topic information propagation based on data collected from Sina Weibo, one of the most popular microblogging services in China. We find that the daily total amount of user resources is finite and that users' attention transfers from one topic to another. This provides evidence of competition among multiple dynamic topics. Based on these empirical observations, we develop a competition-based multi-topic information propagation model that does not require the social network structure. The model is built on general mechanisms of resource competition, i.e. attracting and distracting users' attention, and considers the interactions of multiple topics. Simulation results show that the model can effectively produce topics with temporal popularity similar to the real data. The impact of the model parameters is also analysed. It is found that the topic arrival rate reflects the strength of competition, and that topic fitness is significant in modelling small-scale topic propagation.
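As a hedged sketch of the competition mechanism described above (not the authors' model), the code below splits a fixed daily budget of user attention among active topics in proportion to their fitness and remaining interest, so each newly arriving topic distracts attention from older ones. The arrival rate, fitness distribution, and decay factor are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_topics(days=30, daily_attention=10_000, arrival_rate=0.5, decay=0.8):
    """Competition-based multi-topic sketch: finite daily attention is split
    among active topics in proportion to fitness * remaining interest."""
    topics = []    # each topic: [fitness, remaining interest]
    history = []
    for _ in range(days):
        for _ in range(rng.poisson(arrival_rate)):       # new topics arrive (Poisson)
            topics.append([rng.exponential(1.0), 1.0])   # random fitness, full interest
        weights = np.array([f * r for f, r in topics])
        if weights.size and weights.sum() > 0:
            shares = weights / weights.sum()
        else:
            shares = weights
        history.append(shares * daily_attention)          # attention each topic gets today
        for topic in topics:
            topic[1] *= decay                              # interest fades over time
    return history

for day, volumes in enumerate(simulate_topics()):
    if volumes.size:
        print(day, np.round(volumes).astype(int))
```

A higher arrival rate puts more topics in simultaneous competition for the same attention budget, which corresponds to the "strength of competition" effect noted in the abstract.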


2016 ◽  
Vol 9 (11) ◽  
pp. 4185-4208 ◽  
Author(s):  
Reindert J. Haarsma ◽  
Malcolm J. Roberts ◽  
Pier Luigi Vidale ◽  
Catherine A. Senior ◽  
Alessio Bellucci ◽  
...  

Abstract. Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs). Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean. The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950–2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6 endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, “what are the origins and consequences of systematic model biases?”, but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.

