Monte Carlo calibration of avalanches described as Coulomb fluid flows

Author(s):  
Christophe Ancey

The idea that snow avalanches might behave as granular flows, and thus be described as Coulomb fluid flows, came up very early in the scientific study of avalanches, but only recently has field evidence been provided that demonstrates the reliability of this idea. This paper aims to specify the bulk frictional behaviour of snow avalanches by seeking a universal friction law. Since the bulk friction coefficient cannot be measured directly in the field, it must be calibrated by adjusting the model outputs to closely match the recorded data. Field data are readily available but of poor quality and accuracy. We used Bayesian inference techniques to specify the model uncertainty relative to data uncertainty and to solve the inverse problem robustly and efficiently. A sample of 173 events taken from seven paths in the French Alps was used. The first analysis showed that the friction coefficient behaved as a random variable with a smooth, bell-shaped empirical distribution function. Evidence was provided that the friction coefficient varied with the avalanche volume, but any attempt to fit a one-to-one relationship between friction and volume produced residual errors that could be as large as three times the maximum uncertainty of the field data. A tentative universal friction law is proposed: the friction coefficient is a random variable whose distribution can be approximated by a normal distribution with a volume-dependent mean.
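The calibration strategy described above, treating the friction coefficient as a random variable inferred by matching model output to noisy runout observations, can be sketched with a toy Metropolis sampler. The linear forward model, the observed runout, and all parameter values below are hypothetical stand-ins for the paper's avalanche model and field data:

```python
import random, math

random.seed(0)

# Hypothetical toy forward model: runout distance decreases with friction mu.
# (An illustrative stand-in for a real Coulomb-fluid avalanche model.)
def runout(mu, slope=0.4):
    return 1000.0 * max(slope - mu, 0.0)

obs = 150.0     # one synthetic "observed" runout (m)
sigma = 20.0    # assumed measurement uncertainty (m)

def log_post(mu):
    if not 0.0 < mu < 1.0:          # flat prior on (0, 1)
        return -math.inf
    return -0.5 * ((runout(mu) - obs) / sigma) ** 2   # Gaussian likelihood

# Metropolis sampler for the posterior of mu
mu, samples = 0.3, []
lp = log_post(mu)
for _ in range(20000):
    prop = mu + random.gauss(0.0, 0.02)
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:
        mu, lp = prop, lp_prop
    samples.append(mu)

post = samples[5000:]               # discard burn-in
mean_mu = sum(post) / len(post)
```

With this toy setup the posterior concentrates near the friction value whose predicted runout matches the observation, with a spread inherited from the data uncertainty.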

2020 ◽  
Vol 149 ◽  
pp. 02012
Author(s):  
Boris Dobronets ◽  
Olga Popova

The article deals with the problem of calculating reliable estimates of empirical distribution functions under conditions of small samples and data uncertainty. To study these issues, we develop computational probabilistic analysis as a new area in computational statistics. We propose a new approach based on random interpolation polynomials and order statistics. Arithmetic operations on probability density functions and procedures for constructing probabilistic extensions are used.
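As a minimal illustration of building an empirical distribution function from order statistics with an explicit small-sample reliability bound, the sketch below attaches a Dvoretzky–Kiefer–Wolfowitz confidence band to the ECDF; the DKW band is a standard device used here for illustration, not necessarily the article's own construction:

```python
import math

def ecdf_with_band(sample, alpha=0.05):
    """Empirical CDF from order statistics, with a DKW confidence band."""
    xs = sorted(sample)                                  # order statistics
    n = len(xs)
    eps = math.sqrt(math.log(2.0 / alpha) / (2.0 * n))   # DKW half-width
    steps = [(x, (i + 1) / n) for i, x in enumerate(xs)]
    band = [(x, max(f - eps, 0.0), min(f + eps, 1.0)) for x, f in steps]
    return steps, band

# Tiny sample: the band width shows how little the ECDF can be trusted at n = 5
steps, band = ecdf_with_band([2.1, 0.7, 1.5, 3.2, 1.1])
```

At n = 5 the half-width is about 0.61, so the band is nearly vacuous, which is precisely the small-sample difficulty the article addresses.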


2020 ◽  
Vol 28 (1) ◽  
pp. 35-48
Author(s):  
Ruslan V. Pleshakov

A method for constructing an ensemble of time series trajectories with a non-stationary flow of events and a non-stationary empirical distribution of the values of the observed random variable is described. We consider a special model that is similar in properties to some real processes, such as changes in the price of a financial instrument on an exchange. It is assumed that the random process is represented as a combination of two processes, one stationary and one non-stationary. That is, the length of a run of the most likely event in the sequence (the most likely price change in the sequence of transactions) forms a non-stationary time series, while the length of a run of other events is a stationary random process. The flow of events is assumed to be a non-stationary Poisson process. A software package that solves the problem of modeling an ensemble of trajectories of the observed random variable is described. Both the values of the random variable and the times of occurrence of events are modeled. An example of practical application of the model is given.
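The non-stationary flow of events can be simulated with the standard thinning (Lewis–Shedler) method for a non-homogeneous Poisson process, which is one way to generate such an ensemble of trajectories. The intensity function below is a hypothetical stand-in for time-varying trading activity:

```python
import random, math

random.seed(1)

def nhpp_times(rate, rate_max, horizon):
    """Event times of a non-stationary Poisson process, simulated by thinning.

    Candidates are drawn from a homogeneous process with intensity rate_max
    and accepted with probability rate(t) / rate_max.
    """
    t, times = 0.0, []
    while True:
        t += random.expovariate(rate_max)
        if t > horizon:
            return times
        if random.random() < rate(t) / rate_max:
            times.append(t)

# Hypothetical intensity: event flow peaking mid-session
rate = lambda t: 5.0 + 4.0 * math.sin(math.pi * t / 10.0)
ensemble = [nhpp_times(rate, rate_max=9.0, horizon=10.0) for _ in range(100)]
```

Each trajectory is an ordered list of event times; values of the observed variable would then be attached to these times by the value model.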


1970 ◽  
Vol 1 (12) ◽  
pp. 134
Author(s):  
O.H. Shemdin ◽  
R.M. Forney

A method is proposed to investigate periodic tidal motion in single or multiple basins connected to the ocean by an inlet. Non-sinusoidal tidal motion in the ocean and a square friction law in the inlet are both considered. The method is applied to Boca Raton inlet, Florida. The calculated tidal elevation and velocity in the inlet are found to be in reasonable agreement with measured values. The bottom shear friction coefficient is defined.
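A lumped inlet–basin system of this kind, with the square friction law entering as a u|u| term in the inlet momentum balance, might be sketched as follows. All parameter values are hypothetical, and the forcing is sinusoidal for brevity even though the method itself accommodates non-sinusoidal tides:

```python
import math

# Minimal lumped inlet-basin model with a square (quadratic) friction law.
# All parameter values are hypothetical, chosen only for illustration.
g = 9.81                       # gravity (m/s^2)
L = 500.0                      # inlet length (m)
F = 0.005                      # combined friction/entrance-loss coefficient (1/m)
A_c, A_b = 200.0, 2.0e6        # inlet cross-section (m^2), basin surface area (m^2)
T, amp = 12.42 * 3600.0, 0.5   # M2 tidal period (s), ocean tide amplitude (m)

def step(eta_b, u, t, dt):
    eta_o = amp * math.sin(2.0 * math.pi * t / T)         # ocean tide
    du = (g * (eta_o - eta_b) / L - F * u * abs(u)) * dt  # momentum with u|u| friction
    deta = (A_c / A_b) * u * dt                           # basin continuity
    return eta_b + deta, u + du

eta_b, u, dt = 0.0, 0.0, 10.0
peak = 0.0
for i in range(int(3 * T / dt)):   # integrate over three tidal cycles
    eta_b, u = step(eta_b, u, i * dt, dt)
    peak = max(peak, abs(eta_b))
```

With these parameters the basin responds quasi-statically, so the computed basin amplitude stays close to, but slightly below, the ocean amplitude because of the quadratic friction loss in the inlet.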


1999 ◽  
Vol 26 (3) ◽  
pp. 239 ◽  
Author(s):  
Pertti Hari ◽  
Annikki Mäkelä ◽  
Frank Berninger ◽  
Toivo Pohja

The ‘optimality hypothesis’ of gas exchange in plants has been studied since the 1970s, but testing it in the field has proven difficult. A recent reformulation of the hypothesis, with detailed assumptions on leaf structure, makes it possible to solve the optimisation problem explicitly, such that the predictions of gas exchange are readily testable against field data. This form of the model was tested against field measurements of photosynthesis, transpiration and stomatal conductance in Scots pine (Pinus sylvestris) shoots during three clear summer days. Model parameters were estimated independently from photosynthesis measurements on preceding days. The measurements were carried out at a new field measurement station with a very low level of noise. The predictions of photosynthesis, transpiration and stomatal conductance explained 84–98% of the variance in the data.


2017 ◽  
Vol 823 ◽  
pp. 278-315 ◽  
Author(s):  
A. N. Edwards ◽  
S. Viroulet ◽  
B. P. Kokelaar ◽  
J. M. N. T. Gray

Snow avalanches are typically initiated on marginally stable slopes with a surface layer of fresh snow that may easily be incorporated into them. The erosion of snow at the front is fundamental to the dynamics and growth of snow avalanches and they may rapidly bulk up, making them much more destructive than the initial release. Snow may also deposit at the rear, base and sides of the flow and the net balance of erosion and deposition determines whether an avalanche grows or decays. In this paper, small-scale analogue experiments are performed on a rough inclined plane with a static erodible layer of carborundum grains. The static layer is prepared by slowly closing down a flow from a hopper at the top of the slope. This leaves behind a uniform-depth layer of thickness $h_{stop}$ at a given slope inclination. Due to the hysteresis of the rough bed friction law, this layer can then be inclined to higher angles provided that the thickness does not exceed $h_{start}$, which is the maximum depth that can be held static on a rough bed. An avalanche is then initiated on top of the static layer by releasing a fixed volume of carborundum grains. Depending on the slope inclination and the depth of the static layer, three different behaviours are observed. For initial deposit depths above $h_{stop}$, the avalanche rapidly grows in size by progressively entraining more and more grains at the front and sides, and depositing relatively few particles at the base and tail. This leaves behind a trough eroded to a depth below the initial deposit surface and whose maximal areal extent has a triangular shape. Conversely, a release on a shallower slope, with a deposit of thickness $h_{stop}$, leads to net deposition. This time the avalanche leaves behind a levee-flanked channel, the floor of which lies above the level of the initial deposit and narrows downstream. It is also possible to generate avalanches that have a perfect balance between net erosion and deposition. 
These avalanches propagate perfectly steadily downslope, leaving a constant-width trail with levees flanking a shallow trough cut slightly lower than the initial deposit surface. The cross-section of the trail therefore represents an exact redistribution of the mass reworked from the initial static layer. Granular flow problems involving erosion and deposition are notoriously difficult, because there is no accepted method of modelling the phase transition between static and moving particles. Remarkably, it is shown in this paper that by combining Pouliquen & Forterre’s (J. Fluid Mech., vol. 453, 2002, pp. 133–151) extended friction law with the depth-averaged $\mu(I)$-rheology of Gray & Edwards (J. Fluid Mech., vol. 755, 2014, pp. 503–544) it is possible to develop a two-dimensional shallow-water-like avalanche model that qualitatively captures all of the experimentally observed behaviour. Furthermore, the computed wavespeed, wave peak height and stationary layer thickness, as well as the distance travelled by decaying avalanches, are all in good quantitative agreement with the experiments. This model is therefore likely to have important practical implications for modelling the initiation, growth and decay of snow avalanches for hazard assessment and risk mitigation.
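The hysteresis between $h_{stop}$ and $h_{start}$ invoked above can be illustrated with Pouliquen-type friction fits. The functional forms below are the standard ones, but the parameter values (mu1, mu2, mu3, L_scale) are hypothetical rather than those of the paper:

```python
import math

# Sketch of a hysteretic basal friction law of the h_stop / h_start type.
mu1 = math.tan(math.radians(21.0))   # friction limit for thick flows
mu2 = math.tan(math.radians(31.0))   # friction limit for thin flows
mu3 = math.tan(math.radians(23.0))   # static limit; mu3 > mu1 gives hysteresis
L_scale = 1.0e-3                     # depth scale of the fit (m)

def mu_stop(h):
    """Friction at flow arrest: sets the deposit thickness h_stop."""
    return mu1 + (mu2 - mu1) / (1.0 + h / L_scale)

def mu_start(h):
    """Friction needed to remobilise a static layer: sets h_start > h_stop."""
    return mu3 + (mu2 - mu1) / (1.0 + h / L_scale)

def h_stop(theta_deg):
    """Deposit thickness left behind on a slope of angle theta."""
    t = math.tan(math.radians(theta_deg))
    return L_scale * ((mu2 - mu1) / (t - mu1) - 1.0)

def h_start(theta_deg):
    """Maximum thickness that can be held static on the same slope."""
    t = math.tan(math.radians(theta_deg))
    return L_scale * ((mu2 - mu1) / (t - mu3) - 1.0)
```

Because mu3 > mu1, a layer of given depth needs a steeper slope to remobilise than the slope at which it stopped, i.e. h_start(theta) > h_stop(theta), which is the hysteresis the experiments exploit when tilting a deposited layer to higher angles.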


2006 ◽  
Vol 82 (2) ◽  
pp. 243-252 ◽  
Author(s):  
Trisalyn Nelson ◽  
Barry Boots ◽  
Michael A Wulder

Point data generated from helicopter surveys are used to determine the location and magnitude of mountain pine beetle infestations. Although collected for tactical planning, these data also provide a rich source of information for scientific investigations. To facilitate spatial research, it is important to consider how to best represent spatially explicit mountain pine beetle infestation data. This paper focuses on the spatial representation of point-based aerial helicopter surveys, which can be difficult to represent due to issues associated with large data quantities and data uncertainty. In this paper, the benefit of using a kernel density estimator to convert point data to a continuous raster surface is demonstrated. Field data are used to assess the accuracy of the point-based aerial helicopter survey data and the kernel density estimator is extended to incorporate data uncertainty. While the accuracy of point-based aerial surveys is high, with 92.6% of points differing by no more than ± 10 trees, there is a general tendency to overestimate infestation magnitude. The method developed for incorporating uncertainty into the kernel density estimator reduces overestimation and improves the correspondence between estimated infestation intensities and field data values. Key words: mountain pine beetle, data representation, visualization, kernel density estimators, uncertainty
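A kernel density estimator with per-point reliability weights, of the kind described above, can be sketched in one dimension for brevity (the survey data are two-dimensional); the points, weights, and bandwidth below are hypothetical:

```python
import math

def kde(points, weights, grid, bw):
    """Weighted Gaussian kernel density estimate evaluated on a grid.

    Weights let each survey point's contribution reflect its estimated
    reliability, down-weighting points suspected of overestimation.
    """
    norm = bw * math.sqrt(2.0 * math.pi) * sum(weights)
    return [
        sum(w * math.exp(-0.5 * ((g - p) / bw) ** 2)
            for p, w in zip(points, weights)) / norm
        for g in grid
    ]

# Infestation point locations along a transect, with uncertainty weights
points  = [1.0, 1.5, 3.0, 3.2]
weights = [1.0, 1.0, 0.6, 0.6]       # hypothetical reliability weights
grid = [i * 0.5 for i in range(9)]   # 0.0, 0.5, ..., 4.0
density = kde(points, weights, grid, bw=0.5)
```

The result is a continuous surface whose total mass is (approximately) one, with the down-weighted cluster contributing proportionally less intensity.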


1975 ◽  
Vol 12 (3) ◽  
pp. 429-440 ◽  
Author(s):  
Kurt D. Eigenbrod

In a numerical analysis, the pore pressure changes due to excavation of a slope and the subsequent dissipation of excess pore pressures were calculated. The analytical results of the pore pressure changes due to unloading of a slope agree reasonably well with pore pressure measurements in comparable embankments. This suggests that pore pressures immediately after slope excavation can be predicted analytically in homogeneous materials. The results of an analysis dealing with the dissipation of excess pore pressures due to unloading can also be substantiated by field evidence; however, only a few comparable field data are available. For many slopes it can be noted that the time for full dissipation is of the same order of magnitude as the time between excavation and failure. This suggests that many failures might be caused by the delayed equalization of pore pressures.



2021 ◽  
Author(s):  
Meng Qi ◽  
Ying Cao ◽  
Zuo-Jun (Max) Shen

Conditional quantile prediction involves estimating/predicting the quantile of a response random variable conditioned on observed covariates. The existing literature assumes the availability of independent and identically distributed (i.i.d.) samples of both the covariates and the response variable. However, such an assumption often becomes restrictive in many real-world applications. By contrast, we consider a fixed-design setting of the covariates, under which neither the response variable nor the covariates have i.i.d. samples. The present study provides a new data-driven distributionally robust framework under a fixed-design setting. We propose a regress-then-robustify method by constructing a surrogate empirical distribution of the noise. The solution of our framework coincides with a simple yet practical method that involves only regression and sorting, therefore providing an explanation for its empirical success. Measure concentration results are obtained for the surrogate empirical distribution, which further lead to finite-sample performance guarantees and asymptotic consistency. Numerical experiments are conducted to demonstrate the advantages of our approach. This paper was accepted by Hamid Nazerzadeh, Special Issue on Data-Driven Prescriptive Analytics.
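A stripped-down reading of the regress-then-robustify recipe (regression, then sorting residuals into a surrogate empirical distribution, then adding a residual order statistic to the point prediction) can be sketched as follows. The data and the plain OLS-plus-order-statistic construction are illustrative simplifications, not the paper's exact framework:

```python
# Toy fixed-design data (hypothetical)
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.2, 2.1, 2.7, 4.3, 5.1]

# 1. Regress: ordinary least squares for y = a + b*x
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
a = ybar - b * xbar

# 2. Robustify: sorted residuals form a surrogate empirical distribution
res = sorted(y - (a + b * x) for x, y in zip(xs, ys))

# 3. Conditional quantile at a new design point: prediction + residual quantile
def quantile_predict(x0, tau):
    k = min(int(tau * n), n - 1)   # order statistic for level tau
    return a + b * x0 + res[k]

q90 = quantile_predict(6.0, 0.9)
```

This is the "regression and sorting" simplicity the abstract alludes to: once the residuals are sorted, any conditional quantile is a point prediction shifted by the appropriate order statistic.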

