Probabilistic approach to rock fall hazard assessment: potential of historical data analysis

2002 · Vol 2 (1/2) · pp. 15-26
Author(s): C. Dussauge-Peisser, A. Helmstetter, J.-R. Grasso, D. Hantz, P. Desvarreux, ...

Abstract. We study the rock fall volume distribution for three rock fall inventories and fit the observed data with a power-law distribution, which has recently been proposed to describe landslide and rock fall volume distributions and is also observed for many other natural phenomena, such as volcanic eruptions or earthquakes. We use these statistical distributions of past events to estimate rock fall occurrence rates in the studied areas, as an alternative to deterministic approaches, which have not proved successful in predicting individual rock falls. The first inventory concerns calcareous cliffs around Grenoble, French Alps, from 1935 to 1995. The second data set was gathered during the 1912–1992 time window in the granite cliffs of Yosemite Valley, USA. The third covers the 1954–1976 period in the Arly gorges, French Alps, with metamorphic and sedimentary rocks. For the three data sets, we find good agreement between the observed volume distributions and a power-law fit for volumes larger than 50 m³ (20 m³ for the Arly gorges). We obtain similar values of the b exponent, close to 0.45, for the three data sets. In agreement with previous studies, this suggests that the b value does not depend on the geological setting. Regarding the rate of rock fall activity, defined as the number of rock fall events with volume larger than 1 m³ per year, we find a large variability from one site to the other. The rock fall activity, as part of a local erosion rate, is thus spatially dependent. We discuss the implications of these observations for rock fall hazard evaluation. First, assuming that the volume distributions are temporally stable, a complete rock fall inventory allows the prediction of recurrence rates for future events of a given volume in the range of the observed historical data. Second, assuming that the observed volume distribution follows a power law without cutoff at small or large scales, we can extrapolate these predictions to events smaller or larger than those reported in the data sets. Finally, we discuss the possible biases induced by the poor quality of the rock fall inventories, and the sensitivity of the extrapolated predictions to variations in the parameters of the power law.
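As a rough illustration of the recurrence-rate calculation sketched in this abstract (not the authors' actual procedure), the following Python sketch fits a power-law exponent b to rock fall volumes above an assumed completeness threshold and extrapolates it to an annual rate for a larger target volume; the inventory values, threshold and observation window are all invented for the example.

```python
import numpy as np

# Hypothetical inventory: rock fall volumes (m^3) and observation window (years).
volumes = np.array([55, 60, 72, 95, 110, 180, 250, 400, 900, 2500, 12000], dtype=float)
years_observed = 60.0          # assumed length of the historical record
v_min = 50.0                   # assumed completeness threshold (m^3)

# Maximum-likelihood (Hill) estimate of the exponent b in the cumulative
# power law N(>V) ~ V**(-b), using only volumes above v_min.
v = volumes[volumes >= v_min]
b_hat = len(v) / np.sum(np.log(v / v_min))

# Annual rate of events above v_min, extrapolated to a target volume,
# assuming the power law holds without an upper cutoff.
rate_vmin = len(v) / years_observed

def annual_rate(v_target):
    return rate_vmin * (v_target / v_min) ** (-b_hat)

print(f"b ≈ {b_hat:.2f}")
print(f"events ≥ 1000 m^3 per year ≈ {annual_rate(1000.0):.3f}")
print(f"mean recurrence interval ≈ {1.0 / annual_rate(1000.0):.0f} yr")
```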

2020 · Vol 499 (1) · pp. 1385-1394
Author(s): Nived Vilangot Nhalil, Chris J Nelson, Mihalis Mathioudakis, J Gerry Doyle, Gavin Ramsay

ABSTRACT Numerous studies have analysed inferred power-law distributions between the frequency and energy of impulsive events in the outer solar atmosphere in an attempt to understand the predominant energy supply mechanism in the corona. Here, we apply a burst detection algorithm to high-resolution imaging data obtained by the Interface Region Imaging Spectrograph to further investigate the derived power-law index, γ, of bright impulsive events in the transition region. Applying the algorithm with a constant minimum event lifetime (of either 60 s or 110 s) indicated that the target under investigation, such as plage or sunspot regions, has an influence on the observed power-law index. For regions dominated by sunspots, we always find γ < 2; however, for data sets where the target is a plage region, we often find that γ > 2 in the energy range (∼10²³, ∼10²⁶) erg. Applying the algorithm with a minimum event lifetime of three time-steps indicated that cadence was another important factor, with the highest cadence data sets returning values of γ > 2. The estimated total radiative power obtained for the observed energy distributions is typically 10–25 per cent of what would be required to sustain the corona, indicating that impulsive events in this energy range are not sufficient to solve the coronal heating problem. If we were to extend the power-law distribution down to an energy of 10²¹ erg, and assume parity between radiative energy release and the deposition of thermal energy, then such bursts could provide 25–50 per cent of the energy required to account for coronal heating.
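A minimal sketch of the extrapolation argument above, assuming a frequency distribution dN/dE ∝ E^(-γ) and purely illustrative values for γ, the normalization and the coronal loss rate (none of these numbers are taken from the paper): integrating E·dN/dE over the observed energy range and again down to 10²¹ erg shows how the supplied fraction grows.

```python
import numpy as np

# Power carried by events with frequency distribution dN/dE = A * E**(-gamma),
# integrated analytically between e_lo and e_hi (separate case for gamma = 2).
def power_between(A, gamma, e_lo, e_hi):
    if np.isclose(gamma, 2.0):
        return A * np.log(e_hi / e_lo)
    return A * (e_hi**(2.0 - gamma) - e_lo**(2.0 - gamma)) / (2.0 - gamma)

# Assumed, illustrative numbers: gamma slightly above 2 and a normalization A
# chosen so that events in (1e23, 1e26) erg supply ~20% of an assumed coronal
# loss rate.
gamma = 2.1
required = 1.0e27                       # assumed heating requirement, erg/s
A = 0.20 * required / power_between(1.0, gamma, 1e23, 1e26)

observed = power_between(A, gamma, 1e23, 1e26)
extended = power_between(A, gamma, 1e21, 1e26)
print(f"observed range fraction: {observed / required:.2f}")
print(f"extended to 1e21 erg:    {extended / required:.2f}")
```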


Author(s): Fatih Olmez, Peter R. Kramer, John Fricks, Deena R. Schmidt, Janet Best

2021 · Vol 2122 (1) · pp. 012006
Author(s): Daigo Umemoto, Nobuyasu Ito

Abstract The origin of a power law in the traffic-volume distribution found in traffic simulations of Kobe city is studied. The traffic distribution obtained from a shortest-path search with a randomized OD (origin-destination) set on a digital map of Kobe city obeys a power law. A toy model in which a Cayley tree is embedded in the network is also verified. It is shown theoretically that the traffic distribution with all possible OD pairs on a Cayley tree follows a power-law-like distribution. With a randomized OD set, the distribution is diffused around the theoretical point sets. The relationship between these facts and the origin of the power law is discussed.
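The traffic-volume measurement described above can be mimicked on any graph; the sketch below routes randomized OD pairs along shortest paths with networkx and tallies how many routes cross each edge. The random geometric graph merely stands in for the Kobe digital map, which is not reproduced here.

```python
import collections
import random

import networkx as nx

# Stand-in road network: a random geometric graph gives a planar-ish topology.
random.seed(0)
G = nx.random_geometric_graph(300, 0.12, seed=0)
G = G.subgraph(max(nx.connected_components(G), key=len)).copy()

# Route a batch of randomized origin-destination (OD) pairs along shortest
# paths and count how many routes pass through each edge ("traffic volume").
nodes = list(G.nodes)
traffic = collections.Counter()
for _ in range(5000):
    o, d = random.sample(nodes, 2)
    path = nx.shortest_path(G, o, d)
    for u, v in zip(path, path[1:]):
        traffic[frozenset((u, v))] += 1

# Inspect the traffic-volume distribution; a heavy tail shows up as a slowly
# decaying count of edges versus volume on log-log axes.
volumes = sorted(traffic.values(), reverse=True)
print("max edge volume:", volumes[0])
print("median edge volume:", volumes[len(volumes) // 2])
```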


Fractals · 2009 · Vol 17 (03) · pp. 333-349
Author(s): A. M. Selvam

Dynamical systems in nature exhibit self-similar fractal fluctuations, and the corresponding power spectra follow an inverse power-law form, signifying long-range space-time correlations identified as self-organized criticality. The physics underlying self-organized criticality has not yet been identified. The Gaussian probability distribution widely used for the analysis and description of large data sets underestimates the probabilities of occurrence of extreme events such as stock market crashes, earthquakes, heavy rainfall, etc. The assumptions underlying the normal distribution, such as a fixed mean and standard deviation and the independence of data, are not valid for real-world fractal data sets, which exhibit a scale-free power-law distribution with fat tails. A general systems theory for fractals visualizes the emergence of successively larger scale fluctuations as resulting from the space-time integration of enclosed smaller scale fluctuations. The model predicts a universal inverse power law incorporating the golden mean for fractal fluctuations and for the corresponding power spectra, i.e., the variance spectrum represents the probabilities, a signature of quantum systems. Fractal fluctuations therefore exhibit quantum-like chaos. The model-predicted inverse power law is very close to the Gaussian distribution for small-scale fluctuations but exhibits a fat long tail for large-scale fluctuations. Extensive data sets of the Dow Jones index, human DNA, and Takifugu rubripes (puffer fish) DNA are analyzed to show that the space/time data sets are close to the model-predicted power-law distribution.


2021 · Vol 7 (s2)
Author(s): Alexander Bergs

Abstract This paper focuses on the micro-analysis of historical data, which allows us to investigate language use across the lifetime of individual speakers. Certain concepts, such as social network analysis or communities of practice, put individual speakers and their social embeddedness and dynamicity at the center of attention. This means that intra-speaker variation can be described and analyzed in quite some detail in certain historical data sets. The paper presents some exemplary empirical analyses of the diachronic linguistic behavior of individual speakers/writers in fifteenth- to seventeenth-century England. It discusses the social factors that influence this behavior, with an emphasis on the methodological and theoretical challenges and opportunities when investigating intra-speaker variation and change.


2021 · Vol 8 (1)
Author(s): Ghislain Romaric Meleu, Paulin Yonta Melatagia

Abstract Using the headers of scientific papers, we have built multilayer networks of entities involved in research, namely authors, laboratories, and institutions. We have analyzed some properties of such networks built from data extracted from the HAL archives and found that the network at each layer is a small-world network with a power-law distribution. In order to simulate such co-publication networks, we propose a multilayer network generation model based on the formation of cliques at each layer and the affiliation of each new node to the higher layers. The clique is built from new and existing nodes selected using preferential attachment. We also show that the degree distribution of the generated layers follows a power law. From simulations of our model, we show that the generated multilayer networks reproduce the studied properties of co-publication networks.
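A single-layer simplification of the clique-plus-preferential-attachment idea described above (not the authors' exact multilayer model): each arriving node closes a clique with existing nodes sampled proportionally to their degree, which is enough to produce a heavy-tailed degree distribution.

```python
import random

import networkx as nx

def clique_pa_layer(n_nodes, clique_size=3, seed=0):
    """Grow a single layer: each arriving node forms a clique with existing
    nodes chosen by preferential attachment (a simplified, single-layer
    reading of the clique-based generation idea)."""
    rng = random.Random(seed)
    G = nx.complete_graph(clique_size)          # seed clique
    for new in range(clique_size, n_nodes):
        # Preferential attachment: sample existing nodes with prob ~ degree.
        pool = [v for v, d in G.degree() for _ in range(d)]
        chosen = set()
        while len(chosen) < clique_size - 1:
            chosen.add(rng.choice(pool))
        members = list(chosen) + [new]
        G.add_edges_from((a, b) for i, a in enumerate(members)
                         for b in members[i + 1:])   # close the clique
    return G

G = clique_pa_layer(2000)
degrees = sorted((d for _, d in G.degree()), reverse=True)
print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
print("top degrees (heavy tail expected):", degrees[:5])
```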


Author(s): Cyprian Suchocki, Stanisław Jemioło

Abstract In this work a number of selected isotropic, invariant-based hyperelastic models are analyzed. The considered constitutive relations of hyperelasticity include the model by Gent (G) and its extension, the so-called generalized Gent model (GG), the exponential-power-law model (Exp-PL) and the power-law model (PL). The material parameters of the models under study have been identified for eight different experimental data sets. As is demonstrated, the much-celebrated Gent model does not always allow an acceptable quality of approximation of the experimental data to be obtained. Furthermore, it is observed that the best curve-fitting quality is usually achieved when the experimentally derived conditions proposed by Rivlin and Saunders are fulfilled. However, it is shown that the conditions of Rivlin and Saunders are in contradiction with the mathematical requirement of polyconvexity of the stored energy. A polyconvex stored-energy function is assumed in order to ensure the existence of solutions to a properly defined boundary value problem and to avoid non-physical material response. It is found that, in the case of the analyzed hyperelastic models, the application of polyconvexity conditions leads to only a slight decrease in curve-fitting quality. When polyconvexity of the energy is assumed, the best approximation of the experimental data is usually obtained for the PL model. Among the non-polyconvex hyperelastic models, the best curve-fitting results are most frequently achieved for the GG model. However, it is shown that both the G and the GG models are problematic due to the presence of the locking effect.
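For concreteness, a minimal parameter-identification sketch for the Gent (G) model under uniaxial, incompressible conditions; the stretch-stress data below are synthetic placeholders (not one of the eight experimental data sets used in the paper), and the least-squares routine is only one of several ways the fitting could be carried out.

```python
import numpy as np
from scipy.optimize import curve_fit

# Uniaxial Cauchy stress for an incompressible Gent material:
#   W = -(mu * Jm / 2) * ln(1 - (I1 - 3)/Jm),  I1 = lam**2 + 2/lam
#   sigma = mu * (lam**2 - 1/lam) / (1 - (I1 - 3)/Jm)
def gent_uniaxial(lam, mu, Jm):
    I1 = lam**2 + 2.0 / lam
    return mu * (lam**2 - 1.0 / lam) / (1.0 - (I1 - 3.0) / Jm)

# Synthetic "experimental" stretch-stress data (placeholder for a real data
# set); units of mu and sigma are assumed to be MPa.
lam_data = np.linspace(1.05, 6.0, 25)
sigma_data = gent_uniaxial(lam_data, mu=0.4, Jm=80.0)
sigma_data *= 1.0 + 0.03 * np.random.default_rng(1).standard_normal(lam_data.size)

# Identify (mu, Jm) by least squares; bounds keep 1 - (I1-3)/Jm positive.
popt, _ = curve_fit(gent_uniaxial, lam_data, sigma_data, p0=(1.0, 50.0),
                    bounds=([1e-3, 35.0], [10.0, 1e4]))
print(f"fitted mu = {popt[0]:.3f} MPa, Jm = {popt[1]:.1f}")
```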


2021
Author(s): David A Garcia, Gregory Fettweis, Diego M Presman, Ville Paakinaho, Christopher Jarzynski, ...

Abstract Single-molecule tracking (SMT) allows the study of transcription factor (TF) dynamics in the nucleus, giving important information regarding the diffusion and binding behavior of these proteins in the nuclear environment. Dwell-time distributions obtained by SMT for most TFs appear to follow bi-exponential behavior. This has been ascribed to two discrete populations of TFs, one non-specifically bound to chromatin and another specifically bound to target sites, as implied by decades of biochemical studies. However, emerging studies suggest alternative models for dwell-time distributions, indicating the existence of more than two populations of TFs (multi-exponential distribution), or even the absence of discrete states altogether (power-law distribution). Here, we present an analytical pipeline to evaluate which model best explains SMT data. We find that a broad spectrum of TFs (including the glucocorticoid receptor, oestrogen receptor, FOXA1 and CTCF) follow a power-law distribution of dwell times, blurring the temporal line between non-specific and specific binding and suggesting that productive binding may involve longer binding events than previously believed. From these observations, we propose a continuum-of-affinities model to explain TF dynamics, which is consistent with complex interactions of TFs with multiple nuclear domains as well as binding and searching on the chromatin template.
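A toy version of the kind of model comparison such a pipeline performs (not the authors' actual analysis): synthetic dwell times are fitted by maximum likelihood under a power-law (Pareto) model and a bi-exponential mixture, and the two candidates are ranked by AIC.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic dwell times (seconds) drawn from a Pareto (power-law) tail above
# t_min; a stand-in for measured SMT residence times, not real data.
t_min, alpha_true = 0.5, 1.8
dwells = t_min * (1.0 + rng.pareto(alpha_true, size=2000))

# Model 1: power law (Pareto) above t_min -- closed-form MLE.
alpha_hat = len(dwells) / np.sum(np.log(dwells / t_min))
ll_pl = np.sum(np.log(alpha_hat / t_min) - (alpha_hat + 1) * np.log(dwells / t_min))

# Model 2: bi-exponential mixture, fitted by direct likelihood maximization.
def neg_ll_biexp(params):
    w, k1, k2 = params
    pdf = w * k1 * np.exp(-k1 * dwells) + (1 - w) * k2 * np.exp(-k2 * dwells)
    return -np.sum(np.log(pdf + 1e-300))

res = minimize(neg_ll_biexp, x0=[0.5, 0.1, 2.0],
               bounds=[(1e-3, 1 - 1e-3), (1e-4, 1e3), (1e-4, 1e3)])
ll_be = -res.fun

# Akaike information criterion: lower is better; the power law should win here.
aic_pl = 2 * 1 - 2 * ll_pl
aic_be = 2 * 3 - 2 * ll_be
print(f"AIC power law:      {aic_pl:.1f}")
print(f"AIC bi-exponential: {aic_be:.1f}")
```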


2015 · Vol 5 (1)
Author(s): Kai Zhao, Mirco Musolesi, Pan Hui, Weixiong Rao, Sasu Tarkoma
