Probability density functions
Recently Published Documents

TOTAL DOCUMENTS: 1115 (FIVE YEARS: 178)
H-INDEX: 50 (FIVE YEARS: 5)

2022
Author(s): Edgardo Cañón-Tapia

ABSTRACT Volcanic activity is ultimately controlled by processes that take place many kilometers beneath the surface of a planet. The deeper processes are unlikely to reach the surface without some degree of modification at shallower levels. Nevertheless, traces of those deeper processes may still be found when examining the final products at the surface. In this work, it is shown that it is possible to gain insights concerning the integrated contribution of deep structures through the study of the spatial distribution of volcanic vents at the surface. The method described here relies on the systematic use of increasing smoothing factors in Gaussian kernel estimations. The sequences of probability density functions thus generated are equivalent to images obtained with an increasing wavelength, which therefore have the power to penetrate deeper below the surface. Although the resolution of this method is much lower than that of seismic or other geophysical surveys, it has the advantages of ease of implementation, extremely low cost, and remote application. Thus, the reported method has great value as a first-order exploration tool to investigate the deep structure of a planet, and it can make important contributions to our understanding of the volcano-tectonic relationship, not only on Earth, but also across the various bodies of the solar system where volcanic activity has been documented.
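
For readers who want to experiment with the idea, the following minimal Python sketch (not the author's code; the vent coordinates and bandwidth factors are invented for illustration) builds a sequence of Gaussian kernel density estimates of synthetic vent locations with increasing smoothing factors, mirroring the "increasing wavelength" analogy described above.

```python
# Minimal sketch: probability density surfaces of synthetic volcanic-vent
# coordinates from Gaussian kernel density estimation with a sequence of
# increasing smoothing factors (bandwidths).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Hypothetical vent locations (km): two clusters standing in for a vent field.
vents = np.vstack([rng.normal([0, 0], 1.5, (60, 2)),
                   rng.normal([8, 3], 2.5, (40, 2))]).T   # shape (2, N)

# Evaluation grid covering the field.
x, y = np.mgrid[-6:14:200j, -6:10:200j]
grid = np.vstack([x.ravel(), y.ravel()])

# Increasing bandwidth factors play the role of an increasing "wavelength":
# larger factors smooth out shallow detail and emphasize broader structure.
for factor in (0.2, 0.5, 1.0, 2.0):
    kde = gaussian_kde(vents, bw_method=factor)
    density = kde(grid).reshape(x.shape)
    print(f"bandwidth factor {factor}: peak density {density.max():.4f}")
```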


2022
Author(s): Christopher B. DuRoss et al.

Text S1: Bayesian (OxCal) models for northern Lost River fault zone trench sites. Text S2: Bulk sediment analysis and charcoal identification. Text S3: Luminescence geochronology. Table S1: Description of stratigraphic units at the Sheep Creek trench. Table S2: Description of stratigraphic units at the Arentson Gulch trench. Figure S1: Photomosaics and large-format trench logs for the Sheep Creek trench. Figure S2: Photomosaics and large-format trench logs for the Arentson Gulch trench. Figure S3: Sheep Creek and Arentson Gulch vertical displacement measurements. Figure S4: Fault bend angles along the northern Lost River fault zone. Figure S5: Photographs of the Sheep Creek and Arentson Gulch trench sites. Figure S6: Probability density functions for Lost River fault zone ruptures.


Optics, 2022, Vol. 3(1), pp. 19-34
Author(s): Milo W. Hyde, Olga Korotkova

Generalizing our prior work on scalar multi-Gaussian (MG) distributed optical fields, we introduce the two-dimensional instantaneous electric-field vector whose components are jointly MG distributed. We then derive the single-point Stokes parameter probability density functions (PDFs) of MG-distributed light having an arbitrary degree and state of polarization. We show, in particular, that the intensity contrast of such a field can be tuned to values smaller or larger than unity. We validate our analysis by generating an example partially polarized MG field with a specified single-point polarization matrix using two different Monte Carlo simulation methods. We then compute the joint PDFs of the instantaneous field components and the Stokes parameter PDFs from the simulated MG fields, while comparing the results of both Monte Carlo methods to the corresponding theory. Lastly, we discuss the strengths, weaknesses, and applicability of both simulation methods in generating MG fields.
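
As a rough illustration of the Monte Carlo validation workflow described above, the sketch below uses ordinary jointly Gaussian statistics rather than the paper's multi-Gaussian model, with an arbitrarily chosen polarization matrix: it draws instantaneous field components with a specified 2x2 polarization matrix and estimates the single-point Stokes-parameter statistics from the samples.

```python
# Illustrative sketch (Gaussian, not multi-Gaussian, statistics): draw
# instantaneous field components Ex, Ey with a specified 2x2 polarization
# (coherency) matrix J and estimate single-point Stokes-parameter statistics.
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical target polarization matrix J (Hermitian, positive definite).
J = np.array([[1.0, 0.3 + 0.2j],
              [0.3 - 0.2j, 0.6]])

# Coloring transform: if u is circular complex white noise, E = L u has <E E^H> = J.
L = np.linalg.cholesky(J)
n = 200_000
u = (rng.standard_normal((2, n)) + 1j * rng.standard_normal((2, n))) / np.sqrt(2)
Ex, Ey = L @ u

# Instantaneous Stokes parameters (one common sign convention).
S0 = np.abs(Ex)**2 + np.abs(Ey)**2
S1 = np.abs(Ex)**2 - np.abs(Ey)**2
S2 = 2 * np.real(Ex * np.conj(Ey))
S3 = -2 * np.imag(Ex * np.conj(Ey))

print("mean Stokes vector:", S0.mean(), S1.mean(), S2.mean(), S3.mean())
hist, edges = np.histogram(S0, bins=100, density=True)  # empirical PDF of S0
```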


2021, Vol. 1(1)
Author(s): Ryszard SNOPKOWSKI, Marta SUKIENNIK, Aneta NAPIERAJ

The article presents selected issues in the field of stochastic simulation of production processes. Attention is drawn to the possibility of including, in this type of model, the risk accompanying the implementation of the processes. Probability density functions that can be used to characterize the random variables present in the model are presented. The possibility of making mistakes while creating this type of model is pointed out. Two selected examples of the use of stochastic simulation in the analysis of production processes, illustrated by the mining process, are presented.
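
A minimal sketch of the kind of stochastic simulation discussed here (the stage names, distributions, and parameters are assumed for illustration, not taken from the article) draws the durations of the stages of a mining production cycle from chosen probability density functions and propagates them to a per-shift output distribution.

```python
# Monte Carlo simulation of a hypothetical mining production cycle in which
# each stage duration is a random variable with its own density function.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
# Assumed stage-time models (minutes): cutting ~ normal, loading ~ triangular,
# haulage ~ lognormal.
cutting = rng.normal(12.0, 2.0, n)
loading = rng.triangular(4.0, 6.0, 10.0, n)
haulage = rng.lognormal(mean=np.log(8.0), sigma=0.3, size=n)

cycle = cutting + loading + haulage
shift_minutes = 6 * 60
cycles_per_shift = shift_minutes / cycle      # cycles completed in one shift

print(f"mean cycle time  : {cycle.mean():.1f} min")
print(f"P10/P90 per shift: {np.percentile(cycles_per_shift, [10, 90])}")
```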


Radiocarbon, 2021, pp. 1-22
Author(s): Nicholas V Kessler

ABSTRACT Age disparities between charcoal samples and their context are a well-known problem in archaeological chronometry, and even small offsets could affect the accuracy of high-precision wiggle-matched dates. In many cases of taphonomic or anthropogenic loss of the outermost rings, sapwood-based methods for estimating cutting dates are not always applicable, especially with charcoal. In these instances, wiggle-matched terminus post quem (TPQ) dates are often reconciled with subjective or ad hoc approaches. This study examines the distribution of age disparities caused by ring loss and other factors in a large dendroarchaeological dataset. Probability density functions describing the random distribution of age disparities are then fit to the empirical distributions. These functions are tested on an actual wiggle-matched non-cutting date from the literature to evaluate accuracy in a single case. Simulations are then presented to demonstrate how an age offset function can be applied in OxCal outlier models to yield accurate dating in archaeological sequences with short intervals between dated episodes, even if all samples are non-cutting dates.
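
A small illustration of the idea of describing age disparities with a fitted probability density function (the offsets below are synthetic and the exponential form is an assumption for illustration, not the distribution fitted in the study):

```python
# Fit a probability density function to synthetic ring-loss age disparities,
# of the kind that could serve as an age-offset prior in an outlier model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical disparities (years) between the outermost preserved ring and
# the cutting/use date; positive values mean the date is too old.
offsets = rng.exponential(scale=15.0, size=300)

loc, scale = stats.expon.fit(offsets, floc=0)   # MLE with location fixed at 0
print(f"fitted mean offset: {scale:.1f} yr")

# Probability that a non-cutting date is more than 30 yr older than the event.
print(f"P(offset > 30 yr) = {stats.expon.sf(30, loc=0, scale=scale):.3f}")
```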


Author(s): Kunio Takezawa

When data are found to be realizations of a specific distribution, constructing the probability density function based on this distribution may not lead to the best prediction result. In this study, numerical simulations are conducted using data that follow a normal distribution, and we examine whether probability density functions that have shapes different from that of the normal distribution can yield larger log-likelihoods than the normal distribution in the light of future data. The results indicate that fitting realizations of the normal distribution to a different probability density function produces better results from the perspective of predictive ability. Similarly, a set of simulations using the exponential distribution shows that better predictions are obtained when the corresponding realizations are fitted to a probability density function that is slightly different from the exponential distribution. These observations demonstrate that when the form of the probability density function that generates the data is known, the use of another form of the probability density function may achieve more desirable results from the standpoint of prediction.
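
The phenomenon described here can be reproduced with a short simulation. The sketch below (not the paper's exact experiment) generates small normal samples and compares the out-of-sample log-likelihood of the plug-in normal fit with that of a Student-t predictive density, a shape deliberately different from the normal.

```python
# With data generated from a normal distribution, a heavier-tailed Student-t
# predictive density can score a higher log-likelihood on future data than
# the plug-in normal maximum-likelihood fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, reps = 10, 2000
gain = []
for _ in range(reps):
    train = rng.normal(0.0, 1.0, n)
    test = rng.normal(0.0, 1.0, 1000)
    m, s = train.mean(), train.std(ddof=1)

    # Plug-in normal MLE predictive density.
    ll_norm = stats.norm.logpdf(test, loc=m, scale=train.std(ddof=0)).sum()

    # Student-t predictive (posterior predictive under a vague prior):
    # a density whose shape deliberately differs from the normal.
    ll_t = stats.t.logpdf(test, df=n - 1, loc=m,
                          scale=s * np.sqrt(1 + 1 / n)).sum()
    gain.append(ll_t - ll_norm)

print(f"mean log-likelihood gain of the non-normal predictive: {np.mean(gain):.1f}")
```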


Author(s): Qin Fan, Guo-Cheng Wu, Hui Fu

ABSTRACT The general fractional calculus has recently become popular in continuous-time random walk theory. However, the boundedness condition of the general fractional integral, one of the fundamental problems, has not yet been established. In this short communication, the classical normed space is used and a general boundedness theorem is presented. Finally, various long-tailed waiting-time probability density functions are suggested for the continuous-time random walk, since the general fractional integral is well defined for them.
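
For context, here is a minimal continuous-time random walk sketch with a long-tailed waiting-time probability density function (a standard Pareto form chosen for illustration; the paper's general fractional operators are not implemented here).

```python
# Continuous-time random walk with long-tailed (Pareto) waiting times and
# Gaussian jump lengths.
import numpy as np

rng = np.random.default_rng(5)
n_steps, alpha = 10_000, 0.7               # alpha < 1: infinite-mean waiting time
waits = rng.pareto(alpha, n_steps) + 1.0   # Pareto waiting times, t >= 1
jumps = rng.normal(0.0, 1.0, n_steps)

t = np.cumsum(waits)                       # event times
x = np.cumsum(jumps)                       # walker position after each jump

# Position at fixed observation times (subdiffusive growth is expected).
for T in (1e2, 1e4, 1e6):
    k = np.searchsorted(t, T)
    print(f"t = {T:>9.0f}: steps taken = {k:6d}, position = {x[k-1] if k else 0.0:+.2f}")
```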


Mathematics, 2021, Vol. 9(23), pp. 3091
Author(s): Jelena Nikolić, Danijela Aleksić, Zoran Perić, Milan Dinčić

Motivated by the fact that uniform quantization is not suitable for signals with non-uniform probability density functions (pdfs), such as the Laplacian pdf, in this paper we have divided the support region of the quantizer into two disjoint regions and utilized the simplest uniform quantization with equal bit-rates within both regions. In particular, we assumed a narrow central granular region (CGR) covering the peak of the Laplacian pdf and a wider peripheral granular region (PGR) where the pdf is predominantly tailed. We performed optimization of the widths of the CGR and PGR via distortion optimization per border-clipping threshold scaling ratio, which resulted in an iterative formula enabling the parametrization of our piecewise uniform quantizer (PWUQ). For medium and high bit-rates, we demonstrated the advantage of our PWUQ over the uniform quantizer, paying special attention to the case where 99.99% of the signal amplitudes belong to the support region or clipping region. We believe that the resulting formulas for PWUQ design and performance assessment are greatly beneficial in neural networks, where weights and activations are typically modelled by the Laplacian distribution and where uniform quantization is commonly used to decrease the memory footprint.
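
A rough numerical sketch of a two-region piecewise uniform quantizer for a unit-variance Laplacian source (the region border, clipping threshold, and cell split below are illustrative choices, not the optimized values derived in the paper), compared with a plain uniform quantizer over the same support region:

```python
# Two-region piecewise uniform quantizer for a unit-variance Laplacian source:
# a narrow central granular region (CGR) and a wider peripheral granular
# region (PGR), compared against a single uniform quantizer.
import numpy as np

rng = np.random.default_rng(6)
x = rng.laplace(0.0, 1.0 / np.sqrt(2), 1_000_000)   # unit-variance Laplacian

x_max, x_b = 6.0, 1.0          # clipping threshold and CGR border (assumed values)
N = 256                        # total number of cells (8-bit quantizer)
N_cgr, N_pgr_half = N // 2, N // 4

def uniform_q(v, lo, hi, cells):
    """Midpoint uniform quantizer of v over [lo, hi] with the given cell count."""
    step = (hi - lo) / cells
    idx = np.clip(np.floor((v - lo) / step), 0, cells - 1)
    return lo + (idx + 0.5) * step

xc = np.clip(x, -x_max, x_max)
central = np.abs(xc) <= x_b
xq = np.empty_like(xc)
xq[central] = uniform_q(xc[central], -x_b, x_b, N_cgr)
xq[~central] = np.sign(xc[~central]) * uniform_q(np.abs(xc[~central]), x_b, x_max, N_pgr_half)

xq_uni = uniform_q(xc, -x_max, x_max, N)            # reference uniform quantizer
sqnr = lambda q: 10 * np.log10(np.mean(x**2) / np.mean((x - q)**2))
print(f"piecewise uniform SQNR: {sqnr(xq):.2f} dB")
print(f"plain uniform SQNR    : {sqnr(xq_uni):.2f} dB")
```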

