The Regional Density Function and the Definition of Regional Boundaries

2014 ◽  
pp. 71-86
Author(s):  
John B. Parr ◽  
Darryl Holden


Author(s):  
William R. Wessels

This paper presents a design-for-reliability approach for mechanical design. Reliability analysis in part design, and indeed the very definition of reliability, has focused on the electronic and digital disciplines since the emergence of reliability engineering in the late 1940s. That focus dictates that parts fail in time, that all parts have a constant failure rate, and that part failure is modeled by the exponential probability density function. This paper presents current research proposing that reliability in mechanical design is not well characterized by 'best practices' reliability analyses. The first premise investigated is that time does not cause failure in mechanical design; only failure mechanisms do. Mechanical parts experience wear-out and fatigue, unlike electronic and digital parts. Mechanical design analysis for part design investigates the material strength properties required to survive failure mechanisms induced by part operation and by part exposure to external failure mechanisms, including physical loads, thermal loads, and reactivity/corrosion. Each failure mechanism acting on a mechanical part induces one or more part failure modes, and each part failure mode has one or more failure effects on the part and on the higher design configurations into which the part is integrated. The second premise investigated is that mechanical part failure is modeled by the Weibull probability density function in terms of stress, not time. A reliability math model for tensile strength in composite materials is presented to illustrate the two premises.
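The stress-based Weibull premise can be sketched numerically. The following is a minimal illustration, not the paper's model: a Weibull survival function expressed in terms of applied stress rather than time, with invented parameter values for a hypothetical composite laminate.

```python
import math

def weibull_reliability(stress, eta, beta, gamma=0.0):
    """Probability that a part survives a given applied stress.

    Survival function of a Weibull strength model (stress-based,
    not time-based): R(s) = exp(-((s - gamma)/eta)**beta) for
    s > gamma, else 1. eta is the characteristic strength (scale),
    beta the shape, gamma an optional stress threshold.
    """
    if stress <= gamma:
        return 1.0
    return math.exp(-(((stress - gamma) / eta) ** beta))

# Illustrative values only: characteristic tensile strength 600 MPa,
# shape beta = 8. At the characteristic strength, R = exp(-1).
r_low = weibull_reliability(300.0, eta=600.0, beta=8.0)
r_char = weibull_reliability(600.0, eta=600.0, beta=8.0)
```

Reliability here decreases monotonically with applied stress, which is the qualitative behaviour the abstract argues for in place of a constant failure rate in time.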


2008 ◽  
Vol 08 (03n04) ◽  
pp. L305-L314 ◽  
Author(s):  
J. GIESBRECHT

The impetus for investigating the probability density function (PDF) of high-frequency (HF) noise arises from the need for a better noise model for automatic modulation recognition techniques. Many current modulation recognition methods still assume Gaussian noise models for the transmission medium; for HF communications this can be an incorrect assumption. Whereas a previous investigation [1] focused on the noise density function in an urban area of Adelaide, Australia, this work studies the noise density function at a remote country location east of Adelaide, near Swan Reach, South Australia. Here, HF noise is defined as being primarily of natural origin – and therefore impulsive – excluding man-made noise sources. A new method for measuring HF noise is introduced and used over a 153 kHz bandwidth at various frequencies across the HF band. The method excises man-made signals and calculates the noise PDF from the residue. The suitability of the Bi-Kappa distribution for modeling HF noise is found to be even more compelling than suggested by the results of the earlier investigation.
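The core point, that impulsive noise departs strongly from a Gaussian model, can be illustrated with a toy experiment. This sketch does not implement the paper's measurement method or the Bi-Kappa distribution; it simply contrasts the excess kurtosis of Gaussian noise with that of a crude spike-mixture stand-in for impulsive HF noise.

```python
import random
import statistics

def excess_kurtosis(samples):
    """Sample excess kurtosis: ~0 for Gaussian noise, strongly
    positive for impulsive (heavy-tailed) noise."""
    n = len(samples)
    mu = statistics.fmean(samples)
    m2 = sum((x - mu) ** 2 for x in samples) / n
    m4 = sum((x - mu) ** 4 for x in samples) / n
    return m4 / (m2 ** 2) - 3.0

random.seed(1)
gaussian = [random.gauss(0.0, 1.0) for _ in range(50_000)]
# Crude stand-in for impulsive noise: a Gaussian background with
# occasional large spikes (a mixture model, not the Bi-Kappa pdf).
impulsive = [random.gauss(0.0, 1.0) if random.random() < 0.98
             else random.gauss(0.0, 10.0) for _ in range(50_000)]

k_gauss = excess_kurtosis(gaussian)
k_imp = excess_kurtosis(impulsive)
```

A modulation classifier calibrated on the first population would badly mis-estimate tail probabilities on the second, which is the motivation the abstract gives for a non-Gaussian noise PDF.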


1988 ◽  
Vol 31 (2) ◽  
pp. 271-283 ◽  
Author(s):  
Siegfried H. Lehnigk

We shall concern ourselves with the class of continuous, four-parameter, one-sided probability distributions which can be characterized by the probability density function (pdf) f(x) = [β/(bΓ((1−p)/β))] ((x−c)/b)^(−p) exp[−((x−c)/b)^β] for x > c, with f(x) = 0 for x < c. It depends on four parameters: shift c ∈ R, scale b > 0, initial shape p < 1, and terminal shape β > 0. For p ≦ 0, the definition of f(x) can be completed by setting f(c) = β/(bΓ(β⁻¹)) > 0 if p = 0, and f(c) = 0 if p < 0. For 0 < p < 1, f(x) remains undefined at x = c; f(x) ↑ +∞ as x ↓ c.
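A numerical sketch of a density of this type, assuming a shifted generalized-gamma form chosen only to reproduce the stated limiting behaviour at x = c (the paper's exact pdf may differ):

```python
import math

def pdf(x, c, b, p, beta):
    """Shifted generalized-gamma density (an assumed concrete form):
    f(x) = beta / (b * Gamma((1-p)/beta))
           * ((x-c)/b)**(-p) * exp(-((x-c)/b)**beta)  for x > c,
    and 0 for x <= c. It reproduces the stated limits:
    f(c) = beta/(b*Gamma(1/beta)) if p = 0, f(c) = 0 if p < 0,
    and f(x) -> +inf as x -> c+ for 0 < p < 1.
    """
    if x <= c:
        return 0.0
    z = (x - c) / b
    norm = beta / (b * math.gamma((1.0 - p) / beta))
    return norm * z ** (-p) * math.exp(-(z ** beta))

# Riemann-sum check that the density integrates to ~1 for a smooth
# case (p < 0, so f(c) = 0 as stated in the abstract).
c, b, p, beta = 0.0, 2.0, -1.0, 2.0
step = 1e-3
area = sum(pdf(c + i * step, c, b, p, beta) for i in range(30_000)) * step
```

The normalization constant follows from the substitution u = ((x−c)/b)^β, which reduces the integral over (c, ∞) to Γ((1−p)/β).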


Symmetry ◽  
2020 ◽  
Vol 12 (11) ◽  
pp. 1927
Author(s):  
Nachiketa Chakraborty

Stochastic variability is ubiquitous among astrophysical sources. Quantifying the stochastic properties of observed time series, or lightcurves, can provide insights into the underlying physical mechanisms driving variability, especially those of the particles that radiate the observed emission. Toy models mimicking cosmic ray transport are particularly useful in linking the statistical analyses of observed lightcurves to physical properties and parameters. Here, we explore a very commonly observed feature: finite-sized self-similarity, or scale invariance, a fundamental property of complex dynamical systems that is important to the general theme of physics and symmetry. We investigate it through the probability density function of time-varying fluxes arising from an Ornstein–Uhlenbeck (OU) model, as this model provides an excellent description of several time-domain observations of sources such as active galactic nuclei. The probability density function approach stems directly from the mathematical definition of self-similarity and is nonparametric. We show that the OU model provides an intuitive description of scale-limited self-similarity and a stationary Gaussian distribution, while potentially showing a way to link to the underlying cosmic ray transport. The finite size of the scale invariance depends upon the decay time in the OU model.
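The two properties the abstract highlights, a stationary Gaussian flux distribution and a decay time that cuts off the scale invariance, can be seen in a minimal Euler–Maruyama simulation of the OU process (illustrative parameters, not fitted to any source):

```python
import random
import statistics

def simulate_ou(theta, sigma, dt, n, x0=0.0, seed=0):
    """Euler-Maruyama sketch of the OU SDE dX = -theta*X dt + sigma dW.

    1/theta is the decay (decorrelation) time beyond which the process
    loses memory; the stationary law is Gaussian with mean 0 and
    variance sigma**2 / (2*theta).
    """
    rng = random.Random(seed)
    x, path, sdt = x0, [], dt ** 0.5
    for _ in range(n):
        x += -theta * x * dt + sigma * sdt * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

theta, sigma = 0.5, 1.0
path = simulate_ou(theta, sigma, dt=0.01, n=200_000)
sample_var = statistics.pvariance(path[10_000:])  # discard the transient
stationary_var = sigma ** 2 / (2 * theta)         # = 1.0 here
```

Shortening 1/theta shrinks the range of timescales over which segments of the path look statistically alike, which is the "finite size" of the self-similarity discussed above.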


1974 ◽  
Vol 11 (4) ◽  
pp. 642-651 ◽  
Author(s):  
D. Jerwood

In this paper, the cost of the carrier-borne epidemic is considered. The definition of duration, as used by Weiss (1965) and subsequent authors, is generalised and the probability distribution for the number of located carriers is obtained. One component of cost, namely the area generated by the trajectory of carriers, is examined and its probability density function derived. The expected area generated is then shown to be proportional to the expected number of carriers located during the epidemic, a result which has an analogue in the general stochastic epidemic.
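As a toy analogue, not the carrier-borne model of the paper, one can treat each carrier's time until location as an independent exponential lifetime; the area under the carrier trajectory is then the sum of those lifetimes, and its expectation is proportional to the number of carriers located, mirroring the proportionality result above.

```python
import random

def carrier_area(n_carriers, removal_rate, rng):
    """Area under the carrier-count trajectory for a pure death
    process: each carrier is located (removed) independently at rate
    removal_rate, so the area is a sum of i.i.d. exponential
    lifetimes (a toy stand-in for the epidemic model)."""
    return sum(rng.expovariate(removal_rate) for _ in range(n_carriers))

rng = random.Random(7)
areas = [carrier_area(20, 0.5, rng) for _ in range(20_000)]
mean_area = sum(areas) / len(areas)
# Expected area = n_carriers / removal_rate = 40 in this toy setting,
# i.e. proportional to the number of carriers located.
```

The proportionality constant here is simply the mean carrier lifetime 1/removal_rate.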


1969 ◽  
Vol 45 (2) ◽  
pp. 282 ◽  
Author(s):  
R. J. Greenfield ◽  
J. F. Lewis

2009 ◽  
Vol 42 (5) ◽  
pp. 783-792 ◽  
Author(s):  
A. Morawiec

Progress in experimental methods of serial sectioning and orientation determination opens new opportunities to study inter-crystalline boundaries in polycrystalline materials. In particular, macroscopic boundary parameters can now be measured automatically. With sufficiently large data sets, statistical analysis of interfaces between crystals is possible. The most basic and interesting issue is to determine the probability of occurrence of various boundaries in a given material. In order to define a boundary density function, a model of uniformity is needed. A number of such models can be conceived. It is proposed to use those derived from an assumed metric structure of the interface manifold. Some basic metrics on the manifold are explicitly given, and a number of notions and constructs needed for a strict definition of the boundary density function are considered. In particular, the crucial issue of the impact of symmetries is examined. The treatments of homo- and hetero-phase boundaries differ in some respects, and approaches applicable to each of these two cases are described. In order to make the abstract matter of the paper more accessible, a concrete boundary parameterization is used and some examples are given.
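One concrete baseline for such a model of uniformity can be sketched for the rotational part of the boundary parameters alone, ignoring boundary-plane parameters and crystal symmetry (both of which the paper treats): sample rotations uniformly with respect to the Haar measure on SO(3) and examine the rotation-angle distribution, against which observed boundary populations could be compared.

```python
import math
import random

def random_unit_quaternion(rng):
    """Haar-uniform random rotation: a normalised 4D Gaussian vector
    gives a unit quaternion uniform on the 3-sphere."""
    q = [rng.gauss(0.0, 1.0) for _ in range(4)]
    n = math.sqrt(sum(c * c for c in q))
    return [c / n for c in q]

def rotation_angle(q):
    """Rotation angle in [0, pi] encoded by unit quaternion q."""
    return 2.0 * math.acos(min(1.0, abs(q[0])))

rng = random.Random(42)
angles = [rotation_angle(random_unit_quaternion(rng))
          for _ in range(100_000)]
mean_angle = sum(angles) / len(angles)
# For Haar-uniform rotations (no crystal symmetry applied), the angle
# density is (1 - cos w)/pi, with mean angle pi/2 + 2/pi (~126.5 deg).
```

Applying crystal symmetry operators would fold this distribution into a fundamental zone, which is where the paper's careful treatment of symmetries becomes essential.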


2001 ◽  
Vol 27 (9) ◽  
pp. 547-553
Author(s):  
Slawomir Dorosiewicz

This paper contains a definition of the extremum of integrable functions (e.g., the mode of a density function). It appears to be a generalization of the well-known standard definition and can be applied in estimation theory to extend the maximum likelihood method.
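As a simple empirical illustration of the object being generalized, the mode of a density, the following is a crude histogram-based mode estimator applied to samples from a known distribution (this is not the author's generalized definition, only a concrete baseline):

```python
import random

def histogram_mode(samples, n_bins=30):
    """Crude mode estimate: the centre of the fullest histogram bin."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for x in samples:
        i = min(int((x - lo) / width), n_bins - 1)
        counts[i] += 1
    k = counts.index(max(counts))
    return lo + (k + 0.5) * width

rng = random.Random(3)
samples = [rng.gauss(5.0, 2.0) for _ in range(100_000)]
mode_est = histogram_mode(samples)  # true mode of N(5, 2) is 5.0
```

The accuracy of such an estimator is limited by the bin width, which is one motivation for more careful definitions of the extremum of an integrable function.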

