Estimation of probability density function of long-term strain measurement for reliability assessment

Author(s):  
K Chen ◽  
J Ko ◽  
X Hua ◽  
Y Ni


1972 ◽
Vol 16 (02) ◽  
pp. 113-123
Author(s):  
Alaa Mansour

Methods for predicting the probability of failure under extreme values of bending moment (primary loading only) are developed. To obtain an accurate estimate of the extreme values of the bending moment, order statistics are used. The wave bending moment amplitude, treated as a random variable, is assumed to follow, in general, a Weibull distribution, so that the results can be used for short-term as well as long-term analysis. The probability density function of the extreme values of the wave bending moment is obtained, and an estimate is made of the most probable value (that is, the mode) and other relevant statistics. The probability of exceeding a given value of wave bending moment in "n" records and during the operational lifetime of the ship is derived. Using this information, the probability of failure is obtained on the basis of an assumed normal probability density function of the resistive strength and a deterministic still-water bending moment. Charts showing the relations of the parameters in nondimensional form are presented. Examples of the use of the charts for long-term and short-term analysis in predicting extreme values of wave bending moment and the corresponding probability of failure are given.
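The order-statistics result described above can be sketched numerically. A minimal illustration, assuming a two-parameter Weibull amplitude distribution with shape k and scale c (illustrative values, not taken from the paper):

```python
import math

def weibull_cdf(x, k, c):
    """CDF of a two-parameter Weibull distribution with shape k, scale c."""
    return 1.0 - math.exp(-((x / c) ** k))

def prob_exceed_in_n(x, k, c, n):
    """P(largest of n independent records exceeds x) = 1 - F(x)**n."""
    return 1.0 - weibull_cdf(x, k, c) ** n

def mode_of_extreme(k, c, n):
    """Characteristic largest value (asymptotic most probable extreme):
    solves 1 - F(x) = 1/n, giving x = c * (ln n)**(1/k)."""
    return c * math.log(n) ** (1.0 / k)
```

For example, with n = 1000 records the probability that the largest record exceeds the characteristic largest value is 1 - (1 - 1/n)**n, close to 1 - 1/e.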


2001 ◽  
Vol 11 (07) ◽  
pp. 2007-2018 ◽  
Author(s):  
PAWEŁ GÓRA ◽  
ABRAHAM BOYARSKY

The problem of controlling a chaotic system is treated on a long-term statistical basis. Unlike the OGY targeting method, which exploits individual unstable orbits, this approach is concerned with targeting the density function of an invariant probability measure. Given a point transformation T possessing an invariant density function f, we choose a different probability density function f* to be the target. Using optimization methods, we construct a point transformation T*, close to T, whose invariant probability density function is f*, or close to f*.
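This is not the paper's optimization construction, but a minimal sketch of the underlying object it targets: the invariant density of a point transformation, estimated from a long-term orbit. The logistic map T(x) = 4x(1-x) is chosen here only because its invariant density is known in closed form:

```python
import math, random

def orbit_histogram(n_iter=200000, bins=20, seed=1):
    """Estimate the invariant density of T(x) = 4x(1-x) from a long orbit,
    as a normalized histogram over [0, 1]."""
    random.seed(seed)
    x = random.random()
    counts = [0] * bins
    for _ in range(n_iter):
        x = 4.0 * x * (1.0 - x)
        counts[min(int(x * bins), bins - 1)] += 1
    width = 1.0 / bins
    return [c / (n_iter * width) for c in counts]

def exact_density(x):
    """Known invariant density of the logistic map: 1 / (pi * sqrt(x(1-x)))."""
    return 1.0 / (math.pi * math.sqrt(x * (1.0 - x)))
```

The histogram converges to the closed-form density, illustrating the "long-term statistical" object the optimization acts on.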


2012 ◽  
Vol 1 (33) ◽  
pp. 127 ◽  
Author(s):  
Deborah Ann Villarroel-Lamb

A recently developed beach change model was investigated to assess its predictive capability with respect to shoreline change. This investigation formed part of a number of analyses being conducted to assess the capability of the numerical model. First, the model was compared to a commonly used commercial model to assess its output on wave and sediment responses. Second, the beach changes were investigated to determine a likely probability density function for the shoreline responses; a number of probability density functions were compared with the results and critical deductions were made. Third, the new beach change model has a distinctive feature that attempts to reduce the model run-time to promote greater use; this wave-averaging feature was investigated to determine model performance as parameters were changed. It was shown that the model compares favorably to the commercial package in some aspects, but not all. The shoreline response may be best described by a single probability density function, which makes it quite suitable for quantitative risk analyses. Finally, the wave-averaging feature can be used to reduce run-time, although this requires the user to apply sound judgment in the analyses.
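The distribution-comparison step described above can be sketched as a maximum-likelihood fit of candidate densities. The candidate set (normal vs. exponential) and the synthetic shoreline data below are illustrative assumptions, not output of the beach change model:

```python
import math, random, statistics

def normal_loglik(data):
    """Log-likelihood of data under a normal PDF with MLE parameters."""
    mu = statistics.fmean(data)
    sigma = statistics.pstdev(data)
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (x - mu) ** 2 / (2 * sigma ** 2) for x in data)

def exponential_loglik(data):
    """Log-likelihood under an exponential PDF; MLE rate is 1/mean."""
    lam = 1.0 / statistics.fmean(data)
    return sum(math.log(lam) - lam * x for x in data)

random.seed(0)
# Hypothetical shoreline responses (positive, roughly Gaussian)
shoreline = [random.gauss(5.0, 1.0) for _ in range(500)]
candidates = [("normal", normal_loglik(shoreline)),
              ("exponential", exponential_loglik(shoreline))]
best = max(candidates, key=lambda t: t[1])
```

Ranking candidates by log-likelihood (or an information criterion built on it) is one standard way to select the "single probability density function" used downstream in a quantitative risk analysis.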


2018 ◽  
Vol 611 ◽  
pp. A53 ◽  
Author(s):  
S. Jamal ◽  
V. Le Brun ◽  
O. Le Fèvre ◽  
D. Vibert ◽  
A. Schmitt ◽  
...  

Context. Future large-scale surveys, such as the ESA Euclid mission, will produce a large set of galaxy redshifts (≥10⁶) that will require fully automated data-processing pipelines to analyze the data, extract crucial information, and ensure that all requirements are met. A fundamental element in these pipelines is to associate with each galaxy redshift measurement a quality, or reliability, estimate.
Aims. In this work, we introduce a new approach to automating the spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function.
Methods. We propose to rephrase the spectroscopic redshift estimation in a Bayesian framework, in order to incorporate all sources of information and uncertainties related to the redshift estimation process and produce a redshift posterior probability density function (PDF). To automate the assessment of a reliability flag, we exploit key features of the redshift posterior PDF and machine learning algorithms.
Results. As a working example, public data from the VIMOS VLT Deep Survey are exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification in order to describe different types of redshift PDFs, but due to the subjective definition of these flags (classification accuracy ~58%), we soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions (classification accuracy ~98%), we projected unlabeled data from preliminary mock simulations for the Euclid space mission into this mapping to predict their redshift reliability labels.
Conclusions. Through the development of a methodology in which a system can build its own experience to assess the quality of a parameter, we are able to set a preliminary basis for an automated reliability assessment of spectroscopic redshift measurements. This newly defined method is very promising for next-generation large spectroscopic surveys from the ground and in space, such as Euclid and WFIRST.
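The unsupervised partitioning step can be sketched with a tiny two-cluster k-means over per-PDF feature vectors. The two features used here (e.g. posterior peak height and peak width) and the synthetic data are assumptions for illustration, not the paper's actual feature set:

```python
import random

def kmeans2(points, iters=20):
    """Two-cluster k-means; deterministic init from the first and last points."""
    centers = [points[0], points[-1]]
    for _ in range(iters):
        groups = [[], []]
        for p in points:
            # assign each point to its nearest center (squared Euclidean distance)
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[d.index(min(d))].append(p)
        # recompute each center as the mean of its group
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

rng = random.Random(1)
# Hypothetical (peak height, peak width) features for two reliability regimes
reliable = [(rng.gauss(0.9, 0.05), rng.gauss(0.01, 0.003)) for _ in range(100)]
unreliable = [(rng.gauss(0.3, 0.05), rng.gauss(0.10, 0.02)) for _ in range(100)]
centers, groups = kmeans2(reliable + unreliable)
```

New, unlabeled PDFs can then be "projected into this mapping" by assigning each one to its nearest cluster center, which is how a reliability label would be predicted for mock-survey data.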


2016 ◽  
Vol 2016 ◽  
pp. 1-9 ◽  
Author(s):  
Branimir Jaksic ◽  
Mihajlo Stefanovic ◽  
Danijela Aleksic ◽  
Dragan Radenkovic ◽  
Sinisa Minic

A macrodiversity system with a macrodiversity SC (selection combining) receiver and three microdiversity MRC (maximum ratio combining) receivers is considered. Independent k-μ short-term fading and correlated Gamma long-term fading are present at the inputs of the microdiversity MRC receivers. For this model, the probability density function and the cumulative distribution function of the microdiversity MRC receiver and macrodiversity SC receiver output signal envelopes are calculated. The influences of Gamma shadowing severity, k-μ multipath fading severity, Rician factor, and correlation coefficient on the probability density function and cumulative distribution function of the macrodiversity SC receiver output signal envelope are graphically presented.
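A simplified Monte Carlo sketch of the combining chain: three microdiversity MRC receivers feed one macrodiversity SC stage, as in the text, but Rayleigh branch fading is used here as a stand-in for the paper's k-μ/Gamma model, and all parameters are illustrative assumptions:

```python
import math, random

def mrc_envelope(branches):
    """MRC output envelope: root of the sum of squared branch envelopes."""
    return math.sqrt(sum(r * r for r in branches))

def sc_output(mrc_outputs):
    """SC stage selects the strongest microdiversity output."""
    return max(mrc_outputs)

def simulate(n_samples=2000, n_micro=3, n_branches=2, sigma=1.0, seed=0):
    """Sample the macrodiversity SC output envelope (Rayleigh branches)."""
    rng = random.Random(seed)
    def rayleigh():
        # inverse-CDF sampling; 1 - U avoids log(0)
        return sigma * math.sqrt(-2.0 * math.log(1.0 - rng.random()))
    return [sc_output([mrc_envelope([rayleigh() for _ in range(n_branches)])
                       for _ in range(n_micro)])
            for _ in range(n_samples)]
```

A histogram of `simulate()` approximates the SC output envelope PDF whose closed form the paper derives for the k-μ/Gamma case.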


1990 ◽  
Vol 122 ◽  
pp. 13-23
Author(s):  
A. Bianchini

Quiescent novae are more stable in mass transfer rate than dwarf novae. They may, however, show cyclical variations of their quiescent magnitudes on time scales of years, probably caused by solar-type activity cycles of the secondary. The probability density function of the periods of the cycles observed in CVs is similar to that for single stars. Sometimes, periodic or quasi-periodic light variations on time scales of tens to hundreds of days are also observed. Although the magnitudes of prenovae and postnovae are essentially the same, the definition of the magnitude of a quiescent nova is still uncertain. At present, the hibernation theory for old novae seems to be supported only by the observations of two very old novae.

