Testing the Validity of a Link Function Assumption in Repeated Type-II Censored General Step-Stress Experiments

Sankhya B ◽  
2021 ◽  
Author(s):  
Stefan Bedbur ◽  
Thomas Seiche

Abstract: In step-stress experiments, test units are successively exposed to higher, usually increasing, levels of stress in order to cause earlier failures and shorten the duration of the experiment. When parameters are associated with the stress levels, one problem is to estimate the parameter corresponding to normal operating conditions based on failure data obtained under higher stress levels. For this purpose, a link function connecting parameters and stress levels is usually assumed, the validity of which is often at the discretion of the experimenter. In a general step-stress model based on multiple samples of sequential order statistics, we provide exact statistical tests to decide whether the assumption of some link function is adequate. The null hypothesis of a proportional, linear, power, or log-linear link function is considered in detail, and associated inferential results are stated. In every case except that of the linear link function, the derived test statistics are shown to have a single distribution under the null hypothesis, which simplifies the computation of (exact) critical values. Asymptotic results are addressed, and a power study is performed for the test of a log-linear link function. Some improvements of the tests in terms of power are discussed.
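
For orientation, the four link-function null hypotheses named above can be written out explicitly. The notation below is ours, not necessarily the authors': θ_j denotes the model parameter at stress level x_j, and α, β are unknown link parameters.

% Candidate link functions relating the model parameter \theta_j
% to the stress level x_j; \alpha and \beta are unknown link parameters.
\begin{align*}
\text{proportional:} \quad & \theta_j = \alpha\, x_j \\
\text{linear:}       \quad & \theta_j = \alpha + \beta\, x_j \\
\text{power:}        \quad & \theta_j = \alpha\, x_j^{\beta} \\
\text{log-linear:}   \quad & \log\theta_j = \alpha + \beta\, x_j
\end{align*}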

Mathematics ◽  
2021 ◽  
Vol 9 (8) ◽  
pp. 817
Author(s):  
Fernando López ◽  
Mariano Matilla-García ◽  
Jesús Mur ◽  
Manuel Ruiz Marín

A novel general method for constructing nonparametric hypothesis tests based on the field of symbolic analysis is introduced in this paper. Several existing tests based on symbolic entropy, which have been used for testing central hypotheses in several branches of science (particularly in economics and statistics), are particular cases of this general approach. This family of symbolic tests relies on few assumptions, which broadens the applicability of any symbolic-based test. Additionally, as a theoretical application of the method, we construct and put forward four new statistics to test the null hypothesis of spatiotemporal independence. There are very few tests of this kind in the specialized literature. The new tests were evaluated by means of several Monte Carlo experiments. The results highlight the outstanding performance of the proposed tests.
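
As a concrete instance of the symbolic-entropy idea, the following sketch implements a permutation-entropy independence test in the spirit of this literature (cf. Matilla-García and Ruiz Marín, 2008): the series is symbolized into ordinal patterns of length m, and the statistic G = 2n(log m! − H) is referred to a chi-square distribution with m! − 1 degrees of freedom under the i.i.d. null. The implementation details are ours, not the paper's.

import math
from collections import Counter

import numpy as np
from scipy.stats import chi2

def symbolic_entropy_test(x, m=3):
    """Permutation-entropy independence test (illustrative sketch)."""
    x = np.asarray(x)
    n = len(x) - m + 1                          # number of embedded windows
    # Symbolization: map each window to its ordinal (rank) pattern.
    counts = Counter(tuple(np.argsort(x[i:i + m])) for i in range(n))
    p = np.array(list(counts.values())) / n
    H = -np.sum(p * np.log(p))                  # symbolic (permutation) entropy
    G = 2.0 * n * (math.log(math.factorial(m)) - H)
    return G, chi2.sf(G, df=math.factorial(m) - 1)

# An i.i.d. series should not reject the independence null.
rng = np.random.default_rng(0)
print(symbolic_entropy_test(rng.normal(size=2000)))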


Electronics ◽  
2021 ◽  
Vol 10 (5) ◽  
pp. 581
Author(s):  
Yongbae Kim ◽  
Juyong Back ◽  
Jongweon Kim

A tachograph in a vehicle records operating conditions, such as speed, distance, brake operation, acceleration, GPS information, etc., at intervals of one second. For accidents, the tachograph records information such as the acceleration and direction of the vehicle at intervals of 1/100 s for 10 s before and after the accident occurs as collision data. A vehicle equipped with a tachograph is obliged to upload operation data to administrative organizations periodically, either via auxiliary storage devices such as USB-attached external memory or online via wireless communication. If there is a problem with the recorded contents, the data may be at risk of being tampered with during the uploading process. This research proposed blockchain-based tamper-resistance technology for data in online and offline environments. The suggested algorithm is a new data recording mechanism that operates in the low-level hardware of digital tachographs and uses a light blockchain to provide tamper resistance in both online and offline situations. The average encoding time of the proposed light blockchain was 1.85 ms/Mb, while the average decoding time was 1.65 ms/Mb. With the outliers in statistical tests removed, the estimated average encoding and decoding times were 1.32 ms/Mb and 1.29 ms/Mb, respectively, and the tamper verification test detected all of the tampered data.
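
The paper's recording mechanism is implemented in tachograph hardware and is not reproduced here; the sketch below only illustrates the generic blockchain-style idea it builds on. Each record stores the hash of its predecessor, so modifying any stored record invalidates every subsequent hash. All names and structures are ours.

import hashlib
import json

GENESIS = "0" * 64

def chain_records(records):
    """Link records so that tampering with any entry breaks the chain."""
    chain, prev_hash = [], GENESIS
    for rec in records:
        block = {"data": rec, "prev_hash": prev_hash}
        prev_hash = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()
        block["hash"] = prev_hash
        chain.append(block)
    return chain

def verify(chain):
    """Recompute every hash; any edit to data or linkage is detected."""
    prev_hash = GENESIS
    for block in chain:
        body = {"data": block["data"], "prev_hash": block["prev_hash"]}
        h = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev_hash or block["hash"] != h:
            return False
        prev_hash = h
    return True

chain = chain_records([{"t": 0, "speed": 42.0}, {"t": 1, "speed": 43.5}])
assert verify(chain)
chain[0]["data"]["speed"] = 99.9    # tamper with the first record
assert not verify(chain)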


1989 ◽  
Vol 7 (4) ◽  
pp. 267-270 ◽  
Author(s):  
Douglas G. Bonett ◽ 
P. M. Bentler ◽ 
J. Arthur Woodward

2020 ◽  
Author(s):  
Pieter-Jan Daems ◽  
Y. Guo ◽  
S. Sheng ◽  
C. Peeters ◽  
P. Guillaume ◽  
...  

Abstract: Wind energy is one of the largest sources of renewable energy in the world. To further reduce the operations and maintenance (O&M) costs of wind farms, it is essential to be able to accurately pinpoint the root causes of the different failure modes of interest. An example of a failure mode that is not yet fully understood is white etching cracks (WEC), which can reduce bearing lifetime to 5–10% of its design value. Multiple hypotheses concerning its cause are available in the literature. To validate or disprove these hypotheses, it is essential to have historic high-frequency measurement data (e.g., load and vibration levels) available; in time, this will allow linking the operating history of each turbine with its failure data. This paper discusses the dynamic loading on the turbine during certain events (e.g., emergency stops, run-ups, and normal operating conditions). By combining the number of specific events that each turbine has seen with the severity of each event, it becomes possible to assess which turbines are most likely to show signs of damage.
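
The ranking idea in the last sentence can be made concrete as a weighted exposure score per turbine. Event types and severity weights below are hypothetical placeholders, not values from the paper.

# Combine per-turbine event counts with assumed severity weights to rank
# turbines by accumulated dynamic-load exposure (illustrative numbers only).
severity = {"emergency_stop": 10.0, "run_up": 3.0, "normal_hour": 1.0}

event_counts = {
    "T01": {"emergency_stop": 12, "run_up": 40, "normal_hour": 8000},
    "T02": {"emergency_stop": 2, "run_up": 55, "normal_hour": 8200},
}

scores = {t: sum(severity[e] * n for e, n in c.items())
          for t, c in event_counts.items()}
for turbine, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(turbine, score)    # higher score = inspect first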


2021 ◽  
Author(s):  
Chenyang Bi ◽  
Jordan E. Krechmer ◽  
Manjula R. Canagaratna ◽  
Gabriel Isaacman-VanWertz

Abstract. Quantitative calibration of analytes using chemical ionization mass spectrometers (CIMS) has been hindered by the lack of commercially available standards of atmospheric oxidation products. To accurately calibrate analytes without standards, techniques have recently been developed to log-linearly correlate analyte sensitivity with instrument operating conditions. However, there is an inherent bias when applying log-linear calibration relationships that is typically ignored. In this study, we examine the bias in a log-linear-based calibration curve based on prior mathematical work. We quantify the potential bias within the context of a CIMS-relevant relationship between analyte sensitivity and instrument voltage differentials. Uncertainty in three parameters has the potential to contribute to the bias, specifically the inherent extent to which the nominal relationship can capture true sensitivity, the slope of the relationship, and the voltage differential below which maximum sensitivity is achieved. Using a previously published case study, we estimate an average bias of 30%, reaching one order of magnitude for less sensitive compounds in some circumstances. A parameter-explicit solution is proposed in this work for completely removing the inherent bias generated in the log-linear calibration relationships. A simplified correction method is also suggested for cases where a comprehensive bias correction is not possible due to unknown uncertainties of calibration parameters; it is shown to eliminate the bias on average, but not for each individual compound.


2021 ◽  
Vol 14 (10) ◽  
pp. 6551-6560
Author(s):  
Chenyang Bi ◽  
Jordan E. Krechmer ◽  
Manjula R. Canagaratna ◽  
Gabriel Isaacman-VanWertz

Abstract. Quantitative calibration of analytes using chemical ionization mass spectrometers (CIMSs) has been hindered by the lack of commercially available standards of atmospheric oxidation products. To accurately calibrate analytes without standards, techniques have recently been developed to log-linearly correlate analyte sensitivity with instrument operating conditions. However, there is an inherent bias when applying log-linear calibration relationships that is typically ignored. In this study, we examine the bias in a log-linear-based calibration curve based on prior mathematical work. We quantify the potential bias within the context of a CIMS-relevant relationship between analyte sensitivity and instrument voltage differentials. Uncertainty in three parameters has the potential to contribute to the bias, specifically the inherent extent to which the nominal relationship can capture true sensitivity, the slope of the relationship, and the voltage differential below which maximum sensitivity is achieved. Using a previously published case study, we estimate an average bias of 30 %, reaching 1 order of magnitude for less sensitive compounds in some circumstances. A parameter-explicit solution is proposed in this work for completely removing the inherent bias generated in the log-linear calibration relationships. A simplified correction method is also suggested for cases where a comprehensive bias correction is not possible due to unknown uncertainties of calibration parameters; it is shown to eliminate the bias on average, but not for each individual compound.
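
The source of the inherent bias has a simple lognormal intuition: if the log of sensitivity is modeled with symmetric scatter, exponentiating the fitted relationship yields a multiplicative error whose mean exceeds 1 (E[e^ε] = e^{σ²/2}). The Monte Carlo sketch below illustrates this with made-up parameters; it is not the authors' parameterization or correction method.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical log-linear calibration: log10(sensitivity) = a + b * dV,
# with residual scatter sigma in log10 space (all values illustrative).
a, b, sigma = 4.0, -0.05, 0.4
dV = rng.uniform(0.0, 20.0, size=100_000)        # voltage differentials
true_s = 10.0 ** (a + b * dV)
# The nominal relationship misses true sensitivity by a lognormal factor.
estimated_s = true_s * 10.0 ** rng.normal(0.0, sigma, size=dV.size)

ratio = estimated_s / true_s
print(f"mean multiplicative bias: {ratio.mean():.2f}")              # ~1.5
print(f"analytic lognormal bias:  {np.exp(0.5 * (sigma * np.log(10)) ** 2):.2f}")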


2019 ◽  
Vol 4 (1) ◽  
pp. 141-156
Author(s):  
Bradley Lail ◽  
Robert C. Lipe ◽  
Han S. Yi

Our paper examines inconsistent conclusions regarding the accrual anomaly and demonstrates the importance of aligning regression specifications with hypotheses. Richardson, Sloan, Soliman, and Tuna (2005) conclude that accruals are mispriced and the mispricing seems to increase as accrual reliability decreases. Barone and Magilke (2009) and Ball, Gerakos, Linnainmaa, and Nikolaev (2016) conclude that cash flows rather than accruals are mispriced. We show that the divergent conclusions come from misalignment between the null hypothesis and regression specification in Richardson et al. (2005). In addition, analysis of the contemporaneous relations between stock returns and components of earnings supports an initial underreaction to cash flows by investors. We fail to detect links between the reliability measures in Richardson et al. (2005) and investor behavior once we align the statistical tests with the null hypothesis. Our reexamination of prior findings benefits accounting academics, standard setters, and others interested in how investors use earnings components. JEL Classifications: M41. Data Availability: All data used in this study are publicly available from the sources identified in the text.
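
Concretely, "aligning the statistical test with the null hypothesis" here means testing equality of the pricing coefficients on earnings components rather than testing each coefficient against zero. The synthetic sketch below illustrates such a Wald test; the variable names and data are our stand-ins, not the papers' research designs.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data in which accruals and cash flows are priced identically.
rng = np.random.default_rng(4)
n = 1000
df = pd.DataFrame({"accruals": rng.normal(size=n),
                   "cashflows": rng.normal(size=n)})
df["returns"] = df["accruals"] + df["cashflows"] + rng.normal(size=n)

fit = smf.ols("returns ~ accruals + cashflows", data=df).fit()
# Aligned null H0: the coefficient on accruals equals the coefficient on
# cash flows (no differential mispricing), tested jointly via a Wald test.
print(fit.f_test("accruals = cashflows"))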


1992 ◽  
Vol 13 (9) ◽  
pp. 553-555 ◽  
Author(s):  
Leon F. Burmeister ◽  
David Bimbaum ◽  
Samuel B. Sheps

A variety of statistical tests of a null hypothesis commonly are used in biomedical studies. While these tests are the mainstay for justifying inferences drawn from data, they have important limitations. This report discusses the relative merits of two different approaches to data analysis and display, and recommends the use of confidence intervals rather than classic hypothesis testing.

Formulae for a confidence interval surrounding the point estimate of an average value take the form: d = ±zσ/√n, where "d" represents the average difference between central and extreme values, "z" is derived from the density function of a known distribution, and "σ/√n" represents the magnitude of sampling variability. Transposition of terms yields the familiar formula for hypothesis testing of normally distributed data (without applying the finite population correction factor): z = d/(σ/√n).
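
A worked numerical example of the two formulas above (values are illustrative):

import math

d, sigma, n = 2.5, 8.0, 100     # mean difference, SD, sample size
z_crit = 1.96                   # two-sided 95% normal quantile
se = sigma / math.sqrt(n)       # sigma/sqrt(n): sampling variability

ci = (d - z_crit * se, d + z_crit * se)
z_stat = d / se                 # transposed form used for hypothesis testing

print(f"95% CI: ({ci[0]:.2f}, {ci[1]:.2f})")    # (0.93, 4.07), excludes 0
print(f"z = {z_stat:.2f}")                      # 3.12 > 1.96, reject H0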


2006 ◽  
Vol 43 (04) ◽  
pp. 1137-1154 ◽  
Author(s):  
Michael V. Boutsikas ◽  
Markos V. Koutras

The discrete scan statistic in a binary (0-1) sequence of n trials is defined as the maximum number of successes within any k consecutive trials (n and k, n ≥ k, being two positive integers). It has been used in many areas of science (quality control, molecular biology, psychology, etc.) to test the null hypothesis of uniformity against a clustering alternative. In this article we provide a compound Poisson approximation and subsequently use it to establish asymptotic results for the distribution of the discrete scan statistic as n, k → ∞ and the success probability of the trials is kept fixed. An extreme value theorem is also provided for the celebrated Erdős-Rényi statistic.
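
The definition translates directly into a sliding-window maximum, as in the following sketch (our illustration, not code from the article):

import numpy as np

def discrete_scan_statistic(x, k):
    """Maximum number of successes in any k consecutive trials (n >= k)."""
    window_sums = np.convolve(np.asarray(x), np.ones(k, dtype=int), mode="valid")
    return int(window_sums.max())

# Clustered successes inflate the scan statistic relative to a uniform
# i.i.d. Bernoulli sequence, which is what the test exploits.
rng = np.random.default_rng(2)
iid = rng.binomial(1, 0.2, size=200)
clustered = iid.copy()
clustered[100:110] = 1          # plant a cluster of successes
print(discrete_scan_statistic(iid, 10), discrete_scan_statistic(clustered, 10))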


Processes ◽  
2020 ◽  
Vol 8 (9) ◽  
pp. 1084
Author(s):  
Chuanqi Lu ◽  
Zhi Zheng ◽  
Shaoping Wang

Axial piston pumps are crucial for the safe operation of hydraulic systems and usually work under variable operating conditions. However, deterioration status recognition for such pumps under variable conditions has rarely been reported until now. Therefore, it is valuable to develop effective methods suitable for processing variable conditions. Firstly, considering that information entropy is highly robust to variable conditions and that empirical mode decomposition (EMD) is well suited to nonlinear and nonstationary signals, a new degradation feature parameter combining information entropy theory and EMD, named local instantaneous energy moment entropy, is proposed in this paper. To obtain more accurate degradation features, a waveform-matching extrema mirror extension of EMD, which suppresses the end effects of the decomposition, was employed to decompose the pump's outlet pressure signals, taking the quasi-periodic characteristics of the signals into consideration. Subsequently, given that different failure modes of pumps degrade at different rates in practice, which makes it difficult to recognize degradation status with modeling methods that require both normal and failure data, a Gaussian mixture model (GMM), which requires no failure data to build a degradation identification model, was introduced to derive a new degradation status index (DSI) that quantitatively assesses the degradation state of the pumps. Finally, the effectiveness of the proposed approach was validated using both simulations and experiments. The results demonstrate that the defined local instantaneous energy moment entropy effectively characterizes the degree of degradation of the pumps under variable operating conditions, and that the DSI derived from the GMM accurately identifies different degradation states when compared with previously published methods.
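
As a rough illustration of the GMM step, the sketch below fits a mixture to healthy-state features only and scores new data by log-likelihood, mapping the drop in likelihood to a [0, 1] degradation index. The feature data and the index mapping are our stand-ins; the paper's exact DSI definition and entropy features are not reproduced here.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
healthy = rng.normal(0.0, 1.0, size=(500, 2))    # stand-in for entropy features
gmm = GaussianMixture(n_components=2, random_state=0).fit(healthy)
baseline = gmm.score_samples(healthy).mean()     # healthy-state log-likelihood

def degradation_index(features):
    """0 = consistent with the healthy model, approaching 1 = degraded."""
    ll = gmm.score_samples(np.atleast_2d(features)).mean()
    return max(0.0, 1.0 - float(np.exp(ll - baseline)))

print(degradation_index(rng.normal(0.0, 1.0, size=(50, 2))))   # near 0
print(degradation_index(rng.normal(3.0, 1.0, size=(50, 2))))   # near 1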

