Variational AutoEncoder to Identify Anomalous Data in Robots

Robotics ◽  
2021 ◽  
Vol 10 (3) ◽  
pp. 93
Author(s):  
Luigi Pangione ◽  
Guy Burroughes ◽  
Robert Skilton

For robotic systems operating in challenging environments, it is crucial to identify faults as early as possible. In such environments it is not always possible to explore the whole fault space, so anomalous data can act as a broader surrogate, where an anomaly may represent a fault or a precursor to one. This paper proposes a method for identifying anomalous data from a robot while using minimal nominal data for training. A Monte Carlo ensemble-sampled Variational AutoEncoder was used to classify live data as nominal or anomalous through reconstruction. The method was tested on simulated anomalies injected into real data, demonstrating that the technique can reliably identify an anomaly without any prior knowledge of the system; the proposed system obtained an F1-score of 0.85 in testing.
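The decision rule behind the approach can be sketched in a few lines: draw a Monte Carlo ensemble of stochastic reconstructions and flag data whose mean reconstruction error exceeds a threshold calibrated on nominal data. The `reconstruct` stub, the noise level, and the threshold value below are illustrative stand-ins, not the paper's trained VAE.

```python
import numpy as np

rng = np.random.default_rng(0)

def reconstruct(x, rng):
    # Hypothetical stand-in for one stochastic VAE decode: a denoised copy
    # of x (clipped to the nominal range) plus latent-sampling noise.
    return np.clip(x, -1.0, 1.0) + rng.normal(0.0, 0.05, size=x.shape)

def anomaly_score(x, n_samples=50, rng=rng):
    # Monte Carlo ensemble: average reconstruction error over repeated
    # samples from the (stochastic) latent space.
    errs = [np.mean((x - reconstruct(x, rng)) ** 2) for _ in range(n_samples)]
    return float(np.mean(errs))

# Threshold calibrated on minimal nominal data, e.g. a high percentile of
# nominal scores; 0.01 is an illustrative value for this toy setup.
THRESHOLD = 0.01

nominal = rng.normal(0.0, 0.3, size=100)   # stays inside the nominal range
anomalous = nominal + 5.0                  # simulated fault: large offset

print(anomaly_score(nominal) < THRESHOLD)    # nominal data: low error
print(anomaly_score(anomalous) > THRESHOLD)  # anomaly: large error
```

Because the score averages over many stochastic reconstructions, a single unlucky latent sample cannot trigger a false alarm on its own.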

Author(s):  
Lingtao Kong

The exponential distribution has been widely used in engineering and in the social and biological sciences. In this paper, we propose a new goodness-of-fit test for fuzzy exponentiality using the α-pessimistic value. The test statistic is based on Kullback-Leibler information. Using the Monte Carlo method, we obtain empirical critical points of the test statistic at four significance levels. To evaluate the performance of the proposed test, we compare it with four commonly used tests in a simulation study. The experiments show that the proposed test has higher power than the other tests in most cases; in particular, for the uniform and linear-failure-rate alternatives, our method performs best. A real data example illustrates the application of the test.


2012 ◽  
Vol 53 ◽  
Author(s):  
Gintautas Jakimauskas ◽  
Leonidas Sakalauskas

The efficiency of adding an auxiliary regression variable to the logit model when estimating small probabilities in large populations is considered. We consider two models for the distribution of the unknown probabilities: the probabilities follow a gamma distribution (model (A)), or the logits of the probabilities follow a Gaussian distribution (model (B)). In a modification of model (B), an additional regression variable is used for the Gaussian mean (model (BR)). We selected real data from the Database of Indicators of Statistics Lithuania (working-age persons recognized as disabled for the first time, by administrative territory, year 2010; number of populations K = 60), together with average annual population data by administrative territory. The auxiliary regression variable was based on the number of hospital discharges by administrative territory, year 2010. We obtained initial parameters using simple iterative procedures for models (A), (B) and (BR). At the second stage we performed various tests using Monte Carlo simulation with models (A), (B) and (BR). The main goal was to select an appropriate model and to propose recommendations for using the gamma and logit (with or without an auxiliary regression variable) models for Bayesian estimation. The results show that Monte Carlo simulation enables us to determine which estimation model is preferable.
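Model (B) and its Bayesian estimate can be sketched directly: logits of the K small probabilities are Gaussian, counts are binomial, and the posterior mean of each probability is approximated by Monte Carlo. The hyperparameters, population sizes, and the simple importance-sampling estimator below are illustrative assumptions, not the authors' fitted values or iterative procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Model (B): logits of small probabilities are Gaussian across K populations.
K = 60
mu, sigma = -6.0, 0.5                       # assumed hyperparameters
n = rng.integers(5_000, 50_000, size=K)     # assumed population sizes
p = 1.0 / (1.0 + np.exp(-rng.normal(mu, sigma, size=K)))
y = rng.binomial(n, p)                      # observed counts per territory

def posterior_mean_p(y_k, n_k, mu, sigma, draws=50_000):
    # Monte Carlo estimate of E[p | y] under the Gaussian-logit prior:
    # weight prior draws by the binomial likelihood (importance sampling).
    logits = rng.normal(mu, sigma, size=draws)
    ps = 1.0 / (1.0 + np.exp(-logits))
    logw = y_k * np.log(ps) + (n_k - y_k) * np.log1p(-ps)
    w = np.exp(logw - logw.max())           # stabilise before exponentiating
    return float(np.sum(w * ps) / np.sum(w))

est = np.array([posterior_mean_p(y[k], n[k], mu, sigma) for k in range(K)])
raw = y / n
# The Bayesian estimates shrink the raw proportions toward the prior mean,
# which is what stabilises small-probability estimation across territories.
print(est[:3], raw[:3])
```

Model (BR) would replace the constant `mu` with a regression on the auxiliary variable (hospital discharges), one mean per territory.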


Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-15
Author(s):  
Hisham M. Almongy ◽  
Ehab M. Almetwally ◽  
Randa Alharbi ◽  
Dalia Alnagar ◽  
E. H. Hafez ◽  
...  

This paper is concerned with estimation of the parameters of the Weibull generalized exponential distribution (WGED) based on an adaptive Type-II progressive (ATIIP) censored sample. Maximum likelihood estimation (MLE), maximum product spacing (MPS), and Bayesian estimation based on Markov chain Monte Carlo (MCMC) methods are applied to find the best estimation method. A Monte Carlo simulation is used to compare the three estimation methods based on the ATIIP-censored sample, and bootstrap confidence intervals are also constructed. We analyze real data on single carbon fibers and electrical data to show how the schemes work in practice.
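The MLE and bootstrap-interval ingredients can be sketched with a plain two-parameter Weibull standing in for the three-parameter WGED, on complete rather than ATIIP-censored data; both simplifications are assumptions for brevity.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(3)

# Simulate complete Weibull data (a stand-in for WGED lifetimes).
shape_true, scale_true = 1.5, 2.0
x = weibull_min.rvs(shape_true, scale=scale_true, size=2_000, random_state=rng)

# MLE of shape and scale; floc=0 fixes the location parameter.
shape_hat, loc_hat, scale_hat = weibull_min.fit(x, floc=0)

# Percentile bootstrap confidence interval for the shape parameter:
# refit on resampled data and take empirical quantiles.
boot = [weibull_min.fit(rng.choice(x, size=x.size), floc=0)[0]
        for _ in range(200)]
ci = np.percentile(boot, [2.5, 97.5])

print(shape_hat, scale_hat, ci)
```

Bayesian MCMC and MPS estimation would replace the `fit` call with, respectively, posterior sampling of the parameters and maximisation of the product of spacings between ordered observations.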


2020 ◽  
Vol 9 (1) ◽  
pp. 47-60
Author(s):  
Samir K. Ashour ◽  
Ahmed A. El-Sheikh ◽  
Ahmed Elshahhat

In this paper, Bayesian and non-Bayesian estimation of a two-parameter Weibull lifetime model in the presence of progressive first-failure censored data with binomial random removals are considered. Based on the s-normal approximation to the asymptotic distribution of the maximum likelihood estimators, two-sided approximate confidence intervals for the unknown parameters are constructed. Using gamma conjugate priors, several Bayes estimates and associated credible intervals are obtained under the squared error loss function. The proposed estimators cannot be expressed in closed form and are evaluated numerically by suitable iterative procedures. A Bayesian approach is developed using Markov chain Monte Carlo techniques to generate samples from the posterior distributions and, in turn, to compute the Bayes estimates and associated credible intervals. To analyze the performance of the proposed estimators, a Monte Carlo simulation study is conducted. Finally, a real data set is discussed for illustration.
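The censoring scheme itself is the least familiar ingredient and can be sketched concretely. The function below generates a progressively censored sample with binomial random removals (a simplified version of the first-failure scheme): after each observed failure, a binomial number of survivors is withdrawn at random, and all survivors are removed after the m-th failure.

```python
import numpy as np

rng = np.random.default_rng(4)

def progressive_censored_sample(lifetimes, m, p, rng):
    # After the i-th observed failure, withdraw
    # R_i ~ Binomial(remaining - (m - i), p) surviving units at random,
    # so that at least m - i units always remain to fail.
    pool = sorted(lifetimes)
    observed = []
    for i in range(1, m + 1):
        observed.append(pool.pop(0))   # i-th smallest remaining lifetime fails
        r = len(pool) if i == m else rng.binomial(len(pool) - (m - i), p)
        for _ in range(r):
            pool.pop(rng.integers(len(pool)))  # withdraw a random survivor
    return np.array(observed)

n, m = 30, 10
lifetimes = rng.weibull(1.5, size=n) * 2.0   # Weibull(1.5, scale 2) lifetimes
sample = progressive_censored_sample(lifetimes, m, p=0.3, rng=rng)
print(sample.size)   # → 10
```

The likelihood for the Weibull parameters is then built from the m observed failure times together with the removal counts, which is what the iterative ML and MCMC procedures in the paper operate on.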


Author(s):  
Raúl O. Fernández ◽  
J. Eduardo Vera-Valdés

This chapter uses simulation analysis to assess the performance of some of the most popular unit-root and change-in-persistence tests. The authors do this by means of Monte Carlo simulations. The findings suggest that these tests perform worse than expected when dealing with some of the processes commonly believed to describe economic and financial data. The results signal that extreme care should be taken when trying to support a theory using real data: a blind practitioner could obtain misleading implications almost surely. As an empirical exercise, the authors show that the considered tests find evidence of a unit root process in the US house price index. Nonetheless, as the simulation analysis shows, extreme caution should be taken when interpreting these results.
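The Monte Carlo design can be sketched for the simplest case: simulate AR(1) paths, compute a Dickey-Fuller statistic on each, and record the rejection rate under the null (size) and under a stationary alternative (power). The no-constant statistic and the asymptotic 5% critical value of about −1.95 are standard choices assumed here, not the chapter's exact specification.

```python
import numpy as np

rng = np.random.default_rng(5)

def df_tstat(y):
    # Dickey-Fuller t-statistic without drift: regress Δy_t on y_{t-1}.
    dy, ylag = np.diff(y), y[:-1]
    rho = (ylag @ dy) / (ylag @ ylag)
    resid = dy - rho * ylag
    s2 = (resid @ resid) / (len(dy) - 1)
    return rho / np.sqrt(s2 / (ylag @ ylag))

def rejection_rate(phi, T=200, reps=2_000, crit=-1.95):
    # Share of simulated AR(1) paths whose statistic falls below the
    # asymptotic 5% Dickey-Fuller critical value (no-constant case).
    count = 0
    for _ in range(reps):
        e = rng.normal(size=T)
        y = np.empty(T)
        y[0] = e[0]
        for t in range(1, T):
            y[t] = phi * y[t - 1] + e[t]
        count += df_tstat(y) < crit
    return count / reps

size = rejection_rate(1.0)    # empirical size under a true unit root
power = rejection_rate(0.9)   # power against a stationary near-unit root
print(size, power)
```

The chapter's point emerges when the data-generating process is changed to something more realistic than a clean AR(1): the empirical size and power can then drift far from their nominal values.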


2020 ◽  
Vol 2020 ◽  
pp. 1-14
Author(s):  
Mohammed Obeidat ◽  
Amjad Al-Nasser ◽  
Amer I. Al-Omari

This paper studies estimation of the parameters of the generalized Gompertz distribution based on a ranked-set sample (RSS). Maximum likelihood (ML) and Bayesian approaches are considered. Approximate confidence intervals for the unknown parameters are constructed using both the normal approximation to the asymptotic distribution of the ML estimators and bootstrapping methods. Bayes estimates and credible intervals of the unknown parameters are obtained using differential evolution Markov chain Monte Carlo and Lindley's methods. The proposed methods are compared via Monte Carlo simulation studies and an example employing real data. The performance of both ML and Bayes estimates improves under RSS compared with simple random sampling (SRS), regardless of the sample size. Bayes estimates outperform the ML estimates for small samples, while the reverse holds for moderate and large samples.
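Ranked-set sampling itself is easy to sketch: in each cycle, draw m sets of m units, rank each set, and keep the i-th order statistic from the i-th set. The normal population below is an illustrative stand-in for the generalized Gompertz model, and perfect ranking is assumed.

```python
import numpy as np

rng = np.random.default_rng(6)

def ranked_set_sample(draw, m, cycles):
    # One cycle: m independent sets of m units; keep the i-th order
    # statistic of the i-th set (perfect ranking assumed).
    out = []
    for _ in range(cycles):
        for i in range(m):
            out.append(np.sort(draw(m))[i])
    return np.array(out)

draw = lambda k: rng.normal(10.0, 2.0, size=k)   # illustrative population
m, cycles = 4, 250
rss = ranked_set_sample(draw, m, cycles)
srs = draw(m * cycles)                           # SRS of the same total size

# Both means estimate the population mean; the stratification across order
# statistics typically makes the RSS mean the more efficient estimator.
print(rss.mean(), srs.mean())
```

Note the cost asymmetry: the RSS of size m × cycles requires ranking m² × cycles drawn units, which is why RSS is attractive when ranking is cheap but exact measurement is expensive.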


2015 ◽  
Vol 8 (4) ◽  
pp. 3471-3523 ◽  
Author(s):  
J. C. Corbin ◽  
A. Othman ◽  
J. D. Haskins ◽  
J. D. Allan ◽  
B. Sierau ◽  
...  

Abstract. The errors inherent in the fitting and integration of the pseudo-Gaussian ion peaks in Aerodyne High-Resolution Aerosol Mass Spectrometers (HR-AMSs) have not previously been addressed as a source of imprecision for these instruments. This manuscript evaluates the significance of these uncertainties and proposes a method for their estimation in routine data analysis. Peak-fitting uncertainties, the most complex source of integration uncertainties, are found to be dominated by errors in m/z calibration. These calibration errors comprise significant amounts of both imprecision and bias, and vary in magnitude from ion to ion. The magnitude of these m/z calibration errors is estimated for an exemplary data set and used to construct a Monte Carlo model, which reproduced the observed trends in fits to the real data well. The empirically-constrained model is used to show that the imprecision in the fitted height of isolated peaks scales linearly with the peak height (i.e., as n^1), thus contributing a constant-relative-imprecision term to the overall uncertainty. This constant relative imprecision term dominates the Poisson counting imprecision term (which scales as n^0.5) at high signals. The previous HR-AMS uncertainty model therefore underestimates the overall fitting imprecision. The constant relative imprecision in fitted peak height for isolated peaks in the exemplary data set was estimated as ~4%, and the overall peak-integration imprecision was approximately 5%. We illustrate the importance of this constant relative imprecision term by performing Positive Matrix Factorization (PMF) on a synthetic HR-AMS data set with and without its inclusion. Finally, the ability of an empirically-constrained Monte Carlo approach to estimate the fitting imprecision for an arbitrary number of known overlapping peaks is demonstrated. Software to estimate these error terms in new data sets is available upon request.
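The constant-relative-imprecision mechanism can be demonstrated with a toy Monte Carlo: jitter the centre of a pseudo-Gaussian peak (mimicking m/z-calibration error), refit the height with the centre fixed at its nominal position, and compare the relative spread of fitted heights across peak heights. The peak width, jitter magnitude, and grid below are illustrative assumptions, not the paper's instrument parameters.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(7)

def peak(x, h, mu, sigma=0.01):
    # Pseudo-Gaussian ion peak of height h centred at mu.
    return h * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def relative_height_imprecision(h_true, jitter_sd=0.002, reps=300):
    # Monte Carlo model of m/z-calibration error: the true centre is
    # jittered, but the fit assumes the nominal centre (as in routine
    # analysis); only the height is free.
    x = np.linspace(-0.05, 0.05, 101)
    heights = []
    for _ in range(reps):
        y = peak(x, h_true, rng.normal(0.0, jitter_sd))
        popt, _ = curve_fit(lambda xx, h: peak(xx, h, 0.0), x, y, p0=[h_true])
        heights.append(popt[0])
    heights = np.array(heights)
    return float(np.std(heights) / np.mean(heights))

rel_small = relative_height_imprecision(100.0)
rel_large = relative_height_imprecision(10_000.0)
print(rel_small, rel_large)   # nearly identical: imprecision scales as n^1
```

Because the calibration error enters multiplicatively, the relative imprecision is independent of the peak height, whereas Poisson counting noise (not modelled here) would shrink relative to the signal as n^0.5.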


Author(s):  
Hiba Zeyada Muhammed ◽  
Essam Abd Elsalam Muhammed

In this paper, Bayesian and non-Bayesian estimation of the shape parameter of the inverted Topp-Leone distribution are studied for complete and randomly censored samples. The maximum likelihood estimator (MLE) and the Bayes estimator of the unknown parameter are proposed. The Bayes estimates (BEs) are computed under the squared error loss (SEL) function using Markov chain Monte Carlo (MCMC) techniques, with the Metropolis-Hastings algorithm used to generate the posterior samples. Asymptotic, bootstrap (p, t), and highest posterior density intervals are computed. A Monte Carlo simulation is performed to compare the performance of the proposed methods, and one real data set is analyzed for illustration.
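The Metropolis-Hastings machinery can be sketched for a generic one-parameter model. An exponential rate with a gamma prior stands in for the inverted Topp-Leone shape parameter here, because its conjugacy makes the exact posterior mean available to check the chain against; the random-walk proposal and prior values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)

x = rng.exponential(scale=1.0 / 2.0, size=100)   # data with true rate 2.0
a0, b0 = 1.0, 1.0                                # gamma(a0, b0) prior

def log_post(lam):
    # Log posterior up to a constant: gamma prior times exponential likelihood.
    if lam <= 0:
        return -np.inf
    return (a0 - 1 + len(x)) * np.log(lam) - lam * (b0 + x.sum())

chain, lam = [], 1.0
for _ in range(20_000):
    prop = lam + rng.normal(0.0, 0.3)            # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(lam):
        lam = prop                               # accept; otherwise keep lam
    chain.append(lam)
chain = np.array(chain[5_000:])                  # discard burn-in

# Under SEL, the Bayes estimate is the posterior mean; for this conjugate
# pair the exact value is (a0 + n) / (b0 + sum x).
exact = (a0 + len(x)) / (b0 + x.sum())
print(np.mean(chain), exact)
```

The same loop serves the actual model by swapping `log_post` for the inverted Topp-Leone log-likelihood plus log-prior; the highest posterior density interval is then read off the sorted chain.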


Author(s):  
Lc Granadi Suhaidir ◽  
S Sumijan ◽  
Yuhandri Yunus

Kerinci Regency was established on November 10, 1957 from the division of three provinces: West Sumatra, Riau, and Jambi. The regency, nicknamed the City of Sakti Alam Kerinci, has a population of 253,258 people, covers an area of 3,808 km², and consists of 16 sub-districts, so training, technology, and the improvement of human resources are needed in various aspects of Kerinci society. This study determines the accuracy of the Monte Carlo simulation method by comparing simulation results with real data. The main data used were from 2017, 2018, and 2019, and the variable studied was the frequency of student scores in learning activities. The score data were processed using the Monte Carlo method, assisted by Microsoft Excel for the manual calculations. Student score data for 2017 were used to predict 2018, the 2018 data to predict 2019, and the 2019 data will be used to predict 2020. The highest prediction accuracy obtained was 96%, with several competencies sharing the same value, and the average prediction accuracy across the 7 competencies was 95%. The test results clearly formed the interval boundaries. With an accuracy rate of 95%, the method can be recommended to help the UPTD Kerinci District Work Training Center predict the level of understanding of its students.
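The classic Monte Carlo simulation procedure used here can be sketched in code: build a cumulative probability distribution from one year's score frequencies, then map uniform random numbers onto the score intervals to simulate the next year. The score categories and frequency counts below are hypothetical, not the UPTD's actual data.

```python
import numpy as np

rng = np.random.default_rng(9)

scores = ["poor", "fair", "good", "excellent"]
freq_2019 = np.array([5, 15, 25, 10])        # hypothetical score frequencies

prob = freq_2019 / freq_2019.sum()
cumulative = np.cumsum(prob)                 # interval boundaries in [0, 1]

def simulate_year(n_students, rng):
    # Map each uniform random number to the score interval it falls in,
    # then count how many students land in each category.
    draws = rng.uniform(size=n_students)
    idx = np.searchsorted(cumulative, draws)
    return np.bincount(idx, minlength=len(scores))

sim_2020 = simulate_year(55, rng)
print(dict(zip(scores, sim_2020.tolist())))
```

Accuracy is then measured by comparing the simulated frequencies for a year against that year's real frequencies once they become available.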


2021 ◽  
Author(s):  
Rofeide Jabbari

In this thesis we study and analyze the pricing of barrier and barrier crack options under a time-changed Levy process. Oil and gasoline in Canada are the underlying commodities of interest. To characterize the dynamics of oil and gasoline prices, Black-Scholes and time-changed models based on the Levy process are proposed, and real data from the Canadian oil and gas market are used to verify the models. While Monte Carlo methods are the well-known and dominant approach to price calculation, we propose a Fourier transform (FT) method for pricing, which offers important advantages over Monte Carlo, such as computation speed, without compromising accuracy. The method is also applied to crack spread contracts to reduce risk.
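The Monte Carlo baseline the FT method is compared against can be sketched for an up-and-out barrier call under plain Black-Scholes dynamics; the time-changed Levy model would replace the Gaussian log-return increments below, and all market parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(10)

def mc_up_and_out_call(s0, k, barrier, r, sigma, T, steps=252, paths=10_000):
    # Simulate GBM log-price paths on a daily grid; a path that ever
    # touches the barrier is knocked out and pays nothing.
    dt = T / steps
    z = rng.normal(size=(paths, steps))
    log_s = np.log(s0) + np.cumsum(
        (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    s = np.exp(log_s)
    alive = ~(s >= barrier).any(axis=1)
    payoff = np.where(alive, np.maximum(s[:, -1] - k, 0.0), 0.0)
    return float(np.exp(-r * T) * payoff.mean())

price = mc_up_and_out_call(s0=100.0, k=100.0, barrier=130.0,
                           r=0.03, sigma=0.25, T=1.0)
print(price)   # well below the vanilla call: many ITM paths are knocked out
```

The path dependence is why Monte Carlo must simulate whole trajectories here, and why a transform-based pricer that avoids path simulation can be dramatically faster at the same accuracy.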

