Predictive Maintenance Scheduling with Failure Rate Described by Truncated Normal Distribution

Sensors ◽  
2020 ◽  
Vol 20 (23) ◽  
pp. 6787
Author(s):  
Iwona Paprocka ◽  
Wojciech M. Kempa ◽  
Grzegorz Ćwikła

This study focuses on a method for risk assessment, planning of technical inspections of machines, and optimization of production tasks. Any unpredicted failure invalidates the production plans: production processes must be rescheduled, the costs of unused machine capacity and the losses due to poor-quality products increase, and additional costs arise for the human resources, equipment, and materials used during maintenance. The method reflects the operation of the production system and the nature of the disturbances, allowing unknown parameters related to machine reliability to be estimated. The machine failure frequency is described with the normal distribution truncated to the positive half of the axis; in production practice, this distribution is commonly used to describe the phenomenon of irregularities. The presented method is an extension of the Six Sigma concept of monitoring and continuous control aimed at eliminating and preventing inconsistencies in processes and the resulting products. Reliability characteristics are used to develop predictive schedules, which are assessed using the criteria of solution and quality robustness. Estimation methods for the parameters describing disturbances are compared for different job shop scheduling problems; the estimation method based on the maximum likelihood approach allowed more accurate prediction for the scheduling problems. The paper presents a practical example of the application of the proposed method to electric steering gears.
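
As a rough illustration of the fitting step described above, the following sketch estimates the parameters of a normal distribution truncated to the positive half-axis by maximum likelihood. The synthetic inter-failure times and all names are assumptions for demonstration, not the authors' data or implementation.

```python
# Illustrative sketch (not the authors' code): maximum likelihood fit of a
# normal distribution truncated to (0, inf), as used to model machine
# failure frequency / times between failures.
import numpy as np
from scipy import stats, optimize

def neg_log_likelihood(params, data):
    """Negative log-likelihood of N(mu, sigma) truncated to (0, inf)."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    a = (0.0 - mu) / sigma          # standardized lower truncation point
    b = np.inf                      # no upper truncation
    return -np.sum(stats.truncnorm.logpdf(data, a, b, loc=mu, scale=sigma))

# Hypothetical inter-failure times (hours) standing in for real machine data.
rng = np.random.default_rng(0)
sample = stats.truncnorm.rvs(-2.0, np.inf, loc=120.0, scale=60.0,
                             size=200, random_state=rng)

result = optimize.minimize(neg_log_likelihood,
                           x0=[np.mean(sample), np.std(sample)],
                           args=(sample,), method="Nelder-Mead")
mu_hat, sigma_hat = result.x
print(f"estimated mu = {mu_hat:.1f}, sigma = {sigma_hat:.1f}")
```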

Symmetry ◽  
2019 ◽  
Vol 11 (2) ◽  
pp. 165 ◽  
Author(s):  
Arun Sangaiah ◽  
Mohsen Suraki ◽  
Mehdi Sadeghilalimi ◽  
Seyed Bozorgi ◽  
Ali Hosseinabadi ◽  
...  

In a real manufacturing environment, the set of tasks to be scheduled changes over time, which means that scheduling problems are dynamic. In addition, to adapt manufacturing systems to fluctuations such as machine failures and the emergence of bottleneck machines, various kinds of flexibility are built into these systems. For the first time, this research considers operational flexibility together with the flexibility offered by Parallel Machines (PM) with non-uniform speeds in a Dynamic Job Shop (DJS), within a Flexible Dynamic Job-Shop with Parallel Machines (FDJSPM) model. After modeling the problem, an algorithm based on the principles of the Genetic Algorithm (GA) with dynamic two-dimensional chromosomes is proposed. The results of the proposed algorithm, compared with meta-heuristic results from the literature, show an improvement of the solutions by 1.34 percent across different problem dimensions.
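
The exact chromosome encoding is the authors'; the sketch below only illustrates the general idea of a two-dimensional chromosome for a flexible job shop with parallel machines of non-uniform speed (one dimension ordering the operations, one assigning machines), decoded greedily into a makespan. The instance data, names, and decoder are invented for demonstration.

```python
# Rough illustration (not the paper's implementation) of a two-dimensional
# chromosome: one row gives the release order of job operations, the other
# the machine chosen for each operation on parallel machines of different speed.
import random

# Hypothetical instance: base processing times per (job, operation) and
# relative speeds of the parallel machines.
base_time = {(0, 0): 4, (0, 1): 3, (1, 0): 2, (1, 1): 5}
machine_speed = [1.0, 1.5]          # machine 1 is 50% faster

def decode(op_sequence, machine_assignment):
    """Greedy schedule builder: returns the makespan of the chromosome."""
    machine_free = [0.0] * len(machine_speed)
    job_free = {}
    for (job, op) in op_sequence:
        m = machine_assignment[(job, op)]
        duration = base_time[(job, op)] / machine_speed[m]
        start = max(machine_free[m], job_free.get(job, 0.0))
        machine_free[m] = start + duration
        job_free[job] = start + duration
    return max(machine_free)

# One random chromosome: an operation order respecting precedence within jobs,
# plus a machine assignment for every operation.
ops = [(0, 0), (1, 0), (0, 1), (1, 1)]
assignment = {op: random.randrange(len(machine_speed)) for op in ops}
print("makespan:", decode(ops, assignment))
```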


Mathematics ◽  
2020 ◽  
Vol 8 (9) ◽  
pp. 1578 ◽  
Author(s):  
Hazem Al-Mofleh ◽  
Ahmed Z. Afify ◽  
Noor Akma Ibrahim

In this paper, a new two-parameter generalized Ramos–Louzada distribution is proposed. The proposed model provides more flexibility in modeling data with increasing, decreasing, J-shaped, and reversed-J shaped hazard rate functions. Several statistical properties of the model were derived. The unknown parameters of the new distribution were explored using eight frequentist estimation approaches. These approaches are important for developing guidelines on choosing the best method of estimation for the model parameters, which would be of great interest to practitioners and applied statisticians. Detailed numerical simulations are presented to examine the bias and the mean square error of the proposed estimators. The best estimation method and the ordering of the estimators' performance were determined using the partial and overall ranks of all estimation methods for various parameter combinations. The performance of the proposed distribution is illustrated using two real datasets from the fields of medicine and geology; both datasets show that the new model is more appropriate than the Marshall–Olkin exponential, exponentiated exponential, beta exponential, gamma, Poisson–Lomax, Lindley geometric, generalized Lindley, and Lindley distributions, among others.
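
The generalized Ramos–Louzada density is not reproduced in the abstract, so the sketch below uses an exponential model purely as a stand-in to show the kind of bias/MSE simulation used to rank estimation methods. The two estimators compared (MLE and a median-based estimator) and all settings are illustrative assumptions.

```python
# Illustrative bias/MSE simulation for comparing estimation methods.
# An exponential(rate) model stands in for the distribution studied in the paper.
import numpy as np

rng = np.random.default_rng(1)
true_rate, n, replications = 2.0, 30, 5000

mle, median_based = [], []
for _ in range(replications):
    x = rng.exponential(scale=1.0 / true_rate, size=n)
    mle.append(1.0 / x.mean())                         # MLE of the rate
    median_based.append(np.log(2.0) / np.median(x))    # median-based estimator

for name, est in [("MLE", np.array(mle)), ("median-based", np.array(median_based))]:
    bias = est.mean() - true_rate
    mse = ((est - true_rate) ** 2).mean()
    print(f"{name:13s} bias = {bias:+.4f}   MSE = {mse:.4f}")
```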


Mathematics ◽  
2020 ◽  
Vol 9 (1) ◽  
pp. 49
Author(s):  
Siqi Chen ◽  
Wenhao Gui

Estimation of the unknown parameters of a truncated distribution from censored data has wide practical application, and the truncated normal distribution is often better suited to lifetime data than the normal distribution. This article makes statistical inferences about the parameters of the truncated normal distribution using adaptive progressive type II censored data. First, the estimates are calculated by the maximum likelihood method, and the observed and expected Fisher information matrices are derived to establish asymptotic confidence intervals. Second, Bayesian estimation under three loss functions is studied: the point estimates are calculated by the Lindley approximation, and an importance sampling technique is applied to obtain the Bayes estimates and build the associated highest posterior density credible intervals. Bootstrap confidence intervals are constructed for comparison. Monte Carlo simulations and a data analysis are employed to present the performance of the various methods. Finally, optimal censoring schemes are obtained under different criteria.
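
The paper works with adaptive progressive type II censoring; as a much simpler indication of how a censored likelihood is assembled, the sketch below handles only conventional type II censoring (the first m order statistics observed, the rest censored) for a normal distribution truncated to the positive half-axis. Data and names are assumptions.

```python
# Minimal sketch (conventional type II censoring, not the adaptive progressive
# scheme of the paper): log-likelihood combining the densities of the m observed
# order statistics with the survival term of the n - m censored units.
import numpy as np
from scipy import stats, optimize

def censored_neg_loglik(params, observed, n):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    a, b = (0.0 - mu) / sigma, np.inf
    m = len(observed)
    loglik = np.sum(stats.truncnorm.logpdf(observed, a, b, loc=mu, scale=sigma))
    loglik += (n - m) * stats.truncnorm.logsf(observed.max(), a, b, loc=mu, scale=sigma)
    return -loglik

rng = np.random.default_rng(2)
n, m = 60, 45                                          # censor the largest 15 units
full = np.sort(stats.truncnorm.rvs(-1.5, np.inf, loc=3.0, scale=2.0,
                                   size=n, random_state=rng))
observed = full[:m]

fit = optimize.minimize(censored_neg_loglik, x0=[observed.mean(), observed.std()],
                        args=(observed, n), method="Nelder-Mead")
print("mu, sigma estimates:", fit.x)
```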


2021 ◽  
Author(s):  
Patrick Rodler ◽  
Erich Teppan ◽  
Dietmar Jannach

Optimal production planning in the form of job shop scheduling problems (JSSP) is a vital problem in many industries. In practice, however, the volume of jobs (orders) can exceed the production capacity for a given planning horizon. A reasonable aim in such situations is to complete as many jobs as possible in time (while postponing the rest). We call this the Job Set Optimization Problem (JOP). Technically, when constraint programming is used for solving JSSPs, the objective formulated in the constraint model can be adapted so that the constraint solver addresses the JOP, i.e., searches for schedules that maximize the number of jobs finished in time. However, even highly specialized solvers that have proved very powerful for JSSPs may struggle with the increased complexity of the reformulated problem and may fail to generate a JOP solution within practical computation timeouts. As a remedy, we suggest a framework for solving multiple randomly modified instances of a relaxation of the JOP, which allows a JOP solution to be approached gradually. The main idea is to have one module compute subset-minimal job sets to be postponed, and another ensure that random job sets are found. Different algorithms from the literature can be used to realize these modules. Using IBM's cutting-edge CP Optimizer suite, experiments on well-known JSSP benchmark problems show that the proposed framework consistently schedules more jobs for various computation timeouts than a standalone constraint solver approach.
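
The paper itself uses IBM's CP Optimizer; purely as a readily available stand-in, the sketch below expresses the reformulated objective (maximize the number of jobs finished within the horizon) with Google OR-Tools CP-SAT on an invented two-machine toy instance. It is a minimal sketch of the modeling idea, not the authors' framework.

```python
# Sketch of "maximize the number of timely finished jobs" as a CP model,
# using OR-Tools CP-SAT as a stand-in for CP Optimizer.  Toy instance only.
from ortools.sat.python import cp_model

# job -> list of (machine, duration) operations, processed in order
jobs = {0: [(0, 3), (1, 2)], 1: [(1, 4), (0, 3)], 2: [(0, 2), (1, 2)]}
horizon_capacity = 7          # deliberately too tight to finish every job

model = cp_model.CpModel()
machine_intervals = {0: [], 1: []}
job_done = {}

for j, ops in jobs.items():
    end_prev = None
    for k, (machine, dur) in enumerate(ops):
        start = model.NewIntVar(0, 100, f"s_{j}_{k}")
        end = model.NewIntVar(0, 100, f"e_{j}_{k}")
        interval = model.NewIntervalVar(start, dur, end, f"i_{j}_{k}")
        machine_intervals[machine].append(interval)
        if end_prev is not None:
            model.Add(start >= end_prev)          # respect operation order
        end_prev = end
    done = model.NewBoolVar(f"done_{j}")
    model.Add(end_prev <= horizon_capacity).OnlyEnforceIf(done)
    job_done[j] = done

for intervals in machine_intervals.values():
    model.AddNoOverlap(intervals)                 # one operation per machine at a time

model.Maximize(sum(job_done.values()))
solver = cp_model.CpSolver()
solver.Solve(model)
print("jobs finished in time:", [j for j, d in job_done.items() if solver.Value(d)])
```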


Entropy ◽  
2021 ◽  
Vol 23 (2) ◽  
pp. 186
Author(s):  
Xinyi Zeng ◽  
Wenhao Gui

In this paper, the parameter estimation problem of a truncated normal distribution is discussed based on generalized progressive hybrid censored data. The maximum likelihood estimates of the unknown quantities are first derived through the Newton–Raphson algorithm and the expectation maximization algorithm. Based on the asymptotic normality of the maximum likelihood estimators, we develop asymptotic confidence intervals. The percentile bootstrap method is also employed for the case of small sample sizes. Further, the Bayes estimates are evaluated under various loss functions such as the squared error, general entropy, and LINEX loss functions. The Tierney–Kadane approximation, as well as the importance sampling approach, is applied to obtain the Bayesian estimates under proper prior distributions, and the associated Bayesian credible intervals are constructed. Extensive numerical simulations are implemented to compare the performance of the different estimation methods. Finally, an authentic example is analyzed to illustrate the inference approaches.
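
For the percentile bootstrap mentioned for small samples, a minimal sketch is given below: the truncated normal is refit on resampled data and empirical quantiles of the refitted location parameter form the interval. The crude fitting helper, the data, and all settings are assumptions for illustration, not the authors' procedure for censored data.

```python
# Sketch of a percentile bootstrap interval for the location parameter of a
# normal distribution truncated to (0, inf); complete (uncensored) data only.
import numpy as np
from scipy import stats, optimize

def fit_truncnorm(x):
    """Crude MLE of (mu, sigma) for a normal truncated to (0, inf)."""
    nll = lambda p: np.inf if p[1] <= 0 else -np.sum(
        stats.truncnorm.logpdf(x, -p[0] / p[1], np.inf, loc=p[0], scale=p[1]))
    return optimize.minimize(nll, x0=[x.mean(), x.std()], method="Nelder-Mead").x

rng = np.random.default_rng(3)
data = stats.truncnorm.rvs(-1.0, np.inf, loc=2.0, scale=2.0, size=25, random_state=rng)

boot_mu = []
for _ in range(500):
    resample = rng.choice(data, size=data.size, replace=True)
    boot_mu.append(fit_truncnorm(resample)[0])

lo, hi = np.percentile(boot_mu, [2.5, 97.5])
print(f"95% percentile bootstrap CI for mu: ({lo:.2f}, {hi:.2f})")
```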


2020 ◽  
Vol 2020 (66) ◽  
pp. 101-110
Author(s):  
Azhar Kadhim Jbarah ◽  
Prof Dr. Ahmed Shaker Mohammed

The research estimates the effect of the cultivated area of the barley crop on the production of that crop by estimating the regression model representing the relationship between these two variables. The tests indicated that the time series of the response variable is stationary, whereas the series of the explanatory variable is nonstationary and integrated of order one, I(1); they also indicate that the random error terms are autocorrelated and can be modeled by mixed autoregressive-moving average ARMA(p,q) models. Given these results, the classical estimation method cannot be used for the regression model, so a fully modified M method, which is a robust estimation method, was adopted. The estimated results indicate a significant positive relationship between barley production and cultivated area.
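
The fully modified M estimator itself is not a standard routine in common Python libraries, so the sketch below only illustrates the diagnostic workflow described above (stationarity tests, then a regression whose errors follow an ARMA process) on synthetic stand-in data; variable names and numbers are invented.

```python
# Diagnostic workflow sketch on synthetic data: ADF stationarity tests, then a
# regression of production on cultivated area with ARMA(1,1) errors.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(4)
n = 60
area = np.cumsum(rng.normal(0.5, 1.0, n)) + 50        # nonstationary regressor
noise = pd.Series(rng.normal(0, 1, n)).rolling(2).mean().fillna(0).to_numpy()
production = 1.2 * area + 5 + noise                   # response

for name, series in [("production", production), ("area", area)]:
    pvalue = adfuller(series)[1]
    print(f"ADF p-value for {name}: {pvalue:.3f}")

# Regression with ARMA(1,1) errors (stand-in for the robust FM-M estimation).
model = SARIMAX(production, exog=area, order=(1, 0, 1)).fit(disp=False)
print(model.params)
```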


2017 ◽  
Vol 928 (10) ◽  
pp. 58-63 ◽  
Author(s):  
V.I. Salnikov

The initial subject of study is the sums of measurement errors. The errors are assumed to follow the normal law, but with a limitation on the value of the marginal error, Δpred = 2m. It is known that for each number of summed errors ni there is a confidence interval within which the value of the sum equals zero. The paradox is that the probability of such an event is zero; therefore, it is impossible to determine the value of ni at which the sum becomes zero. The article proposes instead to consider the event that a sum of errors varies within the 2m limits with a confidence level of 0.954. Within a group, all the sums then have a limiting error, and these tolerances are proposed for use as discrepancy limits in geodesy instead of 2m·√ni. The concept of "the law of the truncated normal distribution with Δpred = 2m" is suggested to be introduced.
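
A small numerical illustration of this idea is sketched below, with invented values: individual errors are drawn from a normal law truncated at the marginal error 2m, and the 0.954 quantile of the absolute sum of ni such errors is compared with the classical tolerance 2m·√ni.

```python
# Numerical illustration (invented values): sums of errors from a normal law
# truncated at +/- 2m, compared against the classical tolerance 2m*sqrt(n_i).
import numpy as np
from scipy import stats

m = 1.0                      # standard error of a single measurement
n_i = 16                     # number of summed errors
trials = 100_000

rng = np.random.default_rng(5)
errors = stats.truncnorm.rvs(-2, 2, loc=0.0, scale=m,
                             size=(trials, n_i), random_state=rng)
sums = np.abs(errors.sum(axis=1))

empirical = np.quantile(sums, 0.954)
classical = 2 * m * np.sqrt(n_i)
print(f"0.954 quantile of |sum|: {empirical:.2f}   classical 2m*sqrt(n): {classical:.2f}")
```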


2019 ◽  
Vol 24 (3) ◽  
pp. 80 ◽  
Author(s):  
Prasert Sriboonchandr ◽  
Nuchsara Kriengkorakot ◽  
Preecha Kriengkorakot

This research project aims to study and develop the differential evolution (DE) algorithm for solving the flexible job shop scheduling problem (FJSP). The developed algorithms were evaluated on their ability to find the best solution, and the results were subsequently compared with meta-heuristics from the literature. For the FJSP, comparing the problem groups with respect to makespan and mean relative error (MRE), the small-sized Kacem problems used value adjusting with "DE/rand/1" and exponential crossover at position 2; moreover, value adjusting with "DE/best/2" and exponential crossover at position 2 gave an MRE of 3.25. For the medium-sized Brandimarte problems, value adjusting with "DE/best/2" and exponential crossover at position 2 gave an MRE of 7.11. For the large-sized Dauzere-Peres and Paulli problems, value adjusting with "DE/best/2" and exponential crossover at position 2 gave an MRE of 4.20. Compared with other methods, the MRE of the improved DE (7.11) was lower than that of the particle swarm optimization (PSO) method of Girish and Jawahar (7.75). For large-sized problems, the MRE of the improved DE (4.20) was lower than that of the 1ST-DE method of Warisa (5.08). The results further show that basic DE and improved DE with jump search are effective compared to the other meta-heuristic methods; hence, they can be used to solve the FJSP.
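
For readers unfamiliar with the operator names used above, the sketch below gives generic continuous-space versions of DE/rand/1 and DE/best/2 mutation and of exponential crossover; mapping such vectors onto FJSP schedules requires a decoding step that is not shown, and none of this is the authors' implementation.

```python
# Generic DE operators: DE/rand/1 and DE/best/2 mutation, exponential crossover.
import numpy as np

rng = np.random.default_rng(6)

def mutate_rand_1(pop, f=0.5):
    r1, r2, r3 = pop[rng.choice(len(pop), 3, replace=False)]
    return r1 + f * (r2 - r3)

def mutate_best_2(pop, best, f=0.5):
    r1, r2, r3, r4 = pop[rng.choice(len(pop), 4, replace=False)]
    return best + f * (r1 - r2) + f * (r3 - r4)

def exponential_crossover(target, mutant, cr=0.7):
    """Copy a contiguous (wrapping) block of mutant genes into the target."""
    d = len(target)
    trial = target.copy()
    j = rng.integers(d)                  # starting position
    length = 1
    while length < d and rng.random() < cr:
        length += 1
    for k in range(length):
        trial[(j + k) % d] = mutant[(j + k) % d]
    return trial

population = rng.random((10, 6))
best = population[np.argmin(population.sum(axis=1))]   # toy fitness: sum of genes
mutant = mutate_best_2(population, best)
print(exponential_crossover(population[0], mutant))
```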

