Maintenance management of load haul dumper using reliability analysis

2019 ◽  
Vol 26 (2) ◽  
pp. 290-310 ◽  
Author(s):  
Balaraju Jakkula ◽  
Govinda Raj M. ◽  
Murthy Ch.S.N.

Purpose The load haul dumper (LHD) is one of the main ore-transporting machines used in the underground mining industry. The reliability of an LHD is critical to achieving expected production targets, so equipment performance must be kept at its highest level. This can be accomplished only by reducing sudden breakdowns of components/subsystems in a complex system. Defective components/subsystems can be identified through downtime analysis, making it important to develop proper maintenance strategies for repairing or replacing them. Suitable maintenance management actions improve equipment performance. This paper aims to discuss this issue.

Design/methodology/approach Reliability analysis (a renewal approach) has been used to analyze the performance of the LHD machines. Best-fit distributions were allocated to the data sets using the Kolmogorov–Smirnov (K–S) test, and the parameters of the theoretical probability distributions were estimated by the maximum likelihood estimation (MLE) method.

Findings The independent and identically distributed (IID) assumption was validated through trend and serial correlation tests; since the data sets satisfied it, the renewal process approach was used for further investigation. The reliability of each individual subsystem was computed according to its best-fit distribution, and reliability-based preventive maintenance (PM) time schedules were calculated for an expected 90 percent reliability level.

Research limitations/implications As reliability analysis is a complex technique, it requires strategic decision-making knowledge for selecting the methodology to be used. Because the present case study comes from a public sector company operating under financial constraints, the conclusions/findings may not be universally applicable.

Originality/value The present study throws light on equipment that needs a tailored maintenance schedule, partly due to the peculiar mining conditions under which it operates. The study focuses on estimating the performance of four well-mechanized LHD systems through reliability, availability and maintainability (RAM) modeling. Based on the results, reasons for the performance drop of each machine were identified, and suitable recommendations were suggested for enhancing the performance of this capital-intensive production equipment. As maintenance management is the principal means of improving machinery performance, PM time intervals were estimated with respect to the expected reliability level.
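The reliability-based PM scheduling described above reduces to a simple computation once a best-fit distribution is known. A minimal sketch in Python, assuming a Weibull best fit with hypothetical shape and scale parameters (the paper's actual estimates are not reproduced here):

```python
import math

def pm_interval_weibull(beta, eta, target_reliability=0.90):
    """Largest operating time t for which a Weibull-distributed
    subsystem still meets R(t) = exp(-(t / eta) ** beta) >= target."""
    return eta * (-math.log(target_reliability)) ** (1.0 / beta)

# Hypothetical best-fit parameters for one LHD subsystem:
t_pm = pm_interval_weibull(beta=1.5, eta=400.0)  # PM interval in hours
```

Scheduling PM at `t_pm` keeps the subsystem at or above the expected 90 percent reliability level between interventions.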

2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Balaraju Jakkula ◽  
Govinda Raj Mandela ◽  
Murthy Ch S N

Purpose In the present worldwide situation, survival is a crucial concern for any business. A business cannot succeed unless it achieves the anticipated production levels, which is possible only by maintaining equipment at an adequate level. Load-haul-dumpers (LHDs), the main workhorses and massive transporting machines, are heavily utilized in underground mining operations. Despite their intensive usage, LHDs are prone to uneven and unexpected potential failures, which reduce the production and productivity of this capital-intensive equipment. To achieve a good profitability index, the required levels of equipment reliability and availability must be maintained, so estimating reliability and availability plays a critical role in evaluating equipment performance.

Design/methodology/approach In view of the significance of the present research, fault tree analysis (FTA), a well-suited technique, was utilized to assess the reliability of the LHD system based on its function flow diagram. Best-fit distributions were allocated to the data sets using the Kolmogorov–Smirnov (K–S) test, and the parameters of the theoretical probability distributions were estimated by maximum likelihood estimation (MLE). The failure rate of each LHD system was computed from the best-fit results in "Isograph Reliability Workbench 13.0". The reliability configuration of each LHD system was modeled using a reliability block diagram (RBD) as well as FTA.

Findings The independent and identically distributed (IID) assumption was validated through the statistic U-test (chi-squared test); since the data sets satisfied it, the renewal process approach was used for further investigation. Best-fit distributions were allocated to the data sets using the K–S test, and the parameters of the theoretical probability distributions were estimated by the MLE method. The reliability of each individual subsystem was computed according to its best-fit distribution. The deductive RBD method was used to investigate system reliability through graphical representations of the logic system, and the highest reliability observed was 69.44% (LH29). FTA was used to investigate the availability of each system, and the highest value observed was 79.51% (LH29). This technique also helps to identify the most critical parts/cut sets by means of the Fussell–Vesely (F–V) importance measure.

Research limitations/implications As reliability analysis is a complex technique, it requires strategic decision-making knowledge for selecting the methodology to be used. Because the present case study comes from a public sector company operating under financial constraints, the conclusions/findings may not be universally applicable.

Originality/value The present study throws light on equipment that needs tailored maintenance schedules, partly due to the peculiar mining conditions under which it operates. The analysis provides information on several aspects, such as the present working condition of the machines, the occurrence of various potential failure modes, the influence of those failure modes on performance, and reliable life aspects. These investigations also support the forecasting of necessary managerial practices or control measures, such as possible design modifications and component replacement actions, to ensure the required levels of equipment availability and utilization. Both qualitative and quantitative FTA were performed to determine the minimal/most influential cut sets of the system and to estimate overall system availability within the work environment. Based on the computed results, reasons for the performance drop of each machine were identified and suitable recommendations were suggested to improve the performance of these capital-intensive systems.
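The RBD and F–V computations mentioned above can be sketched briefly. The snippet below uses hypothetical subsystem reliabilities and cut sets (not the paper's data) and the standard rare-event approximation for Fussell–Vesely importance:

```python
from math import prod

# Hypothetical subsystem reliabilities for one LHD modeled as a series RBD:
subsystem_R = {"engine": 0.95, "transmission": 0.92,
               "hydraulics": 0.90, "brakes": 0.97}

def series_reliability(rel):
    # A series RBD works only if every block works.
    return prod(rel.values())

def fv_importance(cut_sets, component):
    # Each minimal cut set is a {component: failure probability} dict.
    # Rare-event approximation: system unavailability is roughly the sum
    # of cut-set probabilities; F-V importance is the share contributed
    # by cut sets containing the component.
    q_sys = sum(prod(cs.values()) for cs in cut_sets)
    q_comp = sum(prod(cs.values()) for cs in cut_sets if component in cs)
    return q_comp / q_sys
```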


2018 ◽  
Vol 35 (3) ◽  
pp. 821-842 ◽  
Author(s):  
Panagiotis Tsarouhas

Purpose The purpose of this paper is to provide the results of a complete reliability, availability, and maintainability (RAM) analysis using data sets from a wine packaging production line. Through the illustrated case study, the author demonstrates how RAM analysis is very useful for deciding maintenance intervals and for planning and organizing an adequate maintenance strategy.

Design/methodology/approach RAM analysis was performed for each machine using failure data. The parameters of some common probability distributions, such as the Weibull, exponential, lognormal, and normal distributions, were estimated using the Minitab software package, and an investigation was made to determine which of these distributions best characterizes the failure pattern at machine and line level. The reliability and maintainability of the wine packaging line and its machines were estimated at different mission times using their best-fit distributions. High-maintainability issues and potential factors, with their potential failure modes, were presented through a failure mode and effect analysis process.

Findings Analysis of the total downtime, breakdown frequency, reliability, and maintainability characteristics of the different machines shows that: first, the availability of the wine packaging line was 91.80 percent, so for the remaining 8.2 percent of the time the line is under repair; second, about two failures per shift occur on the line, with a mean time to repair (TTR) of 24 minutes per failure; third, there is no correlation between the times between failures and the TTRs for the line; and fourth, the three main factors affecting the maintainability process in the production line are resource availability, manpower management, and maintenance planning procedures.

Originality/value This study is anticipated to serve as an illuminating effort in conducting a complete RAM analysis in the much-advertised field of wine packaging production lines, on which little has been published regarding operational availability and equipment effectiveness. It can also serve as a valid data source for winery product manufacturers who wish to improve the design and operation of their production lines.
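The availability figure reported above follows from the standard steady-state relation A = MTBF / (MTBF + MTTR). A minimal sketch, where the MTBF is back-calculated from the abstract's reported 91.80 percent availability and 24-minute mean TTR (an implied value, not a figure from the paper):

```python
def availability(mtbf, mttr):
    # Steady-state availability; mtbf and mttr in the same time units.
    return mtbf / (mtbf + mttr)

# A 24-minute mean TTR with a ~268.7-minute implied MTBF
# reproduces an availability of about 91.8 percent:
a = availability(mtbf=268.7, mttr=24.0)
```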


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Suyog Subhash Patil ◽  
Anand K. Bewoor

Purpose India's textile industries play a vital role in the Indian economy and are among the largest consumers of thermal energy (steam power). Demand for steam in the process industries is increasing rapidly, and it can be met by increasing the capacity utilization of steam boilers. The purpose of this paper is to present a new approach to reliability analysis using an expert judgment method.

Design/methodology/approach A lack of adequate life data is one of the biggest challenges in the reliability analysis of mechanical systems. This research provides an expert judgment approach for assessing a boiler's reliability characteristics. For this purpose, expert opinions on time-to-failure and time-to-repair data were elicited in the form of statistical distributions. Reliability analysis of the boiler system is carried out both by the expert judgment method and by using the best-fit failure model, and the system reliability along with preventive maintenance intervals for all components is also evaluated.

Findings The reliability analysis results obtained by the expert judgment method and the best-fit failure model method show no significant differences; therefore, when insufficient data are available, the expert judgment method can be used effectively. The analysis identifies the feedwater tank, feedwater pump, supply water temperature sensor, strainer, return water temperature sensor, condensate filter, mechanical dust collector, coal crusher and fusible plug as critical components from a reliability perspective, and a preventive maintenance strategy is suggested for these components.

Originality/value In this research paper, a system reliability model based on the expert judgment method is developed; it can be used effectively where insufficient failure data are available. The paper is useful for the comparative evaluation of the reliability characteristics of a boiler system by the expert judgment method and the best-fit failure model method.
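One common way to combine expert-elicited time-to-failure distributions is a linear opinion pool. A minimal sketch, assuming each expert's opinion is encoded as a lognormal CDF with hypothetical parameters (the paper's elicitation protocol and estimates are not reproduced here):

```python
import math

def lognormal_cdf(t, mu, sigma):
    # CDF of a lognormal time-to-failure distribution.
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def pooled_cdf(t, experts, weights=None):
    # Linear opinion pool: weighted average of expert CDFs
    # (equal weights by default).
    if weights is None:
        weights = [1.0 / len(experts)] * len(experts)
    return sum(w * lognormal_cdf(t, mu, s)
               for w, (mu, s) in zip(weights, experts))

# Two hypothetical experts' (mu, sigma) judgments on feedwater-pump life:
experts = [(math.log(1200.0), 0.4), (math.log(1500.0), 0.6)]
p_fail = pooled_cdf(1000.0, experts)  # pooled probability of failure by 1,000 h
```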


2004 ◽  
Vol 101 (Supplement3) ◽  
pp. 326-333 ◽  
Author(s):  
Klaus D. Hamm ◽  
Gunnar Surber ◽  
Michael Schmücking ◽  
Reinhard E. Wurm ◽  
Rene Aschenbach ◽  
...  

Object. Innovative new software solutions may enable image fusion to produce the desired data superposition for precise target definition and follow-up studies in radiosurgery/stereotactic radiotherapy in patients with intracranial lesions. The aim is to integrate the anatomical and functional information completely into radiation treatment planning and to achieve an exact comparison in follow-up examinations. The special conditions and advantages of BrainLAB's fully automatic image fusion system are evaluated and described for this purpose.

Methods. In 458 patients, radiation treatment planning and some follow-up studies were performed using an automatic image fusion technique involving different imaging modalities. Each fusion was visually checked and corrected as necessary. The computerized tomography (CT) scans for radiation treatment planning (slice thickness 1.25 mm), as well as stereotactic angiography for arteriovenous malformations, were acquired using head fixation with a stereotactic arc or, in the case of stereotactic radiotherapy, with a relocatable stereotactic mask. Different magnetic resonance (MR) imaging sequences (T1, T2, and fluid-attenuated inversion-recovery images) and positron emission tomography (PET) scans were obtained without head fixation. Fusion results and their effects on radiation treatment planning and follow-up studies were analyzed. The precision of the automatic fusion depended primarily on image quality, especially the slice thickness and the field homogeneity when using MR images, as well as on patient movement during data acquisition. Fully automated image fusion of different MR, CT, and PET studies was performed for each patient. Only in a few cases was it necessary to correct the fusion manually after visual evaluation; these corrections were minor and did not materially affect treatment planning. High-quality fusion of thin slices of a region of interest with a complete head data set could be performed easily. The target volume for radiation treatment planning could be accurately delineated using the multimodal information provided by CT, MR, angiography, and PET studies. The fusion of follow-up image data sets yielded results that could be successfully compared and quantitatively evaluated.

Conclusions. Depending on the quality of the originally acquired images, automated image fusion can be a very valuable tool, allowing fast (∼1–2 minutes) and precise fusion of all relevant data sets. Fused multimodality imaging improves target volume definition for radiation treatment planning. High-quality follow-up image data sets should be acquired for image fusion to provide exactly comparable slices and volumetric results that will contribute to quality control.


2019 ◽  
Vol 45 (9) ◽  
pp. 1183-1198
Author(s):  
Gaurav S. Chauhan ◽  
Pradip Banerjee

Purpose Recent papers on target capital structure show that debt ratios seem to vary widely in space and time, implying that functional specifications of target debt ratios are of little empirical use. Further, target behavior cannot be judged correctly from debt ratios alone, as they can revert for purely mechanical reasons. The purpose of this paper is to develop an alternative strategy for testing the target capital structure.

Design/methodology/approach The authors treat a major "shock" to the debt ratio as an event and interpret a subsequent reversion as movement toward a mean or target debt ratio. In this way, the authors no longer need to specify target debt ratios as a function of firm-specific variables or any other rigid functional form.

Findings Similar to the broad empirical evidence in developed economies, there is no perceptible and systematic mean reversion by Indian firms. However, unlike in developed countries, proportionate usage of debt to finance firms' marginal financing deficits is extensive; equity is used rather sparingly.

Research limitations/implications The trade-off theory can be convincingly refuted, at least for the emerging market of India. The paper stimulates further research into the reasons for the specific financing behavior of emerging market firms.

Practical implications The results show that firms' financing choices depend not only on firm-specific variables but also on the financial markets in which they operate.

Originality/value This study assesses mean reversion in debt ratios in a unique but reassuring manner. The results are confirmed by extensive calibration of the testing strategy using simulated data sets.
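The idea of testing for reversion after a shock, rather than fitting a rigid functional target, can be illustrated with a simulated partial-adjustment process. This is a purely illustrative sketch, not the authors' estimation procedure; setting speed to zero yields a random walk with no target behavior:

```python
import random

def simulate_debt_ratio(n, target=0.30, speed=0.0, noise=0.02, seed=1):
    # Partial adjustment: d[t+1] = d[t] + speed * (target - d[t]) + e[t].
    rng = random.Random(seed)
    d = [target + 0.20]  # start displaced from target, as after a "shock"
    for _ in range(n - 1):
        d.append(d[-1] + speed * (target - d[-1]) + rng.gauss(0.0, noise))
    return d
```

Comparing post-shock paths for speed > 0 against speed = 0 mimics the test: systematic reversion appears only when an economic target actually exists.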


Genetics ◽  
2003 ◽  
Vol 163 (3) ◽  
pp. 1177-1191 ◽  
Author(s):  
Gregory A Wilson ◽  
Bruce Rannala

Abstract A new Bayesian method that uses individual multilocus genotypes to estimate rates of recent immigration (over the last several generations) among populations is presented. The method also estimates the posterior probability distributions of individual immigrant ancestries, population allele frequencies, population inbreeding coefficients, and other parameters of potential interest. The method is implemented in a computer program that relies on Markov chain Monte Carlo techniques to carry out the estimation of posterior probabilities. The program can be used with allozyme, microsatellite, RFLP, SNP, and other kinds of genotype data. We relax several assumptions of early methods for detecting recent immigrants, using genotype data; most significantly, we allow genotype frequencies to deviate from Hardy-Weinberg equilibrium proportions within populations. The program is demonstrated by applying it to two recently published microsatellite data sets for populations of the plant species Centaurea corymbosa and the gray wolf species Canis lupus. A computer simulation study suggests that the program can provide highly accurate estimates of migration rates and individual migrant ancestries, given sufficient genetic differentiation among populations and sufficient numbers of marker loci.


Stats ◽  
2021 ◽  
Vol 4 (1) ◽  
pp. 184-204
Author(s):  
Carlos Barrera-Causil ◽  
Juan Carlos Correa ◽  
Andrew Zamecnik ◽  
Francisco Torres-Avilés ◽  
Fernando Marmolejo-Ramos

Expert knowledge elicitation (EKE) aims at obtaining individual representations of experts' beliefs and rendering them in the form of probability distributions or functions. In many cases the elicited distributions differ, and the challenge in Bayesian inference is then to find ways to reconcile discrepant elicited prior distributions. This paper proposes the parallel analysis of clusters of prior distributions through a hierarchical method for clustering distributions that can be readily extended to functional data. The proposed method consists of (i) transforming the infinite-dimensional problem into a finite-dimensional one, (ii) using the Hellinger distance to compute the distances between curves, and thus (iii) obtaining a hierarchical clustering structure. In a simulation study the proposed method was compared to k-means and agglomerative nesting algorithms, and the results showed that it outperformed both. Finally, the proposed method is illustrated through an EKE experiment and other functional data sets.
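Step (ii) of the method described above, the Hellinger distance, is simple to state. A minimal sketch, assuming each elicited prior has been discretized on a common grid and normalized to sum to one:

```python
import math

def hellinger(p, q):
    # Hellinger distance between two discretized densities on the
    # same grid; ranges from 0 (identical) to 1 (disjoint support).
    return math.sqrt(0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2
                               for a, b in zip(p, q)))
```

The resulting pairwise distance matrix can then be fed to any standard agglomerative clustering routine to obtain the hierarchical structure of step (iii).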


2021 ◽  
Vol 5 (1) ◽  
pp. 10
Author(s):  
Mark Levene

A bootstrap-based hypothesis test of the goodness-of-fit for the marginal distribution of a time series is presented. Two metrics, the empirical survival Jensen–Shannon divergence (ESJS) and the Kolmogorov–Smirnov two-sample test statistic (KS2), are compared on four data sets—three stablecoin time series and a Bitcoin time series. We demonstrate that, after applying first-order differencing, all the data sets fit heavy-tailed α-stable distributions with 1<α<2 at the 95% confidence level. Moreover, ESJS is more powerful than KS2 on these data sets, since the widths of the derived confidence intervals for KS2 are, proportionately, much larger than those of ESJS.
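Of the two metrics compared above, KS2 has a particularly compact form: the largest vertical gap between the two empirical CDFs. A minimal stdlib sketch (the paper's bootstrap wrapper and the ESJS metric are not reproduced here):

```python
import bisect

def ks2_statistic(x, y):
    # Two-sample Kolmogorov-Smirnov statistic: the maximum gap
    # between the empirical CDFs of samples x and y.
    xs, ys = sorted(x), sorted(y)
    d = 0.0
    for v in set(xs) | set(ys):
        fx = bisect.bisect_right(xs, v) / len(xs)
        fy = bisect.bisect_right(ys, v) / len(ys)
        d = max(d, abs(fx - fy))
    return d
```

Bootstrapping this statistic over resampled pairs yields the confidence intervals whose widths the study compares against those of the ESJS.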


2017 ◽  
Vol 26 (11) ◽  
pp. 1750124 ◽  
Author(s):  
E. Ebrahimi ◽  
H. Golchin ◽  
A. Mehrabi ◽  
S. M. S. Movahed

In this paper, we investigate the ghost dark energy model in the presence of a nonlinear interaction between dark energy and dark matter. We also extend the analysis to the so-called generalized ghost dark energy (GGDE), for which [Formula: see text]. The model contains three free parameters, [Formula: see text] and [Formula: see text] (the coupling coefficient of interactions). We propose three kinds of nonlinear interaction terms and discuss the behavior of the equation of state, deceleration and dark energy density parameters of the model. We also find the squared sound speed and search for signs of stability of the model. To compare the interacting GGDE model with observational data sets, we use recent observational outcomes, namely SNIa from the JLA catalog, the Hubble parameter, baryonic acoustic oscillations and the most relevant CMB parameters, including the positions of the acoustic peaks, shift parameters and the redshift to recombination. For GGDE with the first nonlinear interaction, the joint analysis indicates that [Formula: see text], [Formula: see text] and [Formula: see text] at 1 optimal variance error. For the second interaction, the best-fit values at [Formula: see text] confidence are [Formula: see text], [Formula: see text] and [Formula: see text]. According to the combination of all observational data sets considered in this paper, the best-fit values for the third nonlinearly interacting model are [Formula: see text], [Formula: see text] and [Formula: see text] at the [Formula: see text] confidence interval. Finally, we find that the presence of interaction in the mentioned models is compatible with current observational data sets.


2014 ◽  
Vol 6 (2) ◽  
pp. 211-233
Author(s):  
Thomas M. Bayer ◽  
John Page

Purpose – This paper aims to analyze the evolution of the marketing of paintings and related visual products from its nascent stages in England around 1700 to the development of the modern art market by 1900, with a brief discussion connecting to the present.

Design/methodology/approach – Sources consist of a mixture of primary and secondary materials, as well as a series of econometric and statistical analyses of specifically constructed and unique data sets that list more than 50,000 individual sales of paintings during this period. One set records sales of paintings at various English auction houses during the eighteenth and nineteenth centuries; the second consists of all purchases and sales of paintings recorded in the stock books of the late nineteenth-century London art dealer Arthur Tooth during the years 1870/1871. The authors interpret the data under a commoditization model, first introduced by Igor Kopytoff in 1986, which posits that markets and their participants evolve toward maximizing the efficiency of their exchange process within the prevailing exchange technology.

Findings – The authors find that artists were largely responsible for a series of innovations in the art market that replaced the prevailing direct relationship between artist and patron with a modern market in which painters produced works on speculation to be sold by enterprising middlemen to an anonymous public. In this process, artists displayed remarkable creativity and a seemingly instinctive understanding of the principles of competitive marketing, which should dispel the erroneous but persistent notion that artistic genius and business savvy are incompatible.

Research limitations/implications – A similar marketing analysis could be done of the development of the art markets of other leading countries, such as France, Italy and Holland, as well as of current developments in the art market.

Practical implications – The same process of art market development that occurred in England is now taking place in Latin America and China, and the commoditization process continues in the present, now using the Internet and worldwide art dealers.

Originality/value – This is the first article to trace the historical development of the marketing of art in all of its components: artists, dealers, artist organizations, museums, curators, art critics, the media and art historians.

