International Journal of Quality & Reliability Management
Latest Publications


TOTAL DOCUMENTS

2236
(FIVE YEARS 372)

H-INDEX

74
(FIVE YEARS 8)

Published by Emerald (MCB UP)

0265-671X

2022 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Ashutosh Shankhdhar ◽  
Pawan Kumar Verma ◽  
Prateek Agrawal ◽  
Vishu Madaan ◽  
Charu Gupta

Purpose
The aim of this paper is to explore the brain–computer interface (BCI) as a methodology for generating awareness and increasing its reliable use cases, so that an individual's quality of life can be enhanced via neuroscience and neural networks, and so that risk evaluation of certain BCI experiments can be conducted proactively.

Design/methodology/approach
This paper puts forward an efficient approach for an existing BCI device that can enhance the performance of an electroencephalography (EEG) signal classifier in a composite multiclass problem, and investigates the effects of sampling rate on feature extraction and of multiple channels on the accuracy of a complex multiclass EEG signal. A one-dimensional convolutional neural network architecture is used to further classify and improve the quality of the EEG signals, and other algorithms are applied to test their variability. The paper also discusses integrating Internet of Things multimedia technology into a custom-designed BCI network based on the widely used Message Queuing Telemetry Transport (MQTT) protocol.

Findings
At the end of the implementation stage, 98% accuracy was achieved in a binary classification problem of classifying digit and non-digit stimuli, and 36% accuracy was observed in the classification of signals resulting from stimuli of digits 0 to 9.

Originality/value
BCI, also known as the neural-control interface, is a device that helps a user reliably interact with a computer using only his/her brain activity, usually measured via EEG. An EEG machine is a quality device for observing the neural activity and electric signals generated in certain parts of the human brain, which in turn can help in studying the different core components of the human brain and how it functions, to improve the quality of human life in general.
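The classifier described above rests on one-dimensional convolution over the EEG time series. A minimal sketch of that core operation follows; the sampling rate, synthetic signal and moving-average kernel are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

def conv1d(signal, kernel, stride=1):
    """Valid-mode 1-D convolution: slide the kernel along the signal."""
    out_len = (len(signal) - len(kernel)) // stride + 1
    return np.array([
        np.dot(signal[i * stride : i * stride + len(kernel)], kernel)
        for i in range(out_len)
    ])

# Synthetic "EEG" trace: 250 Hz sampling, a 10 Hz alpha-band tone plus noise.
rng = np.random.default_rng(0)
t = np.arange(0, 1, 1 / 250)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

# A short averaging kernel stands in for one learned low-pass feature detector.
kernel = np.ones(5) / 5
features = conv1d(eeg, kernel)
print(features.shape)  # (246,)
```

In a real 1-D CNN the kernels are learned and many filters are stacked, but each filter applies exactly this sliding dot product.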


2022 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Daniel Ashagrie Tegegne ◽  
Daniel Kitaw Azene ◽  
Eshetie Berhan Atanaw

Purpose
This study aims to design a multivariate control chart that improves the applicability of the traditional Hotelling T2 chart. This new type of multivariate control chart displays sufficient information about the states and relationships of the variables in the production process. It is used to make better quality control decisions during the production process.

Design/methodology/approach
Multivariate data are collected at equal time intervals and represented by the nodes of a graph. The edges connecting the nodes represent the sequence of operation. Each node is plotted on the control chart based on its Hotelling T2 statistical distance. The changing behavior of each pair of input and output nodes is studied by a neural network. A case study from the cement industry is conducted to validate the control chart.

Findings
The finding of this paper is that the points and lines in the classic Hotelling T2 chart are effectively substituted by the nodes and edges of a graph, respectively. Nodes and edges have dimension and color and represent several attributes. As a result, this control chart displays much more information than the traditional Hotelling T2 control chart. The pattern of the plot indicates whether the process is normal or not. The effect of the sequence of operation is visible in the control chart. The frequency of occurrence of nodes is indicated by node size. The decision to change a product feature is assisted by finding the shortest path between nodes. Moreover, consecutive nodes have different behaviors, and that behavior change is recognized by the neural network.

Originality/value
Modifying the classical Hotelling T2 control chart by integrating it with concepts from graph theory and neural networks is the first of its kind.
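The statistical distance that places each node on such a chart is the Hotelling T2 statistic. A minimal sketch of computing it for bivariate process data follows; the in-control mean and covariance are simulated here, not taken from the cement case study.

```python
import numpy as np

# Hypothetical two-variable process data; each row is one observation
# taken at an equal time interval.
rng = np.random.default_rng(1)
X = rng.multivariate_normal([10.0, 5.0], [[1.0, 0.4], [0.4, 0.5]], size=50)

xbar = X.mean(axis=0)                            # in-control mean vector
S_inv = np.linalg.inv(np.cov(X, rowvar=False))   # inverse sample covariance

def hotelling_t2(x):
    """Squared statistical (Mahalanobis-type) distance of one observation."""
    d = x - xbar
    return float(d @ S_inv @ d)

t2 = np.array([hotelling_t2(x) for x in X])
print(t2.shape)  # (50,)
```

Observations whose T2 value exceeds the chart's control limit signal an out-of-control multivariate state, even when each variable looks acceptable on its own.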


2022 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Angelo Marcio Oliveira Sant’Anna

Purpose
E-waste management can reduce the relevant impact of business activity without affecting reliability, quality or performance. Statistical process monitoring is an effective way of managing the reliability and quality of devices in manufacturing processes. This paper proposes an approach for monitoring the proportion of e-waste devices based on a Beta regression model and particle swarm optimization. A statistical process monitoring scheme integrating residual useful life techniques for efficient monitoring of e-waste components or equipment was developed.

Design/methodology/approach
An approach integrating the regression method and the particle swarm optimization algorithm was developed to increase the accuracy of the regression model estimates. Control chart tools were used to monitor the proportion of e-waste devices based on fault detection of electronic devices in the manufacturing process.

Findings
The results showed that the proposed statistical process monitoring is an excellent reliability and quality scheme for monitoring the proportion of e-waste devices in a toner manufacturing process. The optimized regression model estimates showed a significant influence of the process variables, both individually for injection rate and toner treads and for the interactions between injection rate, toner treads, viscosity and density.

Originality/value
This research differs from others by providing an approach for modeling and monitoring the proportion of e-waste devices. Statistical process monitoring can be used to monitor waste product in manufacturing. Besides, a key contribution of this study is the development of different models for fault detection and the identification of any change point in the manufacturing process. The optimized model can be replicated in other electronics industries and supports satisfactory e-waste management.
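The monitored quantity is a proportion of defective devices per inspection lot. As a simplified stand-in for the paper's Beta-regression-based chart, a classical p-chart with three-sigma limits illustrates the monitoring idea; the inspection counts below are hypothetical.

```python
import math

# Hypothetical daily counts of defective (e-waste) devices out of n inspected.
n = 200
defects = [11, 9, 14, 8, 12, 10, 13, 7, 15, 11]

p_bar = sum(defects) / (n * len(defects))    # overall defect proportion
sigma = math.sqrt(p_bar * (1 - p_bar) / n)   # binomial standard error
ucl = p_bar + 3 * sigma                      # upper control limit
lcl = max(0.0, p_bar - 3 * sigma)            # lower limit, floored at zero

# Daily proportions falling outside the limits signal a process change.
out_of_control = [d / n for d in defects if not lcl <= d / n <= ucl]
print(round(p_bar, 4), round(ucl, 4), out_of_control)
```

A Beta regression model replaces the constant p_bar with a fitted mean that depends on process variables (injection rate, viscosity, density), which is what the authors optimize with particle swarm optimization.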


2022 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Ilesanmi Daniyan ◽  
Khumbulani Mpofu ◽  
Samuel Nwankwo

Purpose
The need to examine the integrity of infrastructure in the rail industry, in order to improve its reliability and reduce the chance of breakdown due to defects, has brought about the development of an inspection and diagnostic robot.

Design/methodology/approach
In this study, an inspection robot was designed for detecting cracks, corrosion, missing clips and wear on rail track facilities. The robot is designed to use infrared and ultrasonic sensors for obstacle avoidance and crack detection, two 3D profilometers for wear detection, high-resolution cameras to capture real-time images and colour sensors for corrosion detection. The cameras are placed in front of the robot, with colour sensors at each side to assist in detecting corrosion on the rail track. The image processing capability of the robot permits analysis of the type and depth of the cracks and corrosion captured on the track. The computer-aided design and modeling of the robot was carried out using SolidWorks 2018, while the simulation of the proposed system was carried out in the MATLAB R2020b environment.

Findings
The results present three frameworks: for wear, for corrosion and missing clips, and for crack detection. In addition, the design data for the development of the integrated robotic system are presented. The confusion matrix resulting from the simulation of the proposed system indicates that the system detects the presence of faults with significant sensitivity and accuracy. Hence, the work provides a design framework for detecting and analysing the presence of defects on the rail track.

Practical implications
The development and implementation of the designed robot will bring about a more proactive way to monitor rail track conditions and detect rail track defects, so that effort can be geared towards restoration before a defect becomes a major problem, thus increasing rail network capacity and availability.

Originality/value
The novelty of this work lies in the fact that the system is designed to work autonomously, avoiding obstacles and checking for cracks, missing clips, wear and corrosion in the rail tracks with a system of integrated and coordinated components.
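Sensitivity and accuracy, the two measures read from a detection system's confusion matrix, come directly from the four outcome counts; the counts below are illustrative, not the paper's simulation results.

```python
# 2x2 confusion matrix for defect detection (hypothetical counts):
# tp = faults correctly flagged, fn = faults missed,
# fp = false alarms, tn = healthy track correctly passed.
tp, fn, fp, tn = 88, 12, 5, 95

sensitivity = tp / (tp + fn)                 # share of real faults detected
accuracy = (tp + tn) / (tp + fn + fp + tn)   # share of all decisions correct
print(sensitivity, accuracy)  # 0.88 0.915
```

For safety-critical inspection, sensitivity is usually the binding metric: a missed crack (fn) costs far more than a false alarm (fp).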


2022 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Rajkumar Bhimgonda Patil ◽  
Suyog Subhash Patil ◽  
Gajanand Gupta ◽  
Anand K. Bewoor

Purpose
The purpose of this paper is to carry out a reliability analysis of a mechanical system considering its degraded states, to gain a proper understanding of system behavior and its propagation towards complete failure.

Design/methodology/approach
The reliability analysis of computerized numerical control machine tools (CNCMTs) is carried out using a multi-state system (MSS) approach that considers various degraded states rather than a binary approach. The failures of the CNCMT are classified into five states: one fully operational state, three degraded states and one failed state.

Findings
The analysis of failure data collected from the field and of tests conducted in the laboratory provided a detailed understanding of the quality of the material used in the design, its failure behavior and the capability of the manufacturing system. The present work identified that Class II (major failure) is critical from a maintainability perspective, whereas Class III (moderate failure) and Class IV (minor failure) are critical from a reliability perspective.

Research limitations/implications
This research applies to reliability data analysis of systems that consider various degraded states.

Practical implications
The MSS reliability analysis approach will help to identify the various degraded states of a system that affect performance and productivity, and also to improve system reliability, availability and performance.

Social implications
Industrial system designers have recognized that reliability and maintainability are critical design attributes. Reliability studies using the binary-state approach are insufficient and incorrect for systems with degraded failure states; such analysis can give incorrect results and increase cost. The proposed MSS approach is more suitable for complex systems such as CNCMTs than the binary-state system approach.

Originality/value
This paper presents a generalized framework for MSS failure and repair data analysis, which has been developed and applied to a CNCMT.
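A five-state structure of this kind (one operational, three degraded, one failed) can be sketched as a discrete-time Markov chain; the transition probabilities below are invented for illustration, and the paper's own model may use continuous-time rates instead. Steady-state availability is then the long-run probability of not being in the failed state.

```python
import numpy as np

# Row-stochastic transition matrix: state 0 fully operational,
# states 1-3 progressively degraded, state 4 failed (illustrative values).
P = np.array([
    [0.95, 0.04, 0.01, 0.00, 0.00],
    [0.30, 0.60, 0.07, 0.02, 0.01],
    [0.20, 0.10, 0.60, 0.08, 0.02],
    [0.10, 0.05, 0.10, 0.65, 0.10],
    [0.50, 0.00, 0.00, 0.00, 0.50],   # repair returns to full operation
])

# Steady-state distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()                        # normalize (also fixes the sign)

availability = 1.0 - pi[4]            # long-run probability of not being failed
print(np.round(pi, 4), round(availability, 4))
```

The binary-state view would collapse states 1-3 into "up", hiding exactly the degradation information the MSS approach is designed to expose.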


2022 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Ramon Swell Gomes Rodrigues Casado ◽  
Maisa Mendonca Silva ◽  
Lucio Camara Silva

Purpose
The paper aims to propose a multi-criteria model for prioritising risks associated with supply chain management involving multiple decision-makers.

Design/methodology/approach
The model integrates the composition of probabilistic preferences (CPP) with the failure mode and effects analysis (FMEA) criteria. First, the authors carried out a probabilistic transformation of the numerical evaluations of the multiple decision-makers on the FMEA criteria, regarding the internal risks that affect the clothing-pole chain in the Agreste region of Pernambuco. Then, the authors proposed the use of Kendall's concordance coefficient W to aggregate these evaluations.

Findings
Contrary to expectations, the two main risks the model suggests investigating were related to supply chain suppliers, not to raw material costs. In addition, a simulation with the traditional FMEA was carried out; comparing it with the model's result, seven consistent differences between the two rankings are worth highlighting.

Research limitations/implications
The focus was restricted to internal chain risks only.

Practical implications
The proposed model can contribute to improving decisions within the organisations that make up the chains, thus guaranteeing better quality in risk management.

Originality/value
Establishing a more effective representation of the uncertain information involved in traditional FMEA treatment with multiple decision-makers means identifying potential risks in advance, providing better supply chain control.
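Kendall's coefficient of concordance W, used above to aggregate the decision-makers' evaluations, measures how closely m rankings of n items agree (0 = no agreement, 1 = complete agreement). A minimal sketch with hypothetical, tie-free rankings:

```python
# Each row is one decision-maker's ranking of five risks (1 = highest priority).
ranks = [
    [1, 2, 3, 4, 5],
    [2, 1, 3, 5, 4],
    [1, 3, 2, 4, 5],
]
m, n = len(ranks), len(ranks[0])

totals = [sum(col) for col in zip(*ranks)]        # rank sum per risk
mean_total = m * (n + 1) / 2                      # expected rank sum
S = sum((t - mean_total) ** 2 for t in totals)    # deviation sum of squares
W = 12 * S / (m ** 2 * (n ** 3 - n))              # Kendall's W, no-ties formula
print(round(W, 4))  # 0.8444
```

A high W justifies aggregating the individual evaluations into a single prioritisation; a W near zero would mean the decision-makers disagree too much for a pooled ranking to be meaningful.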


2022 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Shyamkumar D. Kalpande ◽  
Lalit K. Toke

Purpose
This paper deals with the concept of total productive maintenance (TPM) and its implementation approach. It also presents the identification of critical factors for effective implementation of TPM. The reliability analysis identified potential areas where more concentration is required. The application of hypothesis testing in productive maintenance should be promoted through parametric tests, which are significantly instrumental in explaining phenomena. It is also indispensable for better understanding quality data and providing guidance to production control.

Design/methodology/approach
The various critical success factors (CSFs) of TPM implementation have been organised into a set of eight performance measures and thirty-three sub-factors to capture the in-depth details of each indicator. The paper identifies the reliability of these factors and examines the problem with greater clarity, along with its ramifications. The researchers collected responses from forty-one manufacturing organisations through a structured questionnaire. The reliability analysis was carried out by calculating Cronbach's alpha. To draw meaningful conclusions supported by relevant empirical data, provisional formulation is required, and this was carried out by hypothesis testing. In this test, samples are taken from a population with a known (normal) distribution, and a test of population parameters is executed. It determines the relevancy of facts and directs the researchers' efforts into productive channels. The statements were tested by calculating the Chi-square (χ2) value, and MINITAB-19 software was used to identify the p-value.

Findings
This study identified the main factors and sub-factors of TPM that are critical for its implementation. The study also avoids the complexities involved in implementing TPM through reliability analysis. It is found that all identified CSFs are reliable, as Cronbach's alpha is above 0.6. The hypothesis testing shows that all alternative hypothesis statements are acceptable, as the Chi-square (χ2) values satisfied the conditions and the null hypotheses are rejected, with calculated p-values less than 0.05 for the eight identified TPM critical factors.

Originality/value
In this paper the researchers provide a comprehensive typology of TPM CSFs, and their ranking and importance in the manufacturing sector. The manufacturing sector is becoming a major sourcing base for the world, and there is a paucity of such studies on TPM implementation. Such studies are equally important in a global context.
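Cronbach's alpha, used above for the reliability analysis, compares the sum of the individual item variances to the variance of the total score. A minimal sketch on hypothetical Likert-scale responses (the 0.6 acceptance threshold is the one the study applies):

```python
import statistics

# Hypothetical questionnaire data: one row per respondent, one column per item.
items = [
    [4, 3, 4, 5],
    [5, 4, 4, 5],
    [3, 3, 2, 3],
    [4, 4, 3, 4],
    [2, 2, 3, 2],
]
k = len(items[0])                                     # number of items
item_vars = [statistics.variance(col) for col in zip(*items)]
total_var = statistics.variance([sum(row) for row in items])

# Items that move together make total_var large relative to sum(item_vars),
# pushing alpha towards 1 (high internal consistency).
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(round(alpha, 3))
```

An alpha above 0.6 indicates the sub-factors of a performance measure hang together well enough to be treated as one reliable construct.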


2022 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Abbas Bin Jibril ◽  
V.V. Singh ◽  
Dilip Kumar Rawal

Purpose
The purpose of this paper is to study the reliability of a system comprising three subsystems in a series configuration, in which all three subsystems function under a k-out-of-n: G operational scheme. The supplementary variable approach, with implications of the copula distribution, has been employed for assessing system performance. Based on the computed results, it has been demonstrated that copula repair yields better system performance than general repair.

Design/methodology/approach
A probabilistic assessment of a complex system consisting of three subsystems, with multi-failure threats and a copula repair approach, is used in this study.

Findings
In this analysis, four cases of availability are analysed for the Gumbel–Hougaard family copula, and four cases of general repair with similar failure rates are also studied. The authors found that when failure rates increase, system availability decreases, and that when the system follows the copula repair distribution, system availability is better than under general repair.

Research limitations/implications
This research may be implemented in various industrial systems where the subsystems are configured under a k-out-of-n: G working policy. It is also advisable that copula repair be used for the best performance from the system. On the basis of mean time to system failure (MTSF) computations, the failure rate that most affects system failure needs to be controlled through monitoring, servicing and replacement strategies.

Practical implications
This research has great implications for various industrial systems, such as power plants, nuclear power plants and electricity distribution systems, where the k-out-of-n type of operational scheme is validated for system operation with multi-repair.

Originality/value
This is a new work by the authors. In previously available technical analyses, researchers have analyzed repairable systems using either the supplementary variable approach alone or systems with two subsystems in a series configuration. This research analyzes a system with three subsystems using a multi-repair approach and supplementary variables.
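The availability of a k-out-of-n: G subsystem with independent, identical units follows directly from the binomial distribution: the subsystem is up when at least k of its n units are up. A minimal sketch (the unit availability 0.9 and the 2-out-of-3 configuration are illustrative):

```python
from math import comb

def k_out_of_n_availability(k, n, p):
    """Probability that at least k of n i.i.d. units are up (k-out-of-n:G)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# A 2-out-of-3:G subsystem with unit availability 0.9:
a = k_out_of_n_availability(2, 3, 0.9)
print(round(a, 4))  # 0.972

# Three such subsystems in series multiply their availabilities.
system = a ** 3
print(round(system, 4))
```

This independence-based formula is the baseline the paper improves on: copula-based repair models capture dependence between failure and repair processes that the simple binomial treatment ignores.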


2022 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Satish Kumar ◽  
Tushar Kolekar ◽  
Ketan Kotecha ◽  
Shruti Patil ◽  
Arunkumar Bongale

Purpose
Excessive tool wear is responsible for damage or breakage of the tool, workpiece or machining center. Thus, it is crucial to examine tool conditions during the machining process to improve the tool's useful functional life and the surface quality of the final product. AI-based tool wear prediction techniques have proven effective in estimating the remaining useful life (RUL) of the cutting tool. However, the model prediction needs improvement in terms of accuracy.

Design/methodology/approach
This paper presents a methodology that fuses a feature selection technique with state-of-the-art deep learning models. The authors used NASA milling data sets along with vibration signals for tool wear prediction and performance analysis in 15 different fault scenarios. Multiple steps are used for feature selection and ranking. Different long short-term memory (LSTM) approaches are used to improve the overall prediction accuracy of the model for tool wear prediction. The LSTM models' performance is evaluated using the R-square, mean absolute error (MAE), root mean square error (RMSE) and mean absolute percentage error (MAPE) metrics.

Findings
The R-square accuracy of the hybrid model is consistently high, with low MAE, MAPE and RMSE values. The average R-square scores for the LSTM, Bidirectional, Encoder–Decoder and Hybrid LSTM models are 80.43, 84.74, 94.20 and 97.85%, respectively, and the corresponding average MAPE values are 23.46, 22.200, 9.5739 and 6.2124%. The hybrid model shows high accuracy compared to the remaining LSTM models.

Originality/value
The low variance, Spearman correlation coefficient and random forest regression methods are used to select the most significant feature vectors for training the various LSTM model versions and to highlight the best approach. The selected features are passed to different LSTM models, namely Bidirectional, Encoder–Decoder and Hybrid LSTM, for tool wear prediction. The Hybrid LSTM approach shows a significant improvement in tool wear prediction.
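The four evaluation metrics used to compare the LSTM variants can be computed directly from true and predicted wear values; the numbers below are hypothetical, not the paper's data.

```python
import math

# Hypothetical true vs. predicted tool-wear values (arbitrary wear units).
y_true = [0.10, 0.25, 0.40, 0.55, 0.70]
y_pred = [0.12, 0.22, 0.43, 0.50, 0.74]

n = len(y_true)
errors = [t - p for t, p in zip(y_true, y_pred)]

mae = sum(abs(e) for e in errors) / n                       # mean absolute error
rmse = math.sqrt(sum(e * e for e in errors) / n)            # root mean square error
mape = 100 * sum(abs(e) / t for e, t in zip(errors, y_true)) / n  # percent error
mean_t = sum(y_true) / n
r2 = 1 - sum(e * e for e in errors) / sum((t - mean_t) ** 2 for t in y_true)

print(round(mae, 4), round(rmse, 4), round(mape, 2), round(r2, 4))
```

RMSE penalises large errors more heavily than MAE, while MAPE is scale-free but blows up near zero wear, which is why the paper reports all four alongside R-square.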


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Jiju Antony ◽  
Michael Sony ◽  
Olivia McDermott ◽  
Raja Jayaraman ◽  
David Flynn

Purpose
Quality 4.0 incorporates the role of automation and digitization and provides a competitive advantage for organizations by enhancing customer experience and increasing profitability. The purpose of this study is to critically examine the organizational readiness factors for the successful implementation of Quality 4.0 and to assess their importance.

Design/methodology/approach
This study applies a quantitative research methodology to examine the readiness factors for Quality 4.0 in organizations; 147 senior management professionals from various organizations, including manufacturing and service companies in America, Asia and Europe, participated through an online survey.

Findings
The readiness factors for Quality 4.0 were critically ranked by senior management professionals from manufacturing and service organizations across three continents. Five significant reasons for non-adoption of Quality 4.0 were: lack of resources; inability to link Quality 4.0 with corporate strategy and objectives; lack of understanding of the benefits; high initial investment; and the perception that current quality management strategies and methods already deliver good results, leaving organizations unsure of the need for Quality 4.0. The handling of big data in quality management was the most important factor for adopting Quality 4.0, irrespective of the size and nature of the organization. Greater accuracy, fewer errors and improved decision-making, which were factors for adopting Quality 4.0 in the service sector, were not significant for the manufacturing sector. Small and medium-sized enterprises (SMEs) reported that cost and time savings over the long run were not so significant.

Practical implications
This study focusses on the significance of the pros and cons of adopting Quality 4.0 in organizations. Senior managers in both large enterprises and SMEs can benefit immensely from this understanding before investing heavily in implementing Quality 4.0. The identified organizational readiness factors can be used as indicators of how ready an organization is to implement Quality 4.0. The top three readiness factors for successful adoption were identified as top management commitment, leadership and organizational culture. An improved understanding of the readiness factors can be highly beneficial to senior quality professionals in both manufacturing and service companies on the journey towards successful implementation of Quality 4.0.

Originality/value
This is the first empirical study assessing Quality 4.0 readiness factors at an intercontinental level and therefore serves as a foundation for many future studies. The study provides a theoretical foundation for Quality 4.0 in terms of organizational readiness for successful adoption and overcoming implementation challenges. Organizations should review the readiness factors while planning and resourcing a Quality 4.0 implementation strategy to ensure effective performance.

