Camera-Based In-Process Quality Measurement of Hairpin Welding

2021 ◽  
Vol 11 (21) ◽  
pp. 10375
Author(s):  
Julia Hartung ◽  
Andreas Jahn ◽  
Oliver Bocksrocker ◽  
Michael Heizmann

The technology of hairpin welding, which is frequently used in the automotive industry, imposes high quality requirements on the welding process. If a non-functioning stator is detected during final inspection, it can be difficult to trace the defect back to the affected weld. Often, a visual assessment of a cooled weld seam does not provide any information about its strength. However, conclusions about the quality of the weld can be drawn from the behavior during welding, especially from spatter formation. In addition, spatter on the component can have serious consequences. In this paper, we present in-process monitoring of laser-based hairpin welding. Using in-process images analyzed by a neural network, we present a spatter detection method that allows conclusions to be drawn about the quality of the weld. In this way, faults caused by spattering can be detected at an early stage and the affected components sorted out. The implementation is based on a small data set and designed for fast processing times on hardware with limited computing power. With a network architecture that uses dilated convolutions, we obtain a large receptive field and can therefore take feature interrelations in the image into account. The result is a pixel-wise classifier that allows us to infer the spatter areas directly on the production line.
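The receptive-field argument for dilated convolutions can be illustrated with a short calculation. This is not the authors' network; it is a minimal sketch that only computes the receptive field of a stack of stride-1 convolutions, where a layer with kernel size k and dilation d adds (k − 1)·d pixels:

```python
def receptive_field(kernel_sizes, dilations):
    """Receptive field of stacked stride-1 convolutions.

    For stride 1, rf = 1 + sum((k - 1) * d) over the layers, so
    exponentially growing dilations give a large receptive field
    with few layers and parameters.
    """
    rf = 1
    for k, d in zip(kernel_sizes, dilations):
        rf += (k - 1) * d
    return rf


# Three 3x3 layers with dilations 1, 2, 4 see a 15-pixel-wide context,
# whereas the same undilated stack sees only 7 pixels.
print(receptive_field([3, 3, 3], [1, 2, 4]))  # 15
print(receptive_field([3, 3, 3], [1, 1, 1]))  # 7
```

This is why dilated stacks are attractive on hardware with limited computing power: context grows exponentially with depth while the parameter count grows only linearly.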


1995 ◽  
Vol 7 (1) ◽  
pp. 86-107 ◽  
Author(s):  
G. Deco ◽  
W. Finnoff ◽  
H. G. Zimmermann

Controlling the network complexity in order to prevent overfitting is one of the major problems encountered when using neural network models to extract structure from small data sets. In this paper we present a network architecture designed for use with a cost function that includes a novel complexity penalty term. In this architecture the outputs of the hidden units are strictly positive and sum to one, and each output is interpreted as the probability that the actual input belongs to a certain class formed during learning. The penalty term expresses the mutual information between the inputs and the extracted classes. This measure effectively describes the network complexity with respect to the given data in an unsupervised fashion. The efficiency of this architecture and penalty term, combined with backpropagation training, is demonstrated on a real-world economic time series forecasting problem. The model was also applied to the benchmark sunspot data and to a synthetic data set from the statistics community.
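The penalty term can be sketched with a standard identity: when each input induces a probability distribution over classes (the hidden-unit outputs), the mutual information between inputs and classes equals the entropy of the average class distribution minus the average per-input entropy. This is an illustrative reconstruction, not the authors' code:

```python
import math


def entropy(p):
    """Shannon entropy in nats; 0 * log 0 is treated as 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)


def mutual_information_penalty(class_probs):
    """I(X; C) for per-input class distributions p(c|x).

    class_probs: list of per-input probability vectors (each sums to 1).
    I(X; C) = H(mean distribution) - mean per-input entropy. It is large
    when inputs are split into many confident classes, so penalizing it
    limits how much class structure the network extracts from the data.
    """
    n = len(class_probs)
    k = len(class_probs[0])
    marginal = [sum(p[c] for p in class_probs) / n for c in range(k)]
    return entropy(marginal) - sum(entropy(p) for p in class_probs) / n
```

Two inputs assigned confidently to two different classes give I = ln 2 (maximal structure extracted); identical uniform distributions for every input give I = 0 (no structure, no complexity penalty).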



2020 ◽  
Author(s):  
Ning Wang ◽  
Vishal Peddagangireddy ◽  
Fan Luo ◽  
K.P. Subbalakshmi ◽  
R. Chandramouli

Alzheimer’s disease (AD)-related global healthcare cost is estimated to reach $1 trillion by 2050. Currently, there is no cure for this disease; however, clinical studies show that early diagnosis and intervention help to extend the quality of life and inform technologies for personalized mental healthcare. Clinical research indicates that the onset and progression of Alzheimer’s disease lead to dementia and other mental health issues, and as a result the language capabilities of patients start to decline. In this paper, we show that machine learning-based unsupervised clustering of, and anomaly detection with, linguistic biomarkers are promising approaches for intuitive visualization and personalized early-stage detection of Alzheimer’s disease. We demonstrate this approach on 10 years (1980 to 1989) of President Ronald Reagan’s speech data. Key linguistic biomarkers that indicate early-stage AD are identified. Experimental results show that Reagan had early onset of Alzheimer’s sometime between 1983 and 1987. This finding is corroborated by prior work that analyzed his interviews using a statistical technique. The proposed technique also identifies the exact speeches that reflect linguistic biomarkers for early-stage AD.
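As a generic illustration of the anomaly-detection idea (not the paper's actual biomarkers or model), one could score each speech with a simple lexical-diversity measure and flag speeches that deviate strongly from the speaker's own norm; both the type-token ratio and the z-score threshold below are hypothetical stand-ins:

```python
import statistics


def type_token_ratio(text):
    """Lexical diversity: unique words / total words.

    A declining ratio over time is one commonly cited linguistic
    marker of cognitive decline; used here purely as an example.
    """
    words = text.lower().split()
    return len(set(words)) / len(words)


def flag_anomalies(scores, z_thresh=2.0):
    """Indices of speeches whose biomarker deviates > z_thresh std devs."""
    mu = statistics.mean(scores)
    sd = statistics.pstdev(scores)
    if sd == 0:
        return []
    return [i for i, s in enumerate(scores) if abs(s - mu) / sd > z_thresh]


# A run of stable scores with one sharp drop: the outlier speech is flagged.
scores = [0.5] * 9 + [0.1]
print(flag_anomalies(scores))  # [9]
```

The paper's pipeline is richer (multiple biomarkers, unsupervised clustering, visualization), but the flagged-outlier idea above is the core of identifying the exact speeches that reflect early-stage markers.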



2020 ◽  
pp. 12-18
Author(s):  
F.A. Urazbahtin ◽  
A.YU. Urazbahtina

A multifactor mathematical model of the welding process for products made from aluminum-magnesium alloys is presented, consisting of 71 indicators that assess the quality of the weld, the welding process, costs, equipment operation, and the quality of the welded material. The model can be used to control and optimize the welding process for such products. Keywords: welding, products, aluminum-magnesium alloy, indicators, process parameters, welding equipment, welding materials, electrode sharpening, lining.



Author(s):  
Varun Sapra ◽  
M.L Saini ◽  
Luxmi Verma

Background: Cardiovascular diseases are increasing at an alarming rate, with a very high rate of mortality. Coronary artery disease is one type of cardiovascular disease that is not easily diagnosed in its early stage. Prevention of coronary artery disease is possible only if it is diagnosed at an early stage and proper medication is given. Objective: An effective diagnosis model is important not only for early diagnosis but also for assessing the severity of the disease. Method: In this paper, a hybrid approach is followed that integrates deep learning (a multi-layer perceptron) with case-based reasoning to design an analytical framework. The study comprises two phases: in the first, the patient is diagnosed for coronary artery disease; in the second, if the patient is suffering from the disease, case-based reasoning is employed to diagnose its severity. In the first phase, a multilayer perceptron is implemented on a reduced dataset with time-based learning for stochastic gradient descent. Results: The classification accuracy is increased by 4.18% on the reduced data set using a deep neural network with time-based learning. In the second phase, if the patient is diagnosed as positive for coronary artery disease, the case-based reasoning system is triggered to retrieve from the case base the case most similar to the patient, in order to predict severity. The CBR model achieved 97.3% accuracy. Conclusion: The model can be very useful for medical practitioners as a decision support system, and it can save patients from unnecessary expenses on costly tests and improve the quality and effectiveness of medical treatment.
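The retrieval step of case-based reasoning can be sketched as a nearest-neighbor lookup over the case base. The feature vectors, severity labels, and Euclidean similarity below are hypothetical stand-ins; the paper's actual features and similarity measure are not reproduced here:

```python
import math


def retrieve_most_similar(case_base, query):
    """CBR retrieval: return the severity of the nearest stored case.

    case_base: list of (feature_vector, severity_label) pairs.
    query: feature vector of the newly diagnosed patient.
    Similarity is plain Euclidean distance over (assumed normalized)
    clinical features.
    """
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

    best_features, best_severity = min(
        case_base, key=lambda case: dist(case[0], query)
    )
    return best_severity


# Hypothetical two-case base: the query patient resembles the mild case.
cases = [([1.0, 0.0], "mild"), ([5.0, 5.0], "severe")]
print(retrieve_most_similar(cases, [0.9, 0.2]))  # mild
```

In the described pipeline, this retrieval would run only for patients the first-phase neural network classifies as positive for coronary artery disease.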



2012 ◽  
Vol 197 ◽  
pp. 271-277
Author(s):  
Zhu Ping Gong

The small data set approach is used for the estimation of the largest Lyapunov exponent (LLE). First, the mean-period drawback of the small data set method was corrected. On this basis, the LLEs of the daily qualified-rate time series of HZ, an electronic manufacturing enterprise, were estimated; all LLEs were positive, which indicates that the time series is chaotic and the corresponding production process is a chaotic process. The variance of the LLEs revealed the struggle between the divergent nature of the quality system and the quality control effort. The LLEs showed a sharp increase as the quality level worsened, coinciding with the company shutdown. HZ's daily qualified rate, a chaotic time series, shows the predictable nature of the quality system in the short run.
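The small data set method for the LLE can be sketched as follows: embed the series, pair each point with its nearest neighbor separated in time by more than the mean period, then fit the slope of the average log-divergence curve. This is a generic illustration on the logistic map, not the paper's corrected variant; the embedding and fit parameters are assumptions:

```python
import math


def small_dataset_lle(x, m=3, tau=1, mean_period=10, fit_len=8):
    """Rough small-data-set LLE estimate (Rosenstein-style sketch)."""
    n = len(x) - (m - 1) * tau
    Y = [[x[i + j * tau] for j in range(m)] for i in range(n)]

    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

    # Nearest neighbor of each point, excluding temporally close points
    # (closer than the mean period), to avoid pairing along one orbit.
    nn = []
    for i in range(n):
        best, best_d = None, float("inf")
        for j in range(n):
            if abs(i - j) > mean_period:
                d = dist(Y[i], Y[j])
                if 0 < d < best_d:
                    best, best_d = j, d
        nn.append(best)

    # Average log distance between each pair after k steps.
    divs = []
    for k in range(fit_len):
        logs = []
        for i in range(n):
            j = nn[i]
            if j is not None and i + k < n and j + k < n:
                d = dist(Y[i + k], Y[j + k])
                if d > 0:
                    logs.append(math.log(d))
        divs.append(sum(logs) / len(logs))

    # LLE estimate = least-squares slope of the divergence curve.
    ks = range(fit_len)
    kbar = sum(ks) / fit_len
    dbar = sum(divs) / fit_len
    return (sum((k - kbar) * (d - dbar) for k, d in zip(ks, divs))
            / sum((k - kbar) ** 2 for k in ks))
```

On a chaotic series such as the logistic map x ← 4x(1 − x), the estimate comes out positive, which is the criterion the abstract uses to call the qualified-rate series chaotic.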



2021 ◽  
pp. 100157
Author(s):  
Somaresh Kumar Mondal ◽  
Abdul Gaffar Khan ◽  
Md. Mamun Ali ◽  
Mir Kaosar Ahamed ◽  
Kawsar Ahmed


Sensors ◽  
2021 ◽  
Vol 21 (3) ◽  
pp. 863
Author(s):  
Vidas Raudonis ◽  
Agne Paulauskaite-Taraseviciene ◽  
Kristina Sutiene

Background: Cell detection and counting is of essential importance in evaluating the quality of early-stage embryos. Full automation of this process remains a challenging task due to differences in cell size and shape, the presence of incomplete cell boundaries, and partially or fully overlapping cells. Moreover, the algorithm to be developed should process a large number of images of varying quality in a reasonable amount of time. Methods: A multi-focus image fusion approach based on the deep learning U-Net architecture is proposed in the paper, which allows reducing the amount of data by up to a factor of 7 without losing the spectral information required for embryo enhancement in the microscopic image. Results: The experiment includes visual and quantitative analysis, estimating image similarity metrics and processing times, compared with the results achieved by two well-known techniques: Inverse Laplacian Pyramid Transform and Enhanced Correlation Coefficient Maximization. Conclusion: The image fusion time is substantially improved for different image resolutions, whilst ensuring the high quality of the fused image.
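As a point of reference only (not the paper's U-Net approach), a classical multi-focus fusion baseline picks, for every pixel, the value from the stack image that is locally sharpest, using e.g. the Laplacian magnitude as the focus measure:

```python
def fuse_max_sharpness(imgs):
    """Naive multi-focus fusion of grayscale images (2-D lists).

    For each pixel, copy the value from the image whose local
    4-neighbor Laplacian magnitude (a simple focus measure) is
    largest, i.e. the image that is in focus at that location.
    """
    h, w = len(imgs[0]), len(imgs[0][0])

    def laplacian_mag(img, y, x):
        c = img[y][x]
        s = 0.0
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            yy = min(max(y + dy, 0), h - 1)  # clamp at borders
            xx = min(max(x + dx, 0), w - 1)
            s += img[yy][xx] - c
        return abs(s)

    return [[max(imgs, key=lambda im: laplacian_mag(im, y, x))[y][x]
             for x in range(w)]
            for y in range(h)]
```

Against such per-pixel baselines, the learned U-Net fusion in the paper trades this handcrafted focus measure for features learned from the embryo image stacks.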


