Tool wear predicting based on multi-domain feature fusion by deep convolutional neural network in milling operations

2019 ◽  
Vol 31 (4) ◽  
pp. 953-966 ◽  
Author(s):  
Zhiwen Huang ◽  
Jianmin Zhu ◽  
Jingtao Lei ◽  
Xiaoru Li ◽  
Fengqing Tian
2021 ◽  
Vol 2021 ◽  
pp. 1-14
Author(s):  
Zhiwen Huang ◽  
Jianmin Zhu ◽  
Jingtao Lei ◽  
Xiaoru Li ◽  
Fengqing Tian

Tool wear monitoring is essential in precision manufacturing to improve surface quality, increase machining efficiency, and reduce manufacturing cost. Although tool wear can be reflected by measurable signals in automatic machining operations, as the volume of collected data grows, features must be manually extracted and optimized, which lowers monitoring efficiency and increases prediction error. To address these problems, this paper proposes a tool wear monitoring method for milling operations that applies the short-time Fourier transform (STFT) and a deep convolutional neural network (DCNN) to vibration signals. First, an image representation of the acquired vibration signals is obtained with the STFT; a DCNN model is then designed to establish the relationship between the resulting time-frequency maps and tool wear, performing adaptive feature extraction and automatic tool wear prediction. The method is demonstrated on three tool wear experimental datasets collected from a three-flute ball-nose tungsten carbide cutter on a high-speed CNC machine under dry milling. The experimental results show that the proposed method is more accurate and more reliable than the compared methods.
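The first stage of this pipeline, turning a vibration signal into an STFT time-frequency map that a CNN can consume as an image, can be sketched as follows. This is a minimal pure-Python illustration; the frame length, hop size, Hann window, and toy 50 Hz test signal are illustrative assumptions, not values from the paper:

```python
import math, cmath

def stft_magnitude(signal, frame_len=64, hop=32):
    """Hann-windowed STFT magnitude map: one row per frame, one column per frequency bin."""
    window = [0.5 - 0.5 * math.cos(2 * math.pi * n / (frame_len - 1))
              for n in range(frame_len)]
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        seg = [signal[start + n] * window[n] for n in range(frame_len)]
        # one-sided DFT magnitudes (frequency axis of the time-frequency image)
        row = []
        for k in range(frame_len // 2 + 1):
            acc = sum(seg[n] * cmath.exp(-2j * math.pi * k * n / frame_len)
                      for n in range(frame_len))
            row.append(abs(acc))
        frames.append(row)
    return frames

# toy vibration signal: a 50 Hz tone sampled at 1 kHz
fs = 1000
sig = [math.sin(2 * math.pi * 50 * t / fs) for t in range(512)]
tfmap = stft_magnitude(sig)
# with fs / frame_len = 15.625 Hz per bin, the energy concentrates near bin 3 (~47 Hz)
peak_bin = max(range(len(tfmap[0])), key=lambda k: tfmap[0][k])
```

In practice the magnitude map would be converted to a fixed-size image (e.g. log-scaled and resized) before being fed to the DCNN; a library routine such as `scipy.signal.stft` would replace the naive DFT loop.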


2021 ◽  
Author(s):  
Guofa Li ◽  
Yanbo Wang ◽  
Jialong He ◽  
Yongchao Huo

Abstract Tool wear during machining strongly influences the quality of the machined surface and its dimensional accuracy, so tool wear monitoring is extremely important for improving machining efficiency and workpiece quality. Multidomain features (time domain, frequency domain, and time-frequency domain) can accurately characterize the degree of tool wear, but manual feature fusion is time consuming and limits monitoring accuracy. To solve these problems, a new tool wear prediction method based on multidomain feature fusion by an attention-based depth-wise separable convolutional neural network is proposed. In this method, multidomain features of cutting force and vibration signals are extracted and recombined into feature tensors. The proposed hypercomplex position encoding and high-dimensional self-attention mechanism compute a new representation of the input feature tensor that emphasizes wear-sensitive information and suppresses large-area background noise. The designed depth-wise separable convolutional neural network adaptively extracts high-level features that characterize tool wear from this new representation, and the tool wear is predicted automatically. The method is verified on three run-to-failure data sets of a three-flute ball-nose cemented carbide tool in a machining centre. Experimental results show that the prediction accuracy of the proposed method is markedly higher than that of other state-of-the-art methods; it can therefore improve prediction accuracy and provide effective guidance for decision making in machining.
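The feature-tensor construction described above can be illustrated with a minimal sketch: a few per-channel time-domain and frequency-domain statistics are computed and stacked into a channels-by-features tensor. The specific features chosen here (RMS, kurtosis, peak value, spectral centroid, peak magnitude) and the synthetic force/vibration channels are assumptions for illustration, not the paper's exact feature set:

```python
import math

def time_features(x):
    """Time-domain statistics commonly used in tool wear monitoring."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    rms = math.sqrt(sum(v * v for v in x) / n)
    kurt = (sum((v - mean) ** 4 for v in x) / n) / (var ** 2) if var else 0.0
    return [rms, kurt, max(abs(v) for v in x)]

def freq_features(x):
    """Frequency-domain features from a naive DFT magnitude spectrum."""
    n = len(x)
    mags = []
    for k in range(n // 2 + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    total = sum(mags) or 1.0
    centroid = sum(k * m for k, m in enumerate(mags)) / total
    return [centroid, max(mags)]

# stack per-channel features (e.g. one force and one vibration channel)
# into a feature tensor: rows = channels, columns = features
force = [math.sin(0.3 * t) for t in range(128)]
vibration = [math.sin(0.8 * t) + 0.1 for t in range(128)]
tensor = [time_features(c) + freq_features(c) for c in (force, vibration)]
```

In the paper this tensor is what the position encoding and self-attention operate on before the depth-wise separable convolutions; time-frequency-domain features (e.g. wavelet energies) would be appended in the same way.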


2020 ◽  
Vol 2020 (4) ◽  
pp. 4-14
Author(s):  
Vladimir Budak ◽  
Ekaterina Ilyina

The article proposes a classification of lenses with different symmetrical beam angles and offers a scale as a spotlight palette. A collection of spotlight images was created and classified according to the proposed scale. An analysis of 788 existing lenses and reflectors with different LEDs and COBs was carried out, and the dependence of the axial luminous intensity on the beam angle was obtained. Transfer learning of a new deep convolutional neural network (CNN) based on the pre-trained GoogLeNet was performed using this collection. Grad-CAM analysis showed that the trained network correctly identifies the features of the objects. This work makes it possible to classify arbitrary spotlights with an accuracy of about 80 %. Thus, a lighting designer can determine the class of a spotlight and the corresponding type of lens with its technical parameters using this new CNN-based model.
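Transfer learning of this kind keeps the pretrained backbone frozen and trains only a new classification head. The sketch below is a pure-Python stand-in, assuming nothing from the article beyond the general setup: a fixed (untrained) feature extractor plays the role of GoogLeNet's convolutional base, and a logistic head is trained on synthetic narrow- vs wide-beam data in place of the spotlight image collection:

```python
import math, random

def frozen_features(image):
    """Stand-in for a pretrained backbone: fixed projections, never updated."""
    return [sum(image) / len(image), max(image) - min(image)]

def train_head(samples, labels, lr=0.5, epochs=200):
    """Train only the new classification head (logistic regression via SGD)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = w[0] * x[0] + w[1] * x[1] + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y  # gradient of the log loss w.r.t. z
            w[0] -= lr * g * x[0]
            w[1] -= lr * g * x[1]
            b -= lr * g
    return w, b

random.seed(0)
# two synthetic "spotlight" classes: narrow beams (dim, low spread) vs wide beams
narrow = [[random.gauss(0.2, 0.05) for _ in range(16)] for _ in range(20)]
wide = [[random.gauss(0.8, 0.3) for _ in range(16)] for _ in range(20)]
feats = [frozen_features(img) for img in narrow + wide]
labels = [0] * 20 + [1] * 20
w, b = train_head(feats, labels)

def predict(image):
    x = frozen_features(image)
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

acc = sum(predict(img) == y for img, y in zip(narrow + wide, labels)) / 40
```

With a real framework the same pattern is: load the pretrained model, set `requires_grad=False` (or equivalent) on the backbone, replace the final layer with one sized to the new number of classes, and train only that layer.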

