Fully Convolutional Neural Network With GRU for 3D Braided Composite Material Flaw Detection

IEEE Access, 2019, Vol 7, pp. 151180-151188
Author(s): Yongmin Guo, Zhitao Xiao, Lei Geng, Jun Wu, Fang Zhang, ...
2010, Vol 146-147, pp. 394-399
Author(s): Xiao Li He, Chong Liu

A method for recognizing the inner defects of 3-D braided composite materials is discussed. First, the link between ultrasonic testing (UT) signals and the defects of 3-D braided composite material is analyzed. Then, the wavelet packet transform is used to process the ultrasonic scanning pulse signals of the defects. The characteristic quantities of the signals are extracted and fed into a BP neural network as training samples. By training the BP neural network, a recognition system for micro-cracks and pores is obtained. Finally, the experimental results show that this classification system based on the wavelet packet transform is feasible.
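The feature-extraction step described above can be sketched numerically: a wavelet packet decomposition splits each ultrasonic echo into frequency sub-bands, and the per-band energies form the feature vector fed to the classifier. The sketch below is a minimal illustration, not the authors' implementation; the hand-rolled Haar packet transform, the synthetic echo shapes, and the `haar_packet_energies` helper are all assumptions.

```python
import numpy as np

def haar_packet_energies(signal, levels=3):
    """Recursively split a 1-D signal with the Haar filter pair and
    return the energy of each leaf band (a 2**levels feature vector)."""
    bands = [np.asarray(signal, dtype=float)]
    for _ in range(levels):
        next_bands = []
        for b in bands:
            if len(b) % 2:                      # pad odd-length bands
                b = np.append(b, 0.0)
            approx = (b[0::2] + b[1::2]) / np.sqrt(2)   # low-pass half
            detail = (b[0::2] - b[1::2]) / np.sqrt(2)   # high-pass half
            next_bands += [approx, detail]
        bands = next_bands
    return np.array([np.sum(b ** 2) for b in bands])

# toy ultrasonic echoes: a smooth pulse vs. one with a high-frequency burst
t = np.linspace(0, 1, 256)
clean = np.exp(-((t - 0.5) ** 2) / 0.01)
flawed = clean + 0.3 * np.sin(2 * np.pi * 60 * t) * (np.abs(t - 0.5) < 0.05)

f_clean = haar_packet_energies(clean)
f_flawed = haar_packet_energies(flawed)
print(f_clean.shape)  # (8,) — 2**3 leaf-band energies
```

Because the Haar pair is orthonormal, the band energies sum to the signal energy, and the burst in the flawed echo shows up as extra energy in the higher-frequency bands — exactly the kind of characteristic quantity a BP network can separate.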


2021, Vol 63 (3), pp. 141-145
Author(s): M Mirzapour, A Movafeghi, E Yahaghi

Non-destructive confirmation of seamless welding is of critical importance in most applications and digital industrial radiography (DIR) is often the method of choice for internal flaw detection. DIR images often suffer from fogginess, limiting the inspection of flawed regions in online and quantitative applications. Much focus has therefore been put on denoising and image fog removal to yield better outcomes. One of the methods most widely used to improve the image is the fast and flexible denoising convolutional neural network (FFCN). This method has been shown to offer excellent image quality performance combined with fast execution and computing efficiency. In this study, the FFCN image processing technique is implemented and applied to radiographic images of welded objects. Enhancement of defect detection is achieved through sharpening of the image feature edges, leading to improved quantification in weld flaw sizing. The method is applied to the radiographic images using the weighted subtraction method. Experienced radiographers find that the weld defect detail is better visualised with output images from the FFCN algorithm compared to the original radiographs. Improvement in weld flaw size quantification is evaluated using test objects and the distance between the first two lines of the image quality indicator (IQI). The results show that the applied algorithm enhances the visualisation of internal defects and increases the detectability of fine fractures in the welded region. It is also found that, by selective image contrast enhancement near the flaw edges, flaw size quantification is improved significantly. The algorithm is found to be efficient, enabling online automated implementation on standard personal computers.
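The weighted-subtraction step can be illustrated in isolation. In the toy sketch below (not the paper's code), a simple mean filter stands in for the FFCN denoiser, and the sharpened image is formed as `(1 + w) * img - w * smooth(img)`; the weight `w` and both helper functions are assumptions for illustration.

```python
import numpy as np

def box_blur(img, k=3):
    """Simple mean filter, standing in here for the FFCN denoiser."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def weighted_subtraction(img, w=0.7):
    """Sharpen by subtracting a weighted low-pass version:
    out = (1 + w) * img - w * smooth(img), clipped to [0, 1]."""
    smooth = box_blur(img)
    return np.clip((1 + w) * img - w * smooth, 0.0, 1.0)

# synthetic radiograph: uniform weld region with a faint step edge
img = np.full((32, 32), 0.6)
img[:, 16:] = 0.4                    # step standing in for a flaw boundary
sharp = weighted_subtraction(img)

# the step across columns 15/16 is amplified after weighted subtraction
print(sharp[0, 15] - sharp[0, 16] > img[0, 15] - img[0, 16])  # True
```

Flat regions pass through unchanged (the low-pass of a constant is the constant itself), so the contrast boost concentrates at flaw edges — the selective enhancement the abstract describes.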


2020
Author(s): S Kashin, D Zavyalov, A Rusakov, V Khryashchev, A Lebedev

2020, Vol 2020 (10), pp. 181-1-181-7
Author(s): Takahiro Kudo, Takanori Fujisawa, Takuro Yamaguchi, Masaaki Ikehara

Image deconvolution has recently been an important issue. It has two kinds of approaches: non-blind and blind. Non-blind deconvolution is a classic image deblurring problem that assumes the PSF is known and spatially invariant. Recently, Convolutional Neural Networks (CNNs) have been used for non-blind deconvolution. Although CNNs can deal with complex degradations of unknown images, some conventional CNN-based methods can only handle small PSFs and do not consider the large PSFs encountered in the real world. In this paper we propose a non-blind deconvolution framework based on a CNN that can remove large-scale ringing from a deblurred image. Our method has three key points. The first is that our network architecture preserves both large and small features in the image. The second is that the training dataset is created so as to preserve details. The third is that we extend the images to minimize the effect of large ringing at the image borders. In our experiments, we used three kinds of large PSFs and observed high-precision results from our method both quantitatively and qualitatively.
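The third point — extending the image before deconvolution so boundary ringing falls outside the cropped result — can be sketched with a classical frequency-domain restorer standing in for the CNN. The code below is a minimal illustration under that assumption; `wiener_deconv`, the reflect padding, the pad width, and the regularization constant `k` are hypothetical choices, not the paper's method.

```python
import numpy as np

def wiener_deconv(blurred, psf, k=1e-3):
    """Frequency-domain Wiener-style deconvolution with regularizer k.
    Assumes the blur was circular convolution with the corner-placed psf."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    F = np.conj(H) / (np.abs(H) ** 2 + k) * G
    return np.real(np.fft.ifft2(F))

def deconv_with_extension(blurred, psf, pad=16, k=1e-3):
    """Reflect-pad the observation before deconvolving so that boundary
    ringing lands in the extension, then crop back to the original size."""
    ext = np.pad(blurred, pad, mode="reflect")
    restored = wiener_deconv(ext, psf, k)
    return restored[pad:-pad, pad:-pad]

# smooth test scene blurred by a separable 3x3 kernel (circular model)
y, x = np.mgrid[0:64, 0:64]
img = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 100.0)
psf = np.outer([0.25, 0.5, 0.25], [0.25, 0.5, 0.25])
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))

restored = wiener_deconv(blurred, psf)         # plain restoration
cropped = deconv_with_extension(blurred, psf)  # extended-then-cropped
```

The padding buys a margin in which the inverse filter's oscillations can decay; cropping then discards that margin, which is the same intuition as the border extension in the proposed framework.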

