Sensing and Control of Weld Pool Geometry for Automated GTA Welding

1995, Vol 117 (2), pp. 210-222
Author(s): R. Kovacevic, Y. M. Zhang, S. Ruan

Weld pool geometry is a crucial factor in determining weld quality, especially in sheet welding, so its feedback control is a fundamental requirement for automated welding. However, precise real-time measurement of pool geometry is difficult. Vision sensing has been shown to be a promising approach for monitoring weld pool geometry. Quality images that can be processed in real time to detect the pool geometry are acquired using a high-shutter-speed camera assisted by a nitrogen laser as an illumination source. During practical welding, however, impurities or oxides on the pool surface complicate image processing. The image features are analyzed and exploited to process the image effectively. It is shown that the proposed algorithm can reliably detect the pool boundary with sufficient accuracy in less than 100 ms. Based on this measurement technique, a robust adaptive system has been developed to control the pool area. Experiments show that the proposed control system can reject the influence of various disturbances.
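The closed loop described above can be illustrated with a minimal sketch: a discrete controller samples the measured pool area every 100 ms (matching the paper's image-processing time) and adjusts the welding current to hold a target area despite a disturbance. The toy plant model, PI structure, and gains below are illustrative assumptions, not the authors' adaptive controller.

```python
# Minimal sketch of pool-area feedback control (assumed plant and gains,
# not the paper's adaptive scheme).

def measure_pool_area(current, disturbance=0.0):
    # Toy plant: pool area (mm^2) grows roughly linearly with current (A).
    return 0.25 * current + disturbance

def control_pool_area(target_area, steps=200, kp=0.8, ki=0.3, dt=0.1):
    """Discrete PI loop sampled every dt seconds (100 ms per image)."""
    current = 100.0   # initial welding current, A (assumed)
    integral = 0.0
    for _ in range(steps):
        area = measure_pool_area(current, disturbance=2.0)  # constant disturbance
        error = target_area - area
        integral += error * dt
        current += kp * error + ki * integral
    return measure_pool_area(current, disturbance=2.0)

# The integral action drives the steady-state area error toward zero
# even though the disturbance shifts the plant.
final_area = control_pool_area(target_area=30.0)
```

The point of the sketch is the structure, not the numbers: the 100 ms measurement latency sets the sampling period, and any disturbance rejection must come from the feedback law.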

2019, Vol 98 (12), pp. 379s-386s
Author(s): Zongyao Chen, Jian Chen, Zhili Feng

Monitoring weld pool geometry without an auxiliary light source remains challenging because of interference from the intense arc light. In this work, a new software framework was developed to measure key features of the three-dimensional (3D) weld pool geometry from two-dimensional (2D) passive-vision images. It was found that arc-light interference in the weld pool image can be effectively controlled by adjusting the camera exposure time based on the decision of a machine learning classifier. Weld pool width, trailing length, and surface height (SH) were calculated in real time, and the results agreed with measurements of the weld bead geometry. The method presented here establishes a foundation for real-time penetration monitoring and control.
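The exposure-adjustment idea can be sketched as a classify-then-step loop: label each frame as under-, well-, or over-exposed, then step the exposure time toward usable frames. The intensity thresholds and multiplicative step below are illustrative assumptions; the paper uses a trained machine learning classifier rather than fixed thresholds.

```python
# Hedged sketch of exposure-time adjustment against arc-light saturation
# (fixed thresholds stand in for the paper's ML classifier).

def classify_exposure(mean_intensity, low=60, high=180):
    """Label a frame by its mean 8-bit intensity (assumed thresholds)."""
    if mean_intensity < low:
        return "under"
    if mean_intensity > high:
        return "over"   # arc light is saturating the sensor
    return "ok"

def adjust_exposure(exposure_us, label, step=1.5):
    """Multiplicatively step the exposure time toward usable frames."""
    if label == "under":
        return exposure_us * step
    if label == "over":
        return exposure_us / step
    return exposure_us

exposure = 2000.0  # microseconds, assumed starting point
for frame_mean in [230, 210, 150, 140]:  # bright arc ignition, then steady state
    exposure = adjust_exposure(exposure, classify_exposure(frame_mean))
```

The multiplicative step keeps the adjustment scale-free, which matters because usable exposure times can span orders of magnitude between arc ignition and steady-state welding.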


2018, Vol 256, pp. 57-68
Author(s): J.K. Huang, M.H. Yang, J.S. Chen, F.Q. Yang, Y.M. Zhang, ...

2005, Vol 38 (1), pp. 301-306
Author(s): Wei Lu, Yu-Ming Zhang, Chuan Zhang, Bruce L. Walcott

2020, Vol 39 (9), pp. 1122-1137
Author(s): Dejun Guo, Kam K Leang

This article focuses on enabling an aerial robot to fly through multiple openings at high speed using image-based estimation, planning, and control. State-of-the-art approaches assume that the robot's global translational variables (e.g., position and velocity) can either be measured directly with external localization sensors or estimated onboard. Unfortunately, estimating the translational variables may be impractical because modeling errors and sensor noise can lead to poor performance. Furthermore, monocular-camera-based pose estimation techniques typically require a model of the gap (window) to handle the unknown scale. Herein, a new scheme for image-based estimation, aggressive-maneuvering trajectory generation, and motion control is developed for multi-rotor aerial robots. The approach described does not rely on measurement of the translational variables and does not require a model of the gap or window. First, the robot dynamics are expressed in terms of image features that are invariant to rotation (invariant features). This step decouples the robot's attitude and keeps the invariant features in the flat output space of the differentially flat system. Second, an optimal trajectory is efficiently generated in real time to obtain a dynamically feasible trajectory for the invariant features. Finally, a controller is designed to enable real-time, image-based tracking of the trajectory. The performance of the estimation, planning, and control scheme is validated in simulations and through 80 successful experimental trials. Results show the ability to successfully fly through two narrow openings, where the estimation and planning computation and motion control from one opening to the next are performed in real time on the robot.
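The trajectory-generation step can be illustrated with a much simpler stand-in: a smooth point-to-point profile for a single feature coordinate, of the kind one might track in the flat output space. The quintic (minimum-jerk) polynomial below, with zero boundary velocity and acceleration, is a common textbook choice and only an assumed stand-in for the paper's real-time optimal trajectory generation.

```python
# Hedged sketch: smooth point-to-point trajectory for one feature
# coordinate (quintic profile, not the paper's optimal planner).

def quintic_trajectory(f0, f1, T):
    """Return f(t) interpolating f0 -> f1 over [0, T] with zero
    boundary velocity and acceleration (minimum-jerk profile)."""
    d = f1 - f0
    def f(t):
        s = t / T
        return f0 + d * (10 * s**3 - 15 * s**4 + 6 * s**5)
    return f

traj = quintic_trajectory(0.0, 1.0, 2.0)  # feature moves 0 -> 1 in 2 s
# traj(0.0) == 0.0, traj(2.0) == 1.0, traj(1.0) == 0.5 by symmetry
```

For a differentially flat system, a sufficiently smooth profile like this in the flat outputs can be mapped back to dynamically feasible states and inputs, which is why planning directly in the invariant-feature space is attractive.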


2015
Author(s): C. Stolz, N. Coniglio, A. Mathieu, O. Aubreton
