Calibration Method Based on General Imaging Model for Micro-Object Measurement System

2016 ◽  
Vol 36 (9) ◽  
pp. 0912003
Author(s):  
孔玮琦 Kong Weiqi ◽  
刘京南 Liu Jingnan ◽  
达飞鹏 Da Feipeng ◽  
饶立 Rao Li
2012 ◽  
Vol 588-589 ◽  
pp. 1337-1340
Author(s):  
Y.X. Zhu ◽  
X.S. Duan

For the pose measurement of a cannon barrel, a vision method based on a checkered plane has been proposed. To test and improve the precision of this new method without considering hardware errors and other intractable objective factors, the imaging model of the marker (the checkered plane) is derived from the motion model of the cannon barrel and the marker's position relative to it, using a variable-controlling method. A computer simulation platform for the cannon barrel pose vision measurement system is established based on C++ Builder. The simulation experiments validate the accuracy and reliability of the method.
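A minimal sketch of the kind of simulation such a platform performs, assuming a simple pinhole camera and a checkered marker rigidly attached to the barrel. It is not the authors' simulator; the intrinsics, elevation angle, marker offset, and grid pitch below are all hypothetical.

#include <array>
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

// Stand-in for the barrel motion model: rotate the marker point about the
// Y axis (elevation) and translate it by the assumed marker offset.
Vec3 transform(const Vec3& p, double elevRad, const Vec3& t) {
    double c = std::cos(elevRad), s = std::sin(elevRad);
    return { c * p.x + s * p.z + t.x,
             p.y               + t.y,
            -s * p.x + c * p.z + t.z };
}

int main() {
    const double kPi = 3.14159265358979;
    const double fx = 1200.0, fy = 1200.0, cx = 640.0, cy = 480.0; // assumed intrinsics
    const double elev = 5.0 * kPi / 180.0;                         // assumed barrel elevation
    const Vec3 t{0.05, 0.0, 2.0};                                  // assumed marker offset (m)

    // 3x3 grid of checker corners on the marker plane, 20 mm pitch.
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) {
            Vec3 pc = transform({i * 0.02, j * 0.02, 0.0}, elev, t);
            double u = fx * pc.x / pc.z + cx;   // pinhole projection
            double v = fy * pc.y / pc.z + cy;
            std::printf("corner(%d,%d) -> (%.1f, %.1f) px\n", i, j, u, v);
        }
    return 0;
}

Comparing such projected image coordinates against the pose recovered by the measurement algorithm is how a simulation platform can isolate algorithmic error from hardware error.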


1999 ◽  
Author(s):  
Chunhe Gong ◽  
Jingxia Yuan ◽  
Jun Ni

Abstract Robot calibration plays an increasingly important role in manufacturing. For robot calibration on the manufacturing floor, it is desirable that the calibration technique be easy and convenient to implement. This paper presents a new self-calibration method to calibrate and compensate for robot system kinematic errors. Compared with traditional calibration methods, this calibration method has several unique features. First, it is not necessary to apply an external measurement system to measure the robot end-effector position for kinematic identification, since the robot measurement system has a sensor as its integral part. Second, this self-calibration is based on distance measurement rather than absolute position measurement for kinematic identification; therefore the calibration of the transformation from the world coordinate system to the robot base coordinate system, known as base calibration, is not necessary. These features not only greatly facilitate robot system calibration but also shorten the error propagation chain and therefore increase the accuracy of parameter estimation. An integrated calibration system is designed to validate the effectiveness of this calibration method. Experimental results show that after calibration there is a significant improvement in robot accuracy over a typical robot workspace.
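A minimal sketch of the distance-based identification idea described above (not the paper's algorithm): because only distances between end-effector positions enter the residual, the world-to-base transformation drops out. The arm model here is a planar 2R linkage, and the link lengths, joint angles, and "measured" distances are all hypothetical.

#include <array>
#include <cmath>
#include <cstdio>
#include <vector>

struct P2 { double x, y; };

// Planar 2R forward kinematics (base frame irrelevant for distances).
P2 fk(double l1, double l2, double q1, double q2) {
    return { l1 * std::cos(q1) + l2 * std::cos(q1 + q2),
             l1 * std::sin(q1) + l2 * std::sin(q1 + q2) };
}

int main() {
    const double l2 = 0.30;                 // assumed known link 2 length (m)
    const double l1_true = 0.412;           // "true" link 1, used only to fake measurements
    double l1 = 0.40;                       // nominal value to be identified

    // Joint-angle pairs (q1A, q2A, q1B, q2B) for pose pairs A and B.
    std::vector<std::array<double, 4>> pairs = {
        {0.1, 0.5, 0.9, -0.3}, {0.4, 1.0, 1.3, 0.2}, {-0.2, 0.7, 0.6, 1.1}};

    for (int it = 0; it < 10; ++it) {       // one-parameter Gauss-Newton on distance residuals
        double JtJ = 0.0, Jtr = 0.0;
        for (auto& c : pairs) {
            auto dist = [&](double l) {
                P2 a = fk(l, l2, c[0], c[1]), b = fk(l, l2, c[2], c[3]);
                return std::hypot(a.x - b.x, a.y - b.y);
            };
            P2 a = fk(l1_true, l2, c[0], c[1]), b = fk(l1_true, l2, c[2], c[3]);
            double measured = std::hypot(a.x - b.x, a.y - b.y);   // simulated sensor reading
            double r = measured - dist(l1);
            double J = (dist(l1 + 1e-6) - dist(l1)) / 1e-6;       // numeric Jacobian
            JtJ += J * J;  Jtr += J * r;
        }
        l1 += Jtr / JtJ;
    }
    std::printf("identified l1 = %.4f m (true %.4f m)\n", l1, l1_true);
    return 0;
}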


2018 ◽  
Vol 38 (12) ◽  
pp. 1215002 ◽  
Author(s):  
吴庆华 Wu Qinghua ◽  
陈慧 Chen Hui ◽  
朱思斯 Zhu Sisi ◽  
周阳 Zhou Yang ◽  
万偲 Wan Cai

2019 ◽  
Vol 39 (6) ◽  
pp. 0612003
Author(s):  
马冬晓 Dongxiao Ma ◽  
汪家春 Jiachun Wang ◽  
陈宗胜 Zongsheng Chen ◽  
王冰 Bing Wang ◽  
刘洋 Yang Liu

2019 ◽  
Vol 46 (1) ◽  
pp. 0104003
Author(s):  
马国庆 Ma Guoqing ◽  
刘丽 Liu Li ◽  
于正林 Yu Zhenglin ◽  
曹国华 Cao Guohua ◽  
王强 Wang Qiang

2014 ◽  
Vol 644-650 ◽  
pp. 1234-1239
Author(s):  
Tao He ◽  
Yu Lang Xie ◽  
Cai Sheng Zhu ◽  
Jiu Yin Chen

This paper explains and demonstrates how to design a dimensional measurement system based on linear structured light vision; the system realizes high-precision, fast measurement of the dimensions of mechanical parts together with accurate calibration of the system. First, an experimental platform based on linear structured light vision measurement is set up. Second, the measurement model of the system is established, a new calibration method for the structured light sensor is proposed, and the mathematical model of sensor calibration is built. This calibration method only needs a few high-precision gauge blocks as the target, the target position has no strict requirements, and the solving process is more convenient, making the method easier to use and maintain in the field. Finally, the measuring accuracy of the system is verified with high-precision gauge blocks; the experiments show that the measurement accuracy is within 0.050 mm over a depth range of 0-80 mm. The system can satisfy the precision-testing demands of most industrial parts; with its simple calibration process and high precision, it is well suited to structured light vision calibration.
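A minimal sketch of how gauge blocks can calibrate a structured light sensor, assuming a simple linear mapping from laser-stripe pixel displacement to metric height; this is not the paper's calibration model, and the pixel readings and gauge heights below are hypothetical.

#include <cstdio>
#include <utility>
#include <vector>

int main() {
    // (stripe displacement in pixels, gauge block height in mm)
    std::vector<std::pair<double, double>> samples = {
        {41.8, 10.0}, {83.1, 20.0}, {124.9, 30.0}, {208.3, 50.0}, {332.7, 80.0}};

    // Least-squares fit of h = a * d + b from the gauge-block observations.
    double sd = 0, sh = 0, sdd = 0, sdh = 0;
    for (auto& s : samples) {
        sd  += s.first;            sh  += s.second;
        sdd += s.first * s.first;  sdh += s.first * s.second;
    }
    const double n = static_cast<double>(samples.size());
    const double a = (n * sdh - sd * sh) / (n * sdd - sd * sd);
    const double b = (sh - a * sd) / n;

    const double d_part = 166.4;   // stripe displacement measured on a test part (px)
    std::printf("calibration: h = %.4f * d + %.3f\n", a, b);
    std::printf("estimated part height: %.3f mm\n", a * d_part + b);
    return 0;
}

Because only the block thicknesses need to be known precisely, the blocks can be placed freely in the working volume, which is what makes this style of calibration convenient for field use.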

