A Stereo Matching Method for Three-Dimensional Eye Localization of Autostereoscopic Display

2021 ◽  
pp. 28-43
Author(s):  
Bangpeng Xiao ◽  
Shenyuan Ye ◽  
Xicai Li ◽  
Min Li ◽  
Lingyu Zhang ◽  
...  
2013 ◽  
Vol 748 ◽  
pp. 624-628
Author(s):  
Zhu Lin Li

A graded stereo matching algorithm based on edge feature points is proposed. Its basic idea is as follows: first, edge feature points are extracted from the image pair; then, gradient invariance and singular-eigenvalue invariance are analyzed to build a two-stage stereo matching method, from which the fundamental matrix is solved; finally, a third-stage stereo matching is completed under the guidance of the fundamental matrix. The results indicate that the algorithm improves matching precision from 58.3% to 73.2%; it is simple and practical, and is useful for object recognition, tracking, and three-dimensional reconstruction.
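The fundamental-matrix guidance in the third stage can be illustrated with a small sketch: a candidate match is kept only when the right-image point lies close to the epipolar line induced by the left-image point. The function names, tolerance, and matching strategy below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def epipolar_distance(F, x_left, x_right):
    """Distance of a right-image point from the epipolar line of a
    left-image point. F is the 3x3 fundamental matrix; points are in
    homogeneous coordinates. The line is l' = F @ x_left, and the
    point-line distance is |x_right . l'| / sqrt(l'_1^2 + l'_2^2)."""
    l = F @ x_left
    return abs(x_right @ l) / np.hypot(l[0], l[1])

def guided_match(F, left_pts, right_pts, tol=1.0):
    """For each left edge point, keep the right candidate closest to
    its epipolar line, provided it lies within tol pixels of it."""
    matches = []
    for i, xl in enumerate(left_pts):
        d = [epipolar_distance(F, xl, xr) for xr in right_pts]
        j = int(np.argmin(d))
        if d[j] < tol:
            matches.append((i, j))
    return matches
```

For a rectified pair (horizontal epipolar lines), F reduces to the form below, and the distance is simply the row difference between the two points.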


2004 ◽  
Vol 261-263 ◽  
pp. 1593-1598
Author(s):  
M. Tanaka ◽  
Y. Kimura ◽  
A. Kayama ◽  
L. Chouanine ◽  
Reiko Kato ◽  
...  

A computer program for fractal analysis by the box-counting method was developed to estimate the fractal dimension of three-dimensional fracture surfaces reconstructed by the stereo matching method. Image reconstruction and fractal analysis were then performed on the fracture surfaces of materials fractured by different mechanisms. For ceramics and metals, the fractal dimension of the three-dimensional fracture surface correlated with the fractal dimensions evaluated by other methods. The effects of microstructure on the fractal dimension were also discussed experimentally.
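The box-counting estimate can be sketched as follows. This minimal version, an assumption for illustration rather than the authors' program, works on a 2-D binary set whose side is a power of two; the paper applies the same idea to reconstructed 3-D fracture surfaces.

```python
import numpy as np

def box_counting_dimension(mask):
    """Estimate the fractal dimension of a binary 2-D set by box
    counting. `mask` is a square boolean array whose side is a power
    of two. For each box side s, N(s) is the number of s-by-s boxes
    containing at least one True pixel; the dimension is the slope of
    log N(s) versus log(1/s), fitted by least squares."""
    n = mask.shape[0]
    sizes, counts = [], []
    s = n // 2
    while s >= 1:
        # Count occupied s-by-s boxes via a block-wise any() reduction.
        blocks = mask.reshape(n // s, s, n // s, s).any(axis=(1, 3))
        sizes.append(s)
        counts.append(blocks.sum())
        s //= 2
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)),
                          np.log(np.array(counts)), 1)
    return slope
```

Sanity checks behave as expected: a filled square yields a dimension of 2, and a single straight line of pixels yields 1.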


2003 ◽  
Vol 43 (9) ◽  
pp. 1453-1460 ◽  
Author(s):  
Manabu Tanaka ◽  
Yosuke Kimura ◽  
Lotfi Chouanine ◽  
Junnosuke Taguchi ◽  
Ryuichi Kato

2011 ◽  
Vol 5 (6) ◽  
pp. 924-931 ◽  
Author(s):  
Kenji Terabayashi ◽  
Yuma Hoshikawa ◽  
Alessandro Moro ◽  
Kazunori Umeda ◽  
...  

The combination of subtraction stereo with shadow detection that we propose improves people tracking in stereoscopic environments. Subtraction stereo is a stereo matching method that is fast and robust against the correspondence problem, one of the most serious issues in computer vision, because it restricts the matching search range to foreground regions. Shadow detection yields adequate foreground regions for tracked people by removing cast shadows, which leads to accurate three-dimensional measurement of their positions. By focusing on the disparity images obtained by subtraction stereo, people can be detected easily using standard labeling. Objects can also be measured directly in size by subtraction stereo, without geometric information about the environment, which makes the tracking system easy to install. To track multiple passing people, we use an extended Kalman filter to address the occlusion problem usually encountered in crowded environments. The proposed method is verified by experiments in unknown stereoscopic environments.
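The core idea of subtraction stereo, restricting block matching to foreground pixels obtained by background subtraction, can be sketched minimally as below. The SAD cost, threshold, window size, and disparity range are assumptions for illustration, not the authors' parameters, and shadow detection is omitted.

```python
import numpy as np

def subtraction_stereo(left, right, left_bg, right_bg,
                       max_disp=16, win=2, fg_thresh=20):
    """Sketch of subtraction stereo: background subtraction yields
    foreground masks, and SAD block matching runs only on foreground
    pixels, shrinking the search and suppressing background
    mismatches. Returns a disparity map with -1 where no foreground
    match was found."""
    fg_l = np.abs(left.astype(int) - left_bg.astype(int)) > fg_thresh
    fg_r = np.abs(right.astype(int) - right_bg.astype(int)) > fg_thresh
    h, w = left.shape
    disp = -np.ones((h, w), int)
    for y in range(win, h - win):
        for x in range(win + max_disp, w - win):
            if not fg_l[y, x]:
                continue
            block = left[y-win:y+win+1, x-win:x+win+1].astype(int)
            best, best_d = None, -1
            for d in range(max_disp + 1):
                if not fg_r[y, x - d]:  # candidate must be foreground too
                    continue
                cand = right[y-win:y+win+1, x-d-win:x-d+win+1].astype(int)
                cost = np.abs(block - cand).sum()
                if best is None or cost < best:
                    best, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

On a synthetic pair with a textured patch shifted by four pixels over an empty background, the disparity map recovers 4 on the patch and stays -1 on the background.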


Sensors ◽  
2021 ◽  
Vol 21 (19) ◽  
pp. 6444
Author(s):  
Junhui Mei ◽  
Xiao Yang ◽  
Zhenxin Wang ◽  
Xiaobo Chen ◽  
Juntong Xi

In this paper, a topology-based stereo matching method for 3D measurement using a single pattern of coded spot-array structured light is proposed. The spot-array pattern is designed with a central reference ring spot, and each spot in the pattern can be uniquely coded with row and column indexes along a predefined topological search path. A method using rectangular templates is proposed to find the encoded spots in the captured images when coding spots are missing, and an interpolation method is also proposed to rebuild the missing spots. Experimental results demonstrate that the proposed technique can exactly and uniquely decode each spot and establish the stereo matching relation, enabling three-dimensional (3D) reconstruction from a single shot.
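Rebuilding a missing spot by interpolation might look like the following sketch, assuming decoded spots are stored in a dictionary keyed by (row, column) indexes; the data structure, function, and averaging rule are hypothetical, not the paper's method.

```python
import numpy as np

def rebuild_missing_spot(grid, row, col):
    """Interpolate the image position of a missing coded spot from its
    decoded grid neighbours. `grid` maps (row, col) index pairs to 2-D
    pixel positions; the missing spot is estimated by averaging the
    midpoints of its horizontal and vertical neighbour pairs."""
    estimates = []
    for a, b in [((row, col - 1), (row, col + 1)),
                 ((row - 1, col), (row + 1, col))]:
        if a in grid and b in grid:
            estimates.append((np.asarray(grid[a]) + np.asarray(grid[b])) / 2.0)
    if not estimates:
        raise KeyError("no opposing neighbour pair decoded")
    return np.mean(estimates, axis=0)
```

On a regular 3-by-3 grid with the centre spot removed, the interpolation recovers the centre position exactly.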


Author(s):  
Kun Liu ◽  
Changhe Zhou ◽  
Shaoqing Wang ◽  
Shengbin Wei ◽  
Jianyong Ma ◽  
...  
