omnidirectional camera
Recently Published Documents

TOTAL DOCUMENTS: 219 (FIVE YEARS: 41)
H-INDEX: 18 (FIVE YEARS: 2)

Sensors ◽ 2021 ◽ Vol 21 (12) ◽ pp. 4008
Author(s): Xuanrui Gong, Yaowen Lv, Xiping Xu, Yuxuan Wang, Mengdi Li

The omnidirectional camera, with its advantage of a broadened field of view, achieves 360° imaging in the horizontal direction. However, because light is reflected from the mirror surface, the collinearity relation is altered and the imaged scene suffers severe nonlinear distortion, which makes estimating the pose of the omnidirectional camera more difficult. To solve this problem, we derive the mapping from the omnidirectional camera to a traditional camera and propose a linear imaging model for the omnidirectional camera. Based on this linear imaging model, we improve the EPnP algorithm to compute the omnidirectional camera pose. To validate the proposed solution, we conducted simulations and physical experiments; the results show that the algorithm is robust to noise.
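The paper's exact linear imaging model is not reproduced in the abstract, but the kind of omnidirectional-to-perspective mapping it describes can be illustrated with the widely used unified sphere model for catadioptric cameras: an omnidirectional image point is lifted to a viewing ray on the unit sphere, after which a conventional perspective pose solver (EPnP-style) can be applied. The mirror parameter `xi` and the example point below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def project_unified(X, xi):
    """Project a 3D point with the unified sphere model.

    The point is normalized onto the unit sphere, then projected from a
    center shifted by the mirror parameter xi along the z-axis.
    """
    xs = X / np.linalg.norm(X)          # point on the unit sphere
    return xs[:2] / (xs[2] + xi)        # normalized image coordinates

def lift_unified(m, xi):
    """Lift a normalized image point back to a unit-norm viewing ray.

    Inverts project_unified: recovers the sphere point, whose direction
    is what a conventional (pinhole/EPnP-style) pose solver consumes.
    """
    r2 = m[0]**2 + m[1]**2
    # Scale factor from the quadratic constraint ||xs|| = 1
    eta = (xi + np.sqrt(1.0 + (1.0 - xi**2) * r2)) / (r2 + 1.0)
    return np.array([eta * m[0], eta * m[1], eta - xi])

# Round trip: the lifted ray must be parallel to the original point.
xi = 0.9                                # hypothetical mirror parameter
X = np.array([1.0, 2.0, 3.0])
ray = lift_unified(project_unified(X, xi), xi)
print(np.allclose(ray, X / np.linalg.norm(X)))  # True
```

Once every image measurement is lifted to such a ray, the 2D-3D correspondences look like ordinary normalized perspective observations, which is what makes reusing a pinhole pose solver possible.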


2021 ◽ Vol 13 (10) ◽ pp. 1982
Author(s): Binhu Chai, Zhenzhong Wei

The mobile vision measurement system (MVMS) is widely used to measure location and attitude during aircraft takeoff and landing. Its on-site global calibration, which determines the transformation between the MVMS coordinate system and the local-tangent-plane coordinate system, is crucial for high-accuracy measurement. In this paper, several new ideas are proposed to realize the global calibration of the MVMS effectively. First, the MVMS is treated as azimuth- and pitch-measuring equipment with a virtual single image plane at focal length 1. Second, a new virtual omnidirectional camera model built from three mutually orthogonal image planes is put forward, which effectively resolves the magnification of global calibration error that occurs when the angle between the virtual single image plane and the view axis of the system becomes small. In addition, an expanded factorial linear method is proposed to solve the global calibration equations, which effectively restrains the influence of calibration data error. Experimental results with synthetic data verify the validity of the proposed method.
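The abstract does not spell out the virtual omnidirectional camera model, but its core idea — projecting each viewing direction onto whichever of three mutually orthogonal image planes it faces most directly, so no direction ever makes a vanishing angle with its chosen plane — can be sketched as follows. The plane-selection rule and coordinate conventions are illustrative assumptions.

```python
import numpy as np

def project_three_planes(d):
    """Project a viewing direction onto one of three orthogonal planes.

    Each virtual image plane sits at focal length 1 along one coordinate
    axis; the plane whose axis has the largest |component| along d is
    chosen, so the divisor is always at least ||d|| / sqrt(3) and the
    projection stays well conditioned for every direction.
    """
    d = np.asarray(d, dtype=float)
    axis = int(np.argmax(np.abs(d)))     # best-facing plane: 0=x, 1=y, 2=z
    others = [i for i in range(3) if i != axis]
    u, v = d[others] / d[axis]           # coordinates on that plane
    return axis, float(np.sign(d[axis])), (float(u), float(v))

# A direction nearly parallel to the z image plane would blow up in a
# single-plane model (division by d[2] ~ 0); here it simply lands on the
# x-facing plane instead.
axis, sign, uv = project_three_planes([1.0, 0.2, 0.01])
print(axis, uv)  # 0 (0.2, 0.01)
```

The divisor bound is what prevents the error magnification the paper mentions: in a single-plane model the normalized coordinates (and any error in them) grow without bound as the view direction approaches the plane, whereas here the dominant component is always a fixed fraction of the direction's norm.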


Author(s): Viet Dung Nguyen, Phuc Ngoc Pham, Xuan Bach Nguyen, Thi Men Tran, Minh Quan Nguyen

2021 ◽ pp. 900-909
Author(s): Davide Scaramuzza

2020 ◽ Vol 32 (6) ◽ pp. 1193-1199
Author(s): Shunya Tanaka, Yuki Inoue

An omnidirectional camera can simultaneously capture all-round (360°) environmental information, including the azimuth angle of a target object or person. By configuring a stereo set with two omnidirectional cameras, the azimuth angle of the target can be determined per camera from the images captured by the left and right cameras. The target person is localized in each image using a region-based convolutional neural network, and the distance is then measured from the parallax between the two azimuth angles.
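The distance-from-azimuth-parallax step described above amounts to intersecting two rays in the horizontal plane. A minimal sketch, assuming the two cameras sit on the x-axis separated by a known baseline and each azimuth is measured counterclockwise from that axis (conventions the abstract does not state):

```python
import math

def triangulate_azimuth(theta_l, theta_r, baseline):
    """Intersect two horizontal-plane rays given per-camera azimuths.

    Left camera at (0, 0), right camera at (baseline, 0); each azimuth
    is the ray angle measured counterclockwise from the +x axis. Solves
    t*(cos tl, sin tl) = (baseline, 0) + s*(cos tr, sin tr) for the
    target position (x, y).
    """
    denom = math.sin(theta_r - theta_l)
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; azimuths coincide")
    # Range along the left ray, from the 2x2 linear system (Cramer's rule)
    t = baseline * math.sin(theta_r) / denom
    return t * math.cos(theta_l), t * math.sin(theta_l)

# Target at (1, 2) with a 2 m baseline: azimuths as seen by each camera.
x, y = triangulate_azimuth(math.atan2(2, 1), math.atan2(2, -1), 2.0)
print(round(x, 6), round(y, 6))  # 1.0 2.0
```

The parallel-ray guard matters in practice: as the target recedes, the two azimuths converge and the range estimate `t` becomes increasingly sensitive to per-camera azimuth noise.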

