A computationally efficient affine-invariant feature for image matching with wide viewing angles

Optik ◽  
2021 ◽  
Vol 247 ◽  
pp. 167912
Author(s):  
Xiaomin Ma ◽  
Ye Yang ◽  
Yingmin Yi ◽  
Lei Zhu ◽  
Mian Dong

2012 ◽  
Vol 38 (4) ◽  
pp. 1023-1032 ◽  
Author(s):  
Liang Cheng ◽  
Manchun Li ◽  
Yongxue Liu ◽  
Wenting Cai ◽  
Yanming Chen ◽  
...  

2008 ◽  
Vol 5 (2) ◽  
pp. 246-250 ◽  
Author(s):  
Liang Cheng ◽  
Jianya Gong ◽  
Xiaoxia Yang ◽  
Chong Fan ◽  
Peng Han

PLoS ONE ◽  
2017 ◽  
Vol 12 (5) ◽  
pp. e0178090 ◽  
Author(s):  
Mingzhe Su ◽  
Yan Ma ◽  
Xiangfen Zhang ◽  
Yan Wang ◽  
Yuping Zhang

2016 ◽  
Vol 12 (12) ◽  
pp. 155014771668082
Author(s):  
Fanhuai Shi ◽  
Jian Gao ◽  
Xixia Huang

Visual sensor networks have emerged as an important class of sensor-based distributed intelligent systems, in which image matching is a key technology. This article presents an affine-invariant method for producing dense correspondences between uncalibrated wide-baseline images. Under affine transformations, both a point's location and its neighborhood texture change between views, which makes dense matching difficult. The proposed approach addresses this problem within a sparse-to-dense framework. The contribution of this article is threefold. First, a reliable sparse-matching strategy is proposed: affine-invariant features are extracted and matched, and these initial matches are then used as a spatial prior to produce additional sparse matches. Second, matches are propagated from sparse feature points to their neighboring pixels by region growing within an affine-invariant framework. Third, the remaining unmatched points are handled by a low-rank matrix recovery technique. Comparative experiments against existing methods show a significant improvement in the presence of large affine deformations.
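As a rough illustration of the sparse-matching stage described above (not the authors' implementation), the following Python sketch pairs ASIFT-style affine simulations with OpenCV SIFT and a ratio test. The tilt/rotation sampling, function names, and the 0.75 ratio threshold are illustrative assumptions; the spatial-prior expansion, region-growing propagation, and low-rank recovery steps are omitted.

```python
# A minimal sketch, assuming OpenCV (cv2) and NumPy are available.
import cv2
import numpy as np

def affine_simulations(image, tilts=(1.0, 1.4, 2.0), angles_per_tilt=4):
    """Yield a few ASIFT-style affine-warped views of the image, together with
    the 2x3 affine that maps original -> warped coordinates."""
    h, w = image.shape[:2]
    for t in tilts:
        n_angles = 1 if t == 1.0 else angles_per_tilt
        for k in range(n_angles):
            phi = 180.0 * k / n_angles
            # In-plane rotation followed by anisotropic scaling simulates a camera tilt.
            M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), phi, 1.0)
            rotated = cv2.warpAffine(image, M, (w, h))
            new_w = max(1, int(round(w / t)))
            warped = cv2.resize(rotated, (new_w, h))
            S = np.diag([new_w / w, 1.0])   # horizontal compression
            A = S @ M                       # full original -> warped affine (2x3)
            yield warped, A

def sparse_affine_matches(img1, img2, ratio=0.75):
    """Collect sparse SIFT matches across affine simulations of img1.
    img1, img2: 8-bit grayscale (or BGR) images.
    Returns a list of ((x1, y1), (x2, y2)) point pairs in original coordinates."""
    sift = cv2.SIFT_create()
    kp2, des2 = sift.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = []
    for warped, A in affine_simulations(img1):
        kp1, des1 = sift.detectAndCompute(warped, None)
        if des1 is None or des2 is None:
            continue
        for pair in matcher.knnMatch(des1, des2, k=2):
            if len(pair) < 2:
                continue
            m, n = pair
            # Lowe's ratio test keeps only distinctive correspondences.
            if m.distance < ratio * n.distance:
                x, y = kp1[m.queryIdx].pt
                # Map the keypoint from the simulated view back to img1 coordinates.
                A_inv = cv2.invertAffineTransform(A)
                x1, y1 = A_inv @ np.array([x, y, 1.0])
                matches.append(((float(x1), float(y1)), kp2[m.trainIdx].pt))
    return matches
```

In a sparse-to-dense pipeline such as the one summarized above, point pairs of this kind would then seed the affine-invariant region-growing propagation, with any remaining unmatched points handled afterwards by low-rank matrix recovery.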
