Sensors, 2021, Vol. 21 (5), 1841
Author(s): Leyuan Liu, Zeran Ke, Jiao Huo, Jingying Chen

Mainstream methods treat head pose estimation as a supervised classification/regression problem whose performance heavily depends on the accuracy of the ground-truth labels of the training data. However, accurate head pose labels are difficult to obtain in practice, owing to the lack of effective equipment and reasonable approaches for head pose labeling. In this paper, we propose a method that does not need to be trained with head pose labels; instead, it estimates head pose by matching keypoints between a reconstructed 3D face model and the 2D input image. The proposed method consists of two components: 3D face reconstruction and 3D–2D keypoint matching. In the 3D face reconstruction phase, a personalized 3D face model is reconstructed from the input head image using convolutional neural networks, which are jointly optimized by an asymmetric Euclidean loss and a keypoint loss. In the 3D–2D keypoint matching phase, an iterative optimization algorithm is proposed to efficiently match the keypoints between the reconstructed 3D face model and the 2D input image under the constraint of perspective transformation. The proposed method is extensively evaluated on five widely used head pose estimation datasets: Pointing’04, BIWI, AFLW2000, Multi-PIE, and Pandora. The experimental results demonstrate that the proposed method achieves excellent cross-dataset performance and surpasses most existing state-of-the-art approaches, with average MAEs of 4.78° on Pointing’04, 6.83° on BIWI, 7.05° on AFLW2000, 5.47° on Multi-PIE, and 5.06° on Pandora, even though the model is not trained on any of these five datasets.
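The abstract does not spell out the iterative 3D–2D matching algorithm itself, but the core idea of recovering head pose from correspondences between 3D model keypoints and 2D image keypoints under a perspective camera can be sketched with a standard Perspective-n-Point (PnP) solver. The snippet below is a minimal illustration, not the paper's method: the arrays `model_points_3d` and `image_points_2d` and the pinhole intrinsics are hypothetical placeholders, and OpenCV's `solvePnP` stands in for the paper's own iterative optimization.

```python
# Minimal sketch: head pose from matched 3D-2D keypoints via a PnP solver.
# This is NOT the paper's iterative algorithm; it only illustrates the
# underlying 3D-2D matching idea under a perspective camera model.
# model_points_3d, image_points_2d, and the intrinsics are hypothetical.
import numpy as np
import cv2


def estimate_head_pose(model_points_3d, image_points_2d, image_size):
    """Return rotation angles (degrees) from N matched keypoints.

    model_points_3d: (N, 3) keypoints on the reconstructed 3D face model.
    image_points_2d: (N, 2) corresponding keypoints in the input image.
    image_size:      (width, height) of the input image.
    """
    w, h = image_size
    # Approximate pinhole intrinsics: focal length ~ image width,
    # principal point at the image center, no lens distortion.
    camera_matrix = np.array([[w, 0, w / 2],
                              [0, w, h / 2],
                              [0, 0, 1]], dtype=np.float64)
    dist_coeffs = np.zeros(4)

    ok, rvec, tvec = cv2.solvePnP(
        model_points_3d.astype(np.float64),
        image_points_2d.astype(np.float64),
        camera_matrix, dist_coeffs,
        flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP failed to converge")

    # Rotation vector -> rotation matrix -> ZYX Euler angles.
    # How these map to head yaw/pitch/roll depends on the axis
    # convention of the 3D face model.
    R, _ = cv2.Rodrigues(rvec)
    sy = np.hypot(R[0, 0], R[1, 0])
    rx = np.degrees(np.arctan2(R[2, 1], R[2, 2]))  # about x
    ry = np.degrees(np.arctan2(-R[2, 0], sy))      # about y
    rz = np.degrees(np.arctan2(R[1, 0], R[0, 0]))  # about z
    return rx, ry, rz
```

In the pipeline described above, the 3D keypoints would come from the personalized face model reconstructed by the CNNs, and the pose is refined by the paper's iterative optimization under the perspective-transformation constraint rather than by an off-the-shelf PnP routine.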

