Age and Gender Classification Based on Deep Learning
2021, pp. 425-437. Author(s): Tejas Agarwal, Mira Andhale, Anand Khule, Rushikesh Borse

2018, Vol 275, pp. 448-461. Author(s): Mingxing Duan, Kenli Li, Canqun Yang, Keqin Li

2018, Vol 31 (10), pp. 5887-5900. Author(s): Barjinder Kaur, Dinesh Singh, Partha Pratim Roy

Sensors, 2020, Vol 20 (8), pp. 2424. Author(s): Md Atiqur Rahman Ahad, Thanh Trung Ngo, Anindya Das Antar, Masud Ahmed, Tahera Hossain, ...

Wearable sensor-based systems and devices have expanded into different application domains, especially healthcare. Automatic age and gender estimation has several important applications, and gait has been demonstrated to be a profound motion cue for such tasks. A gait-based age and gender estimation challenge was launched at the 12th IAPR International Conference on Biometrics (ICB), 2019. Eighteen teams from 14 countries initially registered for this competition, whose goal was to find smart approaches to age and gender estimation from sensor-based gait data. For this purpose, we employed a large wearable sensor-based gait dataset containing 745 subjects (357 females and 388 males), aged 2 to 78 years, in the training set and 58 subjects (19 females and 39 males) in the test set, covering several walking patterns. The gait data sequences were collected from three IMUZ sensors placed on a waist belt or at the top of a backpack. Ten teams submitted 67 solutions for age and gender estimation. This paper extensively analyzes the methods and the results achieved by the various approaches. Based on this analysis, we found that deep learning-based solutions led the competition compared with conventional handcrafted methods. The best result achieved a 24.23% prediction error for gender estimation and a 5.39 mean absolute error for age estimation, by employing an angle-embedded gait dynamic image and a temporal convolution network.
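The two evaluation metrics reported above are gender prediction error (percentage of misclassified subjects) and mean absolute error for age. As a minimal illustration (function names are hypothetical, not from the challenge toolkit), they can be computed as:

```python
def gender_error_rate(y_true, y_pred):
    """Percentage of gender labels predicted incorrectly."""
    wrong = sum(t != p for t, p in zip(y_true, y_pred))
    return 100.0 * wrong / len(y_true)

def age_mae(y_true, y_pred):
    """Mean absolute error of predicted ages, in years."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)
```

For example, one wrong gender label out of four gives `gender_error_rate([0, 1, 1, 0], [0, 1, 0, 0]) == 25.0`, and age predictions off by 2 and 1 years give `age_mae([10, 20], [12, 19]) == 1.5`.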
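The winning approach relies on a temporal convolution network (TCN), whose core building block is a dilated causal 1D convolution over the gait time series. The following NumPy sketch shows that single operation only; it is a rough illustration under the usual TCN definition, not the competition code:

```python
import numpy as np

def causal_dilated_conv1d(x, w, dilation=1):
    """Causal dilated convolution: y[t] = sum_j w[j] * x[t - j*dilation].

    x: (T,) input signal (e.g. one IMU channel); w: (k,) kernel.
    Left zero-padding keeps the output causal and the same length as x.
    """
    k = len(w)
    pad = dilation * (k - 1)
    xp = np.concatenate([np.zeros(pad), x])  # xp[i + pad] == x[i]
    return np.array([
        sum(w[j] * xp[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])
```

Stacking such layers with dilations 1, 2, 4, ... grows the receptive field exponentially, which is why TCNs can summarize long gait sequences with few layers.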

