Comparative analysis of augmented datasets performances of age invariant face recognition models

2021 ◽  
Vol 10 (3) ◽  
pp. 1356-1367
Author(s):  
Kennedy Okokpujie ◽  
Etinosa Noma-Osaghae ◽  
Samuel Ndueso John ◽  
Charles Ndujiuba ◽  
Imhade Princess Okokpujie

The popularity of face recognition systems has increased due to their non-invasive method of image acquisition, which has fostered their widespread application. Face ageing is one major factor that influences the performance of face recognition algorithms. In this study, the authors present a comparative study of the two most accepted and experimented-with face ageing datasets (FG-Net and MORPH II). These datasets were used to simulate age invariant face recognition (AIFR) models. Four types of noise were added to the two face ageing datasets at the preprocessing stage. The addition of noise at the preprocessing stage served as a data augmentation technique that increased the number of sample images available for deep convolutional neural network (DCNN) experimentation and improved both the proposed AIFR model and the trait-ageing feature extraction process. The proposed AIFR models are built on the pre-trained Inception-ResNet-v2 deep convolutional neural network architecture. On testing and comparing the models, the results revealed that FG-Net was more efficient than MORPH II, with margins of 0.15% in accuracy, 71% in loss, 39% in mean square error (MSE), and -0.63% in mean absolute error (MAE).
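Noise-based augmentation of the kind described above can be sketched as follows. The abstract does not name the four noise types used, so Gaussian, salt-and-pepper, speckle, and Poisson noise are assumed here as illustrative choices:

```python
import numpy as np

def augment_with_noise(image, rng=None):
    """Return four noisy copies of a grayscale image (values in [0, 1]).

    Gaussian, salt-and-pepper, speckle, and Poisson noise are assumed here;
    the study's actual four noise types are not specified in the abstract.
    """
    rng = np.random.default_rng(rng)
    out = {}

    # Additive Gaussian noise
    out["gaussian"] = np.clip(image + rng.normal(0.0, 0.05, image.shape), 0, 1)

    # Salt-and-pepper: flip a small fraction of pixels to pure black or white
    sp = image.copy()
    mask = rng.random(image.shape)
    sp[mask < 0.02] = 0.0
    sp[mask > 0.98] = 1.0
    out["salt_pepper"] = sp

    # Speckle (multiplicative) noise
    out["speckle"] = np.clip(image * (1 + rng.normal(0.0, 0.1, image.shape)), 0, 1)

    # Poisson (shot) noise, simulated at 8-bit intensity scale
    out["poisson"] = np.clip(rng.poisson(image * 255) / 255.0, 0, 1)

    return out
```

Each noisy copy joins the training set alongside the original, multiplying the number of samples available to the DCNN by five.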

2021 ◽  
pp. 1-10
Author(s):  
Gayatri Pattnaik ◽  
Vimal K. Shrivastava ◽  
K. Parvathi

Pests are a major threat to the economic growth of a country. Applying pesticides is the easiest way to control pest infestation. However, excessive use of pesticides is hazardous to the environment. The recent advances in deep learning have paved the way for early detection and improved classification of pests in tomato plants, which will benefit farmers. This paper presents a comprehensive analysis of 11 state-of-the-art deep convolutional neural network (CNN) models under three configurations: transfer learning, fine-tuning, and scratch learning. Training in the transfer learning and fine-tuning configurations starts from pre-trained weights, whereas random weights are used in scratch learning. In addition, the concept of data augmentation has been explored to improve performance. Our dataset consists of 859 tomato pest images from 10 categories. The results demonstrate that the highest classification accuracy of 94.87% was achieved by the DenseNet201 model in the transfer learning configuration with data augmentation.
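The three configurations differ only in where the weights come from and which weights are updated during training. A minimal numpy sketch of that initialisation logic follows; the matrices are hypothetical stand-ins for a real CNN backbone and classifier head:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a backbone pre-trained on a large dataset
pretrained_backbone = rng.normal(size=(16, 8))

def init_weights(config):
    """Return (backbone, head, train_backbone) for one of the three setups."""
    head = rng.normal(size=(8, 3)) * 0.01        # head is always freshly initialised
    if config == "transfer":      # frozen pre-trained backbone, train the head only
        return pretrained_backbone.copy(), head, False
    if config == "fine-tune":     # pre-trained backbone, train all weights
        return pretrained_backbone.copy(), head, True
    if config == "scratch":       # random backbone, train all weights
        return rng.normal(size=(16, 8)) * 0.01, head, True
    raise ValueError(config)
```

During training, gradient updates are applied to the backbone only when `train_backbone` is true; in a real framework this corresponds to freezing or unfreezing the backbone layers.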


2020 ◽  
Vol 12 (6) ◽  
pp. 1015 ◽  
Author(s):  
Kan Zeng ◽  
Yixiao Wang

Classification algorithms for automatically detecting sea surface oil spills from spaceborne Synthetic Aperture Radars (SARs) can usually be regarded as part of a three-step processing framework, which briefly includes image segmentation, feature extraction, and target classification. A Deep Convolutional Neural Network (DCNN), named the Oil Spill Convolutional Network (OSCNet), is proposed in this paper for SAR oil spill detection; it performs the latter two steps of this framework. Based on VGG-16, the OSCNet is obtained by designing the architecture and adjusting hyperparameters with a data set of SAR dark patches. With the help of this large data set, containing more than 20,000 SAR dark patches, and data augmentation, the OSCNet can have as many as 12 weight layers. It is a relatively deep Deep Learning (DL) network for SAR oil spill detection. Experiments on the same data set show that the classification performance of OSCNet is significantly improved compared to that of traditional machine learning (ML). The accuracy, recall, and precision are improved from 92.50%, 81.40%, and 80.95% to 94.01%, 83.51%, and 85.70%, respectively. An important reason for this improvement is that the distinguishability of the features OSCNet learns itself from the data set is significantly higher than that of the hand-crafted features needed by traditional ML algorithms. In addition, experiments show that data augmentation plays an important role in avoiding over-fitting and hence improves the classification performance. OSCNet has also been compared with other DL classifiers for SAR oil spill detection; because the data sets differ greatly, the comparison is limited to similarities and differences at the level of principles.
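The reported accuracy, recall, and precision follow the standard confusion-matrix definitions, with oil spill taken as the positive class; a plain-Python sketch:

```python
def binary_metrics(y_true, y_pred, positive=1):
    """Accuracy, recall, and precision for a binary classifier
    (here: oil spill = positive class vs. look-alike dark patch)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = len(y_true) - tp - fp - fn
    return {
        "accuracy": (tp + tn) / len(y_true),
        "recall": tp / (tp + fn) if tp + fn else 0.0,     # of real spills, how many were found
        "precision": tp / (tp + fp) if tp + fp else 0.0,  # of flagged patches, how many were real
    }
```

The gap between recall and precision in the reported numbers reflects the classic trade-off for dark-patch classification: look-alikes (e.g. low-wind areas) make false positives easy, while faint slicks make false negatives easy.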


Symmetry ◽  
2019 ◽  
Vol 11 (2) ◽  
pp. 256 ◽  
Author(s):  
Jiangyong An ◽  
Wanyi Li ◽  
Maosong Li ◽  
Sanrong Cui ◽  
Huanran Yue

Drought stress seriously affects crop growth, development, and grain production. Existing machine learning methods have achieved great progress in drought stress detection and diagnosis. However, such methods are based on a hand-crafted feature extraction process, and their accuracy leaves much room for improvement. In this paper, we propose the use of a deep convolutional neural network (DCNN) to identify and classify maize drought stress. Field drought stress experiments were conducted in 2014. The experiment was divided into three treatments: optimum moisture, light drought, and moderate drought stress. Maize images were captured by digital cameras every two hours throughout the day. To compare against the accuracy of the DCNN, a comparative experiment was conducted using traditional machine learning on the same dataset. The experimental results demonstrated an impressive performance of the proposed method. For the total dataset, the accuracy of drought stress identification and classification was 98.14% and 95.95%, respectively. High accuracy was also achieved on the sub-datasets of the seedling and jointing stages. The identification and classification accuracy of the color images was higher than that of the gray images. Furthermore, the comparison experiments on the same dataset demonstrated that the DCNN achieved better performance than the traditional machine learning method (Gradient Boosting Decision Tree, GBDT). Overall, our proposed deep learning-based approach is a very promising method for field maize drought identification and classification based on digital images.
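The hand-crafted features consumed by a GBDT baseline are not specified in the abstract; a per-channel colour histogram is one common example of such a feature, and also illustrates why colour images can outperform gray ones (three channels carry more drought-related cues than one). A numpy sketch:

```python
import numpy as np

def color_histogram_features(image, bins=8):
    """Reduce an RGB image (H, W, 3), values in [0, 1], to a per-channel
    colour-histogram feature vector -- the kind of hand-crafted feature a
    GBDT baseline consumes, in contrast to the features a DCNN learns itself.
    """
    feats = []
    for c in range(image.shape[-1]):
        hist, _ = np.histogram(image[..., c], bins=bins, range=(0.0, 1.0))
        feats.append(hist / hist.sum())   # normalise so images of any size compare
    return np.concatenate(feats)          # length = 3 * bins
```

A grayscale variant would collapse the three channels into one histogram, discarding the hue shifts (e.g. leaf yellowing) that drought stress produces.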


Author(s):  
Kun Xu ◽  
Shunming Li ◽  
Jinrui Wang ◽  
Zenghui An ◽  
Yu Xin

Deep learning methods are gradually being applied in the field of mechanical equipment fault diagnosis because they can learn complex and useful features automatically from vibration signals. Among the many intelligent diagnostic models, the convolutional neural network has gradually been applied to intelligent fault diagnosis of bearings due to its advantages of local connection and weight sharing. However, there are still some drawbacks. (1) The training process of a convolutional neural network is slow and unstable, and the network has many training parameters. (2) It cannot perform well under different working conditions, such as noisy environments and different workloads. In this paper, a novel model named the adaptive and fast convolutional neural network with wide receptive field is presented to overcome the aforementioned deficiencies. The prime innovations include the following. First, a deep convolutional neural network architecture is constructed using the scaled exponential linear unit (SELU) activation function and global average pooling. The model has fewer training parameters and can converge rapidly and stably. Second, the model has a wide receptive field built from two medium-length and three small-length convolutional kernels. Compared with other models, it also achieves high diagnostic accuracy and robustness when the environment is noisy and workloads change. Furthermore, to demonstrate how the wide receptive field convolutional neural network model works, the reasons for its high performance are analyzed and the learned features are visualized. Finally, the model is verified on a vibration dataset collected under high background noise, and the results indicate that it has high diagnostic performance.


2020 ◽  
Vol 7 ◽  
Author(s):  
Hayden Gunraj ◽  
Linda Wang ◽  
Alexander Wong

The coronavirus disease 2019 (COVID-19) pandemic continues to have a tremendous impact on patients and healthcare systems around the world. In the fight against this novel disease, there is a pressing need for rapid and effective screening tools to identify patients infected with COVID-19, and to this end CT imaging has been proposed as one of the key screening methods which may be used as a complement to RT-PCR testing, particularly in situations where patients undergo routine CT scans for non-COVID-19-related reasons, where patients have worsening respiratory status or are developing complications that require expedited care, or where patients are suspected to be COVID-19-positive but have negative RT-PCR test results. Early studies on CT-based screening have reported abnormalities in chest CT images which are characteristic of COVID-19 infection, but these abnormalities may be difficult to distinguish from abnormalities caused by other lung conditions. Motivated by this, in this study we introduce COVIDNet-CT, a deep convolutional neural network architecture that is tailored for detection of COVID-19 cases from chest CT images via a machine-driven design exploration approach. Additionally, we introduce COVIDx-CT, a benchmark CT image dataset derived from CT imaging data collected by the China National Center for Bioinformation, comprising 104,009 images across 1,489 patient cases. Furthermore, in the interest of reliability and transparency, we leverage an explainability-driven performance validation strategy to investigate the decision-making behavior of COVIDNet-CT, and in doing so ensure that COVIDNet-CT makes predictions based on relevant indicators in CT images. Both COVIDNet-CT and the COVIDx-CT dataset are available to the general public in an open-source and open-access manner as part of the COVID-Net initiative.
While COVIDNet-CT is not yet a production-ready screening solution, we hope that releasing the model and dataset will encourage researchers, clinicians, and citizen data scientists alike to leverage and build upon them.
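The abstract does not detail the explainability-driven validation strategy itself; occlusion sensitivity is one generic way to check that a classifier's predictions rely on relevant image regions, sketched here with a hypothetical `score_fn` standing in for the model's COVID-19 class probability:

```python
import numpy as np

def occlusion_map(image, score_fn, patch=4):
    """Slide a zeroed patch over a 2-D image and record how much the
    classifier's score drops -- large drops mark regions the model relies
    on, which a radiologist can then check against known COVID-19
    indicators (e.g. ground-glass opacities).

    `score_fn` is a hypothetical stand-in for any model's class-probability
    output; it is not part of the COVID-Net codebase.
    """
    h, w = image.shape
    base = score_fn(image)
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = 0.0
            heat[i // patch, j // patch] = base - score_fn(occluded)
    return heat
```

If the heat map highlights lung regions rather than scan borders or annotation artifacts, that is evidence the model's predictions rest on clinically relevant features.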

