Comparison of Current Convolutional Neural Network Architectures for Classification of Damaged and Undamaged Cars

Author(s):  
Yavuz Ünal
Şaban Öztürk
Muhammed Nuri Dudak
Mehmet Ekici
2021


Sensors
2021
Vol. 21 (16)
pp. 5381
Author(s):  
Ananda Ananda
Kwun Ho Ngan
Cefa Karabağ
Aram Ter-Sarkisov
Eduardo Alonso
...  

This paper investigates the classification of radiographic images with eleven convolutional neural network (CNN) architectures (GoogleNet, VGG-19, AlexNet, SqueezeNet, ResNet-18, Inception-v3, ResNet-50, VGG-16, ResNet-101, DenseNet-201 and Inception-ResNet-v2). The CNNs were used to classify a series of wrist radiographs from the Stanford Musculoskeletal Radiographs (MURA) dataset into two classes (normal and abnormal). The architectures were compared across different hyper-parameters against accuracy and Cohen's kappa coefficient. The two best-performing architectures were then explored with data augmentation. Without augmentation, the best results were provided by Inception-ResNet-v2 (mean accuracy = 0.723, mean kappa = 0.506); these improved significantly with augmentation (mean accuracy = 0.857, mean kappa = 0.703). Finally, Class Activation Mapping was applied to interpret the network's activations against the location of an anomaly in the radiographs.
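
For readers who want to reproduce this kind of two-class transfer-learning experiment, the sketch below fine-tunes a classification head on a pretrained Inception-ResNet-v2 and reports accuracy and Cohen's kappa. It is a minimal illustration under assumptions, not the authors' pipeline: the `train/` and `val/` directories (each with `normal/` and `abnormal/` subfolders), the augmentation choices, the learning rate and the epoch count are all hypothetical, and TensorFlow/Keras plus scikit-learn stand in for whatever framework the paper actually used.

```python
import numpy as np
import tensorflow as tf
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Assumed (hypothetical) layout: train/{normal,abnormal} and val/{normal,abnormal}
train_ds = tf.keras.utils.image_dataset_from_directory(
    "train", label_mode="binary", image_size=(299, 299), batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "val", label_mode="binary", image_size=(299, 299), batch_size=32, shuffle=False)

# Simple augmentation in the spirit of the paper (exact transforms are assumptions)
augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.05),
])

# Pretrained Inception-ResNet-v2 backbone, frozen; only a new sigmoid head is trained
base = tf.keras.applications.InceptionResNetV2(
    include_top=False, weights="imagenet", pooling="avg")
base.trainable = False

inputs = tf.keras.Input(shape=(299, 299, 3))
x = augment(inputs)
x = tf.keras.applications.inception_resnet_v2.preprocess_input(x)
x = base(x, training=False)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=5)

# Accuracy and Cohen's kappa on the (unshuffled) validation split
y_true = np.concatenate([y.numpy() for _, y in val_ds]).ravel()
y_pred = (model.predict(val_ds).ravel() > 0.5).astype(int)
print("accuracy:", accuracy_score(y_true, y_pred))
print("kappa:   ", cohen_kappa_score(y_true, y_pred))
```

The paper's Class Activation Mapping step could be approximated by applying Grad-CAM to the backbone's last convolutional layer, but that is left out here to keep the sketch short.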


Author(s):  
Hannah Sofian
Joel Than Chia Ming
Suraya Muhammad
Norliza Mohd Noor

Cardiovascular disease is the leading cause of death among non-communicable diseases, and coronary artery calcification is one of its forms. The build-up of plaques and calcification on the inner wall of the coronary artery narrows the cross-sectional area of the blood vessel. The standard clinical practice is for radiologists to detect calcification in intravascular ultrasound (IVUS) images by visual inspection. Deep learning is a current image-processing approach with high potential for calcification detection using convolutional neural network architectures and classifiers. To classify IVUS images as showing the absence or presence of calcification, we compared three convolutional neural network architectures and seven classifiers using 10-fold cross-validation, with the ground truth provided by MICCAI 2011. We used two image representations: Cartesian-coordinate images and polar-reconstructed coordinate images. The Support Vector Machine, Discriminant Analysis, Ensemble and Error-Correcting Output Codes classifiers achieved a perfect score of one for the Area Under the Curve and for all performance measures: accuracy, sensitivity, specificity, positive predictive value and negative predictive value. The Area Under the Curve was 0.9967 for the Naïve Bayes classifier and 0.9994 for the Decision Tree classifier, obtained using the polar-reconstructed coordinate images with the Inception-ResNet-v2 architecture.
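
As a rough illustration of the classifier comparison described above, the sketch below runs several scikit-learn classifiers over CNN-derived feature vectors with 10-fold cross-validation and reports mean AUC and accuracy. The feature matrix is random placeholder data standing in for Inception-ResNet-v2 activations extracted from IVUS frames, and the classifier list only approximates the paper's (a random forest stands in for the generic "Ensembles" entry, and the Error-Correcting Output Codes classifier is omitted because it targets multi-class problems); this is not the authors' implementation.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

# Placeholder data: in the real setting X would hold CNN features extracted from
# IVUS frames (Cartesian or polar-reconstructed) and y would mark calcification.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 512))
y = rng.integers(0, 2, size=200)

classifiers = {
    "SVM (linear)": SVC(kernel="linear"),
    "Discriminant analysis": LinearDiscriminantAnalysis(),
    "Naive Bayes": GaussianNB(),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Ensemble (random forest)": RandomForestClassifier(n_estimators=100, random_state=0),
}

# 10-fold cross-validation, matching the k-fold = 10 setting reported in the abstract
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for name, clf in classifiers.items():
    auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
    acc = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
    print(f"{name:25s}  AUC={auc.mean():.4f}  accuracy={acc.mean():.4f}")
```

Swapping the placeholder arrays for features pooled from a pretrained backbone (and repeating the loop for the Cartesian and polar-reconstructed inputs separately) reproduces the general shape of the comparison reported above.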

