Detecting brain tumors using deep learning convolutional neural network with transfer learning approach

Author(s):  
Sadia Anjum ◽  
Lal Hussain ◽  
Mushtaq Ali ◽  
Monagi H. Alkinani ◽  
Wajid Aziz ◽  
...  
2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Young-Gon Kim ◽  
Sungchul Kim ◽  
Cristina Eunbee Cho ◽  
In Hye Song ◽  
Hee Jin Lee ◽  
...  

Abstract: Fast and accurate confirmation of metastasis on the frozen tissue section of intraoperative sentinel lymph node biopsy is an essential tool for critical surgical decisions. However, accurate diagnosis by pathologists is difficult within the time limitations. Training a robust and accurate deep learning model is also difficult owing to the limited number of frozen datasets with high-quality labels. To overcome these issues, we validated the effectiveness of transfer learning from CAMELYON16 to improve the performance of a convolutional neural network (CNN)-based classification model on our frozen dataset (N = 297) from Asan Medical Center (AMC). Among the 297 whole slide images (WSIs), 157 and 40 WSIs were used to train deep learning models with different dataset ratios of 2, 4, 8, 20, 40, and 100%. The remaining 100 WSIs were used to validate model performance in terms of patch- and slide-level classification. An additional 228 WSIs from Seoul National University Bundang Hospital (SNUBH) were used as an external validation set. Three initial weights, i.e., scratch-based (random initialization), ImageNet-based, and CAMELYON16-based models, were used to validate their effectiveness in external validation. In the patch-level classification results on the AMC dataset, CAMELYON16-based models trained with a small dataset (up to 40%, i.e., 62 WSIs) showed a significantly higher area under the curve (AUC) of 0.929 than the scratch- and ImageNet-based models at 0.897 and 0.919, respectively, while CAMELYON16-based and ImageNet-based models trained with 100% of the training dataset showed comparable AUCs of 0.944 and 0.943, respectively. In the external validation, CAMELYON16-based models showed higher AUCs than the scratch- and ImageNet-based models. The feasibility of transfer learning to enhance model performance was thus validated for frozen-section datasets with a limited number of slides.
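As a hedged illustration of the three initialization strategies compared above (scratch, ImageNet, and CAMELYON16 pretraining), the following PyTorch sketch sets up a patch-level binary classifier with each initial weight; the ResNet-50 backbone and the CAMELYON16 checkpoint file name are assumptions for illustration, not details taken from the paper.

import torch
import torch.nn as nn
from torchvision import models

def build_model(init: str = "scratch") -> nn.Module:
    # Binary patch classifier (metastasis vs. normal) with one of three initial weights.
    if init == "imagenet":
        net = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
    else:
        net = models.resnet50(weights=None)  # scratch: random initialization
    net.fc = nn.Linear(net.fc.in_features, 2)
    if init == "camelyon16":
        # Hypothetical checkpoint previously fine-tuned on CAMELYON16 patches (assumed file name).
        state = torch.load("camelyon16_pretrained.pth", map_location="cpu")
        net.load_state_dict(state, strict=False)
    return net

model = build_model("camelyon16")
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()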


2021 ◽  
pp. 20201263
Author(s):  
Mohammad Salehi ◽  
Reza Mohammadi ◽  
Hamed Ghaffari ◽  
Nahid Sadighi ◽  
Reza Reiazi

Objective: Pneumonia is a lung infection that causes inflammation of the small air sacs (alveoli) in one or both lungs. Proper and fast diagnosis of pneumonia at an early stage is imperative for optimal patient care. Currently, chest X-ray is considered the best imaging modality for diagnosing pneumonia; however, the interpretation of chest X-ray images is challenging. To this end, we aimed to use an automated convolutional neural network-based transfer-learning approach to detect pneumonia in paediatric chest radiographs. Methods: An automated convolutional neural network-based transfer-learning approach using four different pre-trained models (VGG19, DenseNet121, Xception, and ResNet50) was applied to detect pneumonia in chest X-ray images of children aged 1–5 years. The performance of the proposed models on the test dataset was evaluated using five metrics: accuracy, sensitivity/recall, precision, area under the curve (AUC), and F1 score. Results: All proposed models achieved an accuracy greater than 83.0% for binary classification. The pre-trained DenseNet121 model provided the highest classification performance, with 86.8% accuracy, followed by the Xception model at 86.0%. The sensitivity of the proposed models was greater than 91.0%. The Xception and DenseNet121 models achieved the highest classification performance, with F1 scores greater than 89.0%. The areas under the receiver operating characteristic curves of the VGG19, Xception, ResNet50, and DenseNet121 models were 0.78, 0.81, 0.81, and 0.86, respectively. Conclusion: Our data show that the proposed models achieve high accuracy for binary classification. Transfer learning was used to accelerate training of the proposed models and to address the problem of insufficient data. We hope that these models can help radiologists make a quick diagnosis of pneumonia in radiology departments. Moreover, they may be useful for detecting other chest diseases such as the novel coronavirus disease 2019. Advances in knowledge: We used transfer learning as a machine learning approach to accelerate training of the proposed models and to address the problem of insufficient data. Our proposed models achieved accuracy greater than 83.0% for binary classification.
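A minimal tf.keras sketch of the transfer-learning setup described above, using DenseNet121 (one of the four compared backbones) as a frozen ImageNet feature extractor with a small binary classification head; the input size, head layout, and optimizer settings are assumptions, not values reported in the paper.

import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import DenseNet121

# Frozen ImageNet backbone: reuse pretrained features without updating them.
base = DenseNet121(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),  # normal vs. pneumonia
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-4),
    loss="binary_crossentropy",
    metrics=["accuracy",
             tf.keras.metrics.AUC(name="auc"),
             tf.keras.metrics.Recall(name="sensitivity"),
             tf.keras.metrics.Precision(name="precision")],
)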


2019 ◽  
Vol 34 (11) ◽  
pp. 4924-4931 ◽  
Author(s):  
Daichi Kitaguchi ◽  
Nobuyoshi Takeshita ◽  
Hiroki Matsuzaki ◽  
Hiroaki Takano ◽  
Yohei Owada ◽  
...  

2018 ◽  
Vol 132 ◽  
pp. 679-688 ◽  
Author(s):  
Sakshi Indolia ◽  
Anil Kumar Goswami ◽  
S.P. Mishra ◽  
Pooja Asopa

2021 ◽  
Author(s):  
Ghassan Mohammed Halawani

The main purpose of this project is to modify a convolutional neural network for image classification based on a deep-learning framework. A transfer learning technique is applied through the MATLAB interface to AlexNet: the parameters of the last two fully connected layers of AlexNet are retrained on a new dataset to classify thousands of images. First, the general architecture common to most neural networks and its benefits are presented, and the mathematical models and the role of each part of the network are explained in detail. Second, different neural networks are studied in terms of architecture, application, and working method to highlight the strengths and weaknesses of each network. The final part conducts a detailed study of one of the most powerful deep-learning networks in image classification, the convolutional neural network, and how it can be modified to suit different classification tasks by using the transfer learning technique in MATLAB.
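The project above uses MATLAB's interface to AlexNet; as an illustrative sketch only (not the project's actual code), the same idea of replacing and retraining the last two fully connected layers of a pretrained AlexNet looks like this in PyTorch, with the class count chosen arbitrarily.

import torch.nn as nn
from torchvision import models

num_classes = 10  # assumed size of the new dataset's label set

# Load ImageNet-pretrained AlexNet and freeze its pretrained layers.
alexnet = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
for p in alexnet.parameters():
    p.requires_grad = False

# Replace the last two fully connected layers (classifier[4] and classifier[6])
# so that only their weights are re-learned for the new classification task.
alexnet.classifier[4] = nn.Linear(4096, 4096)
alexnet.classifier[6] = nn.Linear(4096, num_classes)

trainable_params = [p for p in alexnet.parameters() if p.requires_grad]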


2021 ◽  
Vol 290 ◽  
pp. 02020
Author(s):  
Boyu Zhang ◽  
Xiao Wang ◽  
Shudong Li ◽  
Jinghua Yang

Current underwater shipwreck side-scan sonar samples are few and difficult to label, and with such small sample sizes the image recognition accuracy of a convolutional neural network model is low. In this study, we propose an image recognition method for shipwreck side-scan sonar that combines transfer learning with deep learning. Without transfer learning, shipwreck sonar samples alone were used to train the network, and the results were saved as the control group. For transfer learning, weakly correlated data were first used to train the network, the network parameters were transferred to a new network, and the shipwreck sonar data were then used for training; these steps were repeated using strongly correlated data. Experiments were carried out on the LeNet-5, AlexNet, GoogLeNet, ResNet, and VGG networks. Without transfer learning, the highest accuracy was obtained on the ResNet network (86.27%). Using weakly correlated data for transfer training, the highest accuracy was on the VGG network (92.16%); using strongly correlated data, the highest accuracy was also on the VGG network (98.04%). In all architectures, transfer learning improved the correct recognition rate of the convolutional neural network models. The experiments show that transfer learning combined with deep learning improves the accuracy and generalization of a convolutional neural network when sample sizes are small.
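A hedged PyTorch sketch of the transfer procedure described above: pretrain a network on a correlated source dataset, copy its parameters into a fresh network, and fine-tune on the small sonar set. The VGG16 backbone, class counts, and optimizer settings are illustrative assumptions, not the study's reported configuration.

import torch
import torch.nn as nn
from torchvision import models

def make_vgg(num_classes: int) -> nn.Module:
    net = models.vgg16(weights=None)
    net.classifier[6] = nn.Linear(4096, num_classes)  # replace the final FC layer
    return net

# Step 1: train on the (weakly or strongly) correlated source dataset.
source_net = make_vgg(num_classes=5)
# ... training loop on the correlated source data goes here ...

# Step 2: transfer the learned parameters to a new network.
target_net = make_vgg(num_classes=2)  # e.g., shipwreck vs. background
state = source_net.state_dict()
state.pop("classifier.6.weight")      # class counts differ, so drop the source head
state.pop("classifier.6.bias")
target_net.load_state_dict(state, strict=False)

# Step 3: fine-tune target_net on the small shipwreck sonar dataset.
optimizer = torch.optim.SGD(target_net.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()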

