The Implementation of Deep Learning for White Blood Cell Subtype Classification from Microscopic Images

Author(s):  
Yustisia Amalia ◽  
Miftahul Khairoh ◽  
Arkha B. ◽  
Budi Santoso

2006 ◽  
Vol 53 (1) ◽  
pp. 133-139 ◽  
Author(s):  
Jeong A KIM ◽  
Youn Seon CHOI ◽  
Jeong Ik HONG ◽  
Su Hyun KIM ◽  
Hoe Hyun JUNG ◽  
...  

BME Frontiers ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
DongHun Ryu ◽  
Jinho Kim ◽  
Daejin Lim ◽  
Hyun-Seok Min ◽  
In Young Yoo ◽  
...  

Objective and Impact Statement. We propose a rapid and accurate blood cell identification method exploiting deep learning and label-free refractive index (RI) tomography. Our computational approach, which fully utilizes tomographic information of bone marrow (BM) white blood cells (WBCs), enables us not only to classify the blood cells with deep learning but also to quantitatively study their morphological and biochemical properties for hematology research. Introduction. Conventional methods for examining blood cells, such as blood smear analysis by medical professionals and fluorescence-activated cell sorting, require significant time, cost, and domain knowledge that can affect test results. While label-free imaging techniques that use a specimen's intrinsic contrast (e.g., multiphoton and Raman microscopy) have been used to characterize blood cells, their imaging procedures and instrumentation are relatively time-consuming and complex. Methods. The RI tomograms of the BM WBCs are acquired via a Mach-Zehnder interferometer-based tomographic microscope and classified by a 3D convolutional neural network. We test our deep learning classifier on four types of BM WBCs collected from healthy donors (n=10): monocytes, myelocytes, B lymphocytes, and T lymphocytes. The quantitative parameters of the WBCs are obtained directly from the tomograms. Results. Our results show >99% accuracy for the binary classification of myeloids and lymphoids and >96% accuracy for the four-type classification of B lymphocytes, T lymphocytes, monocytes, and myelocytes. The feature learning capability of our approach is visualized via an unsupervised dimension reduction technique. Conclusion. We envision that the proposed cell classification framework can be easily integrated into existing blood cell investigation workflows, providing cost-effective and rapid diagnosis for hematologic malignancy.
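As a rough illustration of the classification stage described in this abstract (not the authors' published network), the sketch below shows how a small 3D convolutional neural network in PyTorch could map a single-channel RI tomogram volume to one of the four WBC classes. The input volume size, layer widths, and layer count are assumptions made for the example.

```python
# Minimal sketch, assuming a 64x64x64 single-channel RI volume per cell.
# Architecture details are illustrative, not the network from the paper.
import torch
import torch.nn as nn

class WBC3DCNN(nn.Module):
    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),   # single-channel RI tomogram
            nn.ReLU(inplace=True),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(2),
            nn.Conv3d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),                       # global pooling -> (N, 64, 1, 1, 1)
        )
        # Four classes: monocyte, myelocyte, B lymphocyte, T lymphocyte.
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

# Forward pass on a dummy batch of two tomograms.
model = WBC3DCNN()
logits = model(torch.randn(2, 1, 64, 64, 64))
print(logits.shape)  # torch.Size([2, 4])
```

In practice, the class predicted for each tomogram would be the argmax of the logits, and the same volumes could be reused to compute the morphological and biochemical parameters mentioned in the abstract.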


2019 ◽  
Vol 25 (5) ◽  
pp. 63-68 ◽  
Author(s):  
Mesut Togacar ◽  
Burhan Ergen ◽  
Mehmet Emre Sertkaya

White blood cells (leucocytes), produced in the bone marrow and lymphoid tissue, are an important part of the immune system that protects the body against foreign invaders and infectious disease. These colorless cells live for a few days to several weeks. Considerable clinical experience is required for a doctor to count the white blood cells in human blood and classify them. Automating this step supports early and accurate diagnosis of diseases that affect the immune system, such as anemia and leukemia, when a patient is being evaluated. White blood cells can be separated into four subclasses: eosinophils, lymphocytes, monocytes, and neutrophils. This study focuses on classifying white blood cell images with convolutional neural network models, a type of deep learning model. A deep network, which is slow in the training step because of its complex architecture but fast in the test step, is used for feature extraction instead of more intricate methods. For the subclass separation of white blood cells, the experimental results show that the AlexNet architecture gives the highest correct recognition rate among the convolutional neural network architectures tested in the study. Various classifiers are applied to the features derived from the AlexNet architecture to evaluate the classification performance. The best performance in the classification of white blood cells is given by the quadratic discriminant analysis classifier, with an accuracy of 97.78%.
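A minimal sketch of the pipeline this abstract describes (a pretrained CNN as a fixed feature extractor followed by a classical classifier) is given below. It uses torchvision's AlexNet and scikit-learn's quadratic discriminant analysis; the image paths, labels, and preprocessing settings are placeholder assumptions, not the authors' exact configuration.

```python
# Sketch: AlexNet features + quadratic discriminant analysis for WBC subtypes.
import numpy as np
import torch
from torchvision import models, transforms
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from PIL import Image

# Pretrained AlexNet with the final classification layer removed, so the
# forward pass returns 4096-dimensional features instead of class scores.
alexnet = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
alexnet.classifier = torch.nn.Sequential(*list(alexnet.classifier.children())[:-1])
alexnet.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def extract_features(paths):
    """Return an (N, 4096) feature matrix for a list of image file paths."""
    feats = []
    with torch.no_grad():
        for p in paths:
            img = preprocess(Image.open(p).convert("RGB")).unsqueeze(0)
            feats.append(alexnet(img).squeeze(0).numpy())
    return np.stack(feats)

# train_paths, train_labels, test_paths, test_labels are hypothetical inputs
# (lists of blood-smear crop paths and their subclass labels).
# X_train = extract_features(train_paths)
# qda = QuadraticDiscriminantAnalysis().fit(X_train, train_labels)
# accuracy = qda.score(extract_features(test_paths), test_labels)
```

Swapping QuadraticDiscriminantAnalysis for another scikit-learn classifier is a one-line change, which is how the comparison of "various classifiers" on the same AlexNet features could be carried out.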

