SHAPR Predicts 3D Cell Shapes From 2D Microscopic Images

2022 ◽  
Author(s):  
Dominik Jens Elias Waibel ◽  
Niklas Kiermeyer ◽  
Scott Atwell ◽  
Ario Sadafi ◽  
Matthias Meier ◽  
...  

Reconstruction of the shapes, forms, and sizes of three-dimensional (3D) objects from two-dimensional (2D) information is one of the most complex functions of the human brain. It also poses an algorithmic challenge and is at present a widely studied subject in computer vision. Here we focus on the single-cell level and present SHAPR, a neural-network-based SHApe PRediction autoencoder that accurately reconstructs 3D cellular and nuclear shapes from 2D microscopic images and may have great potential for application in the biomedical sciences.
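A minimal sketch of what such a 2D-to-3D shape-prediction autoencoder could look like, assuming a simple convolutional encoder over a single-channel 2D crop and a transposed-convolution 3D decoder that outputs a voxel occupancy volume; the architecture, layer sizes, and names are illustrative assumptions, not the published SHAPR model.

```python
# Illustrative 2D-to-3D shape-prediction autoencoder (assumed architecture, not SHAPR itself).
import torch
import torch.nn as nn

class Shape2Dto3D(nn.Module):
    def __init__(self, in_channels: int = 1, latent_dim: int = 256):
        super().__init__()
        # 2D encoder: 64x64 single-channel image -> latent vector
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, stride=2, padding=1), nn.ReLU(),  # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),           # 32 -> 16
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),           # 16 -> 8
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, latent_dim), nn.ReLU(),
        )
        # 3D decoder: latent vector -> 32x32x32 occupancy volume
        self.fc = nn.Linear(latent_dim, 64 * 4 * 4 * 4)
        self.decoder = nn.Sequential(
            nn.ConvTranspose3d(64, 32, 4, stride=2, padding=1), nn.ReLU(),  # 4 -> 8
            nn.ConvTranspose3d(32, 16, 4, stride=2, padding=1), nn.ReLU(),  # 8 -> 16
            nn.ConvTranspose3d(16, 1, 4, stride=2, padding=1),              # 16 -> 32
        )

    def forward(self, x):
        z = self.encoder(x)
        v = self.fc(z).view(-1, 64, 4, 4, 4)
        return torch.sigmoid(self.decoder(v))   # voxel occupancy probabilities

# Toy forward pass: a batch of hypothetical 2D microscopy crops -> predicted 3D volumes
model = Shape2Dto3D()
img = torch.rand(2, 1, 64, 64)
vol = model(img)                                 # shape: (2, 1, 32, 32, 32)
```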


2019 ◽  
Author(s):  
Dhananjay Bhaskar ◽  
Darrick Lee ◽  
Hildur Knútsdóttir ◽  
Cindy Tan ◽  
MoHan Zhang ◽  
...  

Abstract: Cell morphology is an important indicator of cell state, function, stage of development, and fate in both normal and pathological conditions. Cell shape is among the key indicators used by pathologists to identify abnormalities or malignancies. With rapid advancements in the speed and volume of biological data acquisition, including images and movies of cells, computer-assisted identification and analysis of images become essential. Here, we report on techniques for the recognition of cells in microscopic images and automated cell shape classification. We illustrate how our unsupervised machine-learning-based approach can be used to classify distinct cell shapes from a large number of microscopic images.

Technical Abstract: We develop a methodology to segment cells from microscopy images and compute quantitative descriptors that characterize their morphology. Using unsupervised techniques for dimensionality reduction and density-based clustering, we perform label-free cell shape classification. Cells are identified with minimal user input using mathematical morphology and region-growing segmentation methods. Physical quantities describing cell shape and size (including area, perimeter, Feret diameters, etc.) are computed along with other features, including shape factors and Hu's image moments. Correlated features are combined to obtain a low-dimensional (2D or 3D) embedding of the data points corresponding to individual segmented cell shapes. Finally, a hierarchical density-based clustering algorithm (HDBSCAN) is used to classify cells. We compare cell classification results obtained from different combinations of features to identify a feature set that delivers optimum classification performance for our test data, consisting of phase-contrast microscopy images of a pancreatic-cancer cell line, MIA PaCa-2.
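A hedged sketch of the pipeline the technical abstract describes: compute per-cell shape descriptors (area, perimeter, Hu moments, and so on), reduce them to a low-dimensional embedding, and cluster with HDBSCAN. The Otsu segmentation, the feature subset, and the synthetic input below stand in for the authors' mathematical-morphology and region-growing steps and are assumptions only.

```python
# Label-free cell shape classification sketch: features -> embedding -> HDBSCAN.
import numpy as np
from skimage import measure, filters
from sklearn.decomposition import PCA
import hdbscan

def shape_features(binary_mask):
    """Per-cell morphological descriptors from a binary segmentation."""
    labels = measure.label(binary_mask)
    feats = []
    for region in measure.regionprops(labels):
        feats.append([
            region.area,
            region.perimeter,
            region.eccentricity,
            region.solidity,
            *region.moments_hu,            # 7 Hu image moments
        ])
    return np.asarray(feats)

# Hypothetical phase-contrast image, segmented here by simple Otsu thresholding
# (the paper uses mathematical morphology and region growing instead).
image = np.random.rand(512, 512)
mask = image > filters.threshold_otsu(image)

X = shape_features(mask)
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)    # standardise features

embedding = PCA(n_components=2).fit_transform(X)      # low-dimensional (2D) embedding
clusterer = hdbscan.HDBSCAN(min_cluster_size=5)
cell_classes = clusterer.fit_predict(embedding)        # -1 marks noise / outliers
```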


Author(s):  
Etienne de Harven

Biological ultrastructures have been extensively studied with the scanning electron microscope (SEM) for the past 12 years, mainly because this instrument offers accurate and reproducible high-resolution images of cell shapes, provided the cells are dried in ways that spare them the damage that air drying would cause. This can be achieved by several techniques, among which the critical point drying technique of T. Anderson has been, by far, the most reproducibly successful. Many biologists, however, have been interpreting SEM micrographs in terms of an exclusive secondary electron imaging (SEI) process in which the resolution is primarily limited by the spot size of the primary incident beam. In fact, this is not the case, since it appears that high resolution, even on uncoated samples, is probably compromised by the emission of secondary electrons of much more complex origin. When an incident primary electron beam interacts with the surface of most biological samples, a large percentage of the electrons penetrate below the surface of the exposed cells.


2021 ◽  
pp. 004051752110191
Author(s):  
Beti Rogina-Car ◽  
Stana Kovačević

The aim of this study was to investigate the damage to cotton fabrics (ticking and damask) caused by stitching with three needle point shapes (R, SES and SUK) and four needle sizes (70, 80, 90 and 100 Nm). Damage to the yarn and the surface area of the hole were investigated. Based on the results, it can be concluded that two types of damage occur during sewing: either the needle passes through the warp/weft and displaces the yarn, or the needle damages the warp/weft. The surface areas of the holes, obtained by a computer program from microscopic images, were analysed and compared. The results show greater damage to the yarn at the needle piercing point in the ticking, due to its higher density, friction and low yarn migration. The largest hole surface areas were produced when sewing with SUK-designated needles on both ticking and damask. When sewing damask, R-designated needles cause the least damage at the piercing point, whereas SES-designated needles give the best results when sewing ticking. Thread damage was further confirmed by testing the tensile properties of the yarn at the needle piercing points.
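As a rough illustration of how a hole's surface area could be measured from a microscopic image by a computer program, the sketch below thresholds a piercing-point image and converts the pixel area of the largest bright region to square millimetres; the file name, calibration factor, and thresholding choice are assumptions, not the authors' software.

```python
# Hole surface area from a microscopic image (illustrative measurement only).
import cv2

PIXELS_PER_MM = 200.0   # hypothetical calibration, e.g. from a stage micrometer

img = cv2.imread("piercing_point.png", cv2.IMREAD_GRAYSCALE)   # hypothetical image file

# Assume transmitted light: the hole transmits more light and appears bright,
# so a global Otsu threshold separates it from the fabric.
_, hole_mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Take the largest bright region as the hole and convert its area to mm^2.
contours, _ = cv2.findContours(hole_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
largest = max(contours, key=cv2.contourArea)
area_mm2 = cv2.contourArea(largest) / PIXELS_PER_MM ** 2
print(f"hole surface area: {area_mm2:.4f} mm^2")
```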


2008 ◽  
Vol 35 (3) ◽  
pp. 728-738 ◽  
Author(s):  
Esin Dogantekin ◽  
Mustafa Yilmaz ◽  
Akif Dogantekin ◽  
Engin Avci ◽  
Abdulkadir Sengur

Author(s):  
Rozita Rastghalam ◽  
Habibollah Danyali ◽  
Mohammad Sadegh Helfroush ◽  
M. Emre Celebi ◽  
Mojgan Mokhtari

PLoS ONE ◽  
2020 ◽  
Vol 15 (6) ◽  
pp. e0234806 ◽  
Author(s):  
Bartosz Zieliński ◽  
Agnieszka Sroka-Oleksiak ◽  
Dawid Rymarczyk ◽  
Adam Piekarczyk ◽  
Monika Brzychczy-Włoch

2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Fetulhak Abdurahman ◽  
Kinde Anlay Fante ◽  
Mohammed Aliy

Abstract

Background: Manual microscopic examination of Leishman/Giemsa-stained thin and thick blood smears is still the "gold standard" for malaria diagnosis. One of the drawbacks of this method is that its accuracy, consistency, and diagnostic speed depend on the microscopist's diagnostic and technical skills. Highly skilled microscopists are difficult to find in remote areas of developing countries. To alleviate this problem, in this paper we investigate state-of-the-art one-stage and two-stage object detection algorithms for automated malaria parasite screening from microscopic images of thick blood slides.

Results: YOLOV3 and YOLOV4, which are state-of-the-art object detectors in accuracy and speed, are not optimized for detecting small objects such as malaria parasites in microscopic images. We modify these models by increasing the feature scale and adding more detection layers to enhance their ability to detect small objects without notably decreasing detection speed. We propose one modified YOLOV4 model, called YOLOV4-MOD, and two modified YOLOV3 models, called YOLOV3-MOD1 and YOLOV3-MOD2. In addition, new anchor box sizes are generated with the K-means clustering algorithm to exploit the potential of these models for small object detection. The performance of the modified YOLOV3 and YOLOV4 models was evaluated on a publicly available malaria dataset. These models achieve state-of-the-art accuracy, exceeding the performance of their original versions, Faster R-CNN, and SSD in terms of mean average precision (mAP), recall, precision, F1 score, and average IoU. YOLOV4-MOD achieves the best detection accuracy among all models, with a mAP of 96.32%; YOLOV3-MOD2 and YOLOV3-MOD1 achieve mAPs of 96.14% and 95.46%, respectively.

Conclusions: The experimental results of this study demonstrate that the modified YOLOV3 and YOLOV4 models are highly promising for detecting malaria parasites in images captured by a smartphone camera over the microscope eyepiece. The proposed system is suitable for deployment in low-resource settings.
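A hedged sketch of the anchor-box generation step mentioned above: k-means over the training-set bounding-box widths and heights using a 1 - IoU distance, the standard trick from the YOLO papers. The box statistics and k = 9 below are illustrative assumptions, not the study's dataset or settings.

```python
# K-means anchor-box generation with an IoU-based distance (illustrative data).
import numpy as np

def iou_wh(boxes, centroids):
    """IoU between (w, h) pairs, assuming boxes share the same centre."""
    w = np.minimum(boxes[:, None, 0], centroids[None, :, 0])
    h = np.minimum(boxes[:, None, 1], centroids[None, :, 1])
    inter = w * h
    union = (boxes[:, 0] * boxes[:, 1])[:, None] \
        + (centroids[:, 0] * centroids[:, 1])[None, :] - inter
    return inter / union

def kmeans_anchors(boxes, k=9, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centroids = boxes[rng.choice(len(boxes), k, replace=False)]
    for _ in range(iters):
        assign = np.argmax(iou_wh(boxes, centroids), axis=1)   # distance = 1 - IoU
        new = np.array([boxes[assign == i].mean(axis=0) if np.any(assign == i)
                        else centroids[i] for i in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids[np.argsort(centroids.prod(axis=1))]       # sort anchors by area

# Hypothetical parasite bounding-box sizes (width, height) in pixels
wh = np.abs(np.random.default_rng(1).normal(loc=25, scale=8, size=(500, 2)))
anchors = kmeans_anchors(wh, k=9)
print(np.round(anchors, 1))
```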


Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 3068
Author(s):  
Soumaya Dghim ◽  
Carlos M. Travieso-González ◽  
Radim Burget

Image processing tools, machine learning, and deep learning approaches have become very useful and robust in recent years. This paper addresses the detection of Nosema disease, which is considered one of the most economically significant diseases today. This work presents a solution for recognizing and identifying Nosema cells among the other objects present in a microscopic image. Two main strategies are examined. The first strategy uses image processing tools to extract the most valuable information and features from the dataset of microscopic images; machine learning methods such as an artificial neural network (ANN) and a support vector machine (SVM) are then applied to detect and classify the Nosema disease cells. The second strategy explores deep learning and transfer learning. Several approaches were examined, including a convolutional neural network (CNN) classifier and several transfer-learning methods (AlexNet, VGG-16 and VGG-19), which were fine-tuned and applied to the object sub-images to distinguish Nosema images from the other object images. The best accuracy, 96.25%, was achieved by the pre-trained VGG-16 network.
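A minimal sketch of the transfer-learning strategy described above, assuming a torchvision VGG-16 backbone pre-trained on ImageNet with its final classifier layer replaced for the two-class Nosema-vs-other problem; the layer freezing, optimiser, and dummy batch are assumptions, not the authors' exact fine-tuning recipe.

```python
# Transfer learning with a pre-trained VGG-16 backbone (illustrative setup).
import torch
import torch.nn as nn
from torchvision import models

model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)

# Freeze the convolutional feature extractor; fine-tune only the classifier head.
for p in model.features.parameters():
    p.requires_grad = False

model.classifier[6] = nn.Linear(4096, 2)    # Nosema cell vs. other object

optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 224x224 RGB sub-images
x = torch.rand(8, 3, 224, 224)
y = torch.randint(0, 2, (8,))
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```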


2021 ◽  
Vol 9 (5) ◽  
pp. 916
Author(s):  
Huan Zhang ◽  
Srutha Venkatesan ◽  
Beiyan Nan

A fundamental question in biology is how cell shapes are genetically encoded and enzymatically generated. Prevalent shapes among walled bacteria include spheres and rods. These shapes are chiefly determined by the peptidoglycan (PG) cell wall. Bacterial division results in two daughter cells, whose shapes are predetermined by the mother. This makes it difficult to explore the origin of cell shapes in healthy bacteria. In this review, we argue that the Gram-negative bacterium Myxococcus xanthus is an ideal model for understanding PG assembly and bacterial morphogenesis, because it forms rods and spheres at different life stages. Rod-shaped vegetative cells of M. xanthus can thoroughly degrade their PG and form spherical spores. As these spores germinate, cells rebuild their PG and reestablish rod shape without preexisting templates. Such a unique sphere-to-rod transition provides a rare opportunity to visualize de novo PG assembly and rod-like morphogenesis in a well-established model organism.

