Automatic segmentation and measurements of gestational sac using static B-mode ultrasound images

Author(s): Dheyaa Ahmed Ibrahim, Hisham Al-Assam, Hongbo Du, Jessica Farren, Dhurgham Al-karawi, ...
Author(s): Chenghuan Yin, Yu Wang, Qixin Zhang, Fangfang Han, Zhengwei Yuan, ...

2021, pp. 016173462110425
Author(s): Jianing Xi, Jiangang Chen, Zhao Wang, Dean Ta, Bing Lu, ...

Large-scale early scanning of fetuses via ultrasound imaging is widely used to alleviate the morbidity and mortality caused by congenital anomalies of the fetal heart and lungs. To reduce the heavy cost of manually recognizing organ regions, many automatic segmentation methods have been proposed. However, existing methods still face the multi-scale problem arising from the wide range of receptive fields needed to cover organs in the images, the limited resolution of the segmentation masks, and interference from task-irrelevant features, all of which hinder accurate segmentation. To achieve semantic segmentation that (1) extracts multi-scale features from images, (2) compensates for high-resolution information, and (3) eliminates task-irrelevant features, we propose a multi-scale model that integrates a skip-connection framework with an attention mechanism. The multi-scale feature extraction modules are combined with additive attention gate units that suppress irrelevant features, within a U-Net framework whose skip connections compensate for the lost high-resolution information. The fetal heart and lung segmentation results indicate the superiority of our method over existing deep learning based approaches. Our method also shows competitive performance stability on semantic segmentation tasks, suggesting a promising contribution to ultrasound-based prognosis of congenital anomalies, early intervention, and the alleviation of their negative effects.
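A minimal PyTorch sketch of an additive attention gate applied to a U-Net skip connection, in the spirit of the abstract above, is given below. The class name, channel sizes, and wiring are illustrative assumptions, not the authors' implementation.

# Sketch of an additive attention gate on a U-Net skip connection
# (illustrative assumption; not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAttentionGate(nn.Module):
    """Suppresses task-irrelevant activations in a skip connection.

    The decoder's gating signal g and the encoder's skip feature x are
    projected to a common channel count, added, then passed through ReLU,
    a 1x1 conv, and a sigmoid to produce a per-pixel attention map that
    rescales x before it is concatenated in the decoder.
    """

    def __init__(self, g_channels: int, x_channels: int, inter_channels: int):
        super().__init__()
        self.project_g = nn.Conv2d(g_channels, inter_channels, kernel_size=1)
        self.project_x = nn.Conv2d(x_channels, inter_channels, kernel_size=1)
        self.attention = nn.Sequential(
            nn.ReLU(inplace=True),
            nn.Conv2d(inter_channels, 1, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, g: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # Bring the coarser gating signal to the spatial size of the skip feature.
        g_up = F.interpolate(self.project_g(g), size=x.shape[2:],
                             mode="bilinear", align_corners=False)
        alpha = self.attention(g_up + self.project_x(x))  # (N, 1, H, W) in [0, 1]
        return x * alpha                                   # attenuated skip feature

if __name__ == "__main__":
    skip = torch.randn(1, 64, 128, 128)   # encoder feature (skip connection)
    gate = torch.randn(1, 128, 64, 64)    # coarser decoder feature (gating signal)
    attended = AdditiveAttentionGate(128, 64, 32)(gate, skip)
    print(attended.shape)                  # torch.Size([1, 64, 128, 128])

Rescaling the skip feature by a per-pixel sigmoid map is what allows the decoder to damp task-irrelevant encoder activations while the skip connection still restores high-resolution detail.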


2018, Vol 13 (11), pp. 1707-1716
Author(s): M. Villa, G. Dardenne, M. Nasan, H. Letissier, C. Hamitouche, ...

2016, Vol 10 (1), pp. 18-27
Author(s): Matteo Aventaggiato, Maurizio Muratore, Paola Pisani, Aimè Lay-Ekuakille, Francesco Conversano, ...

Author(s): Jennifer Nitsch, Jan Klein, Dorothea Miller, Ulrich Sure, Horst K. Hahn

2018, Vol 7 (2.7), pp. 665
Author(s): Chelladurai R, Selvakumar R, S Poonguzhali

Breast cancer is one of the leading cancers affecting women around the world. Ultrasound imaging is now widely used to diagnose various cancers because it is non-ionizing, non-invasive, and inexpensive. Breast lesion regions in ultrasound images are classified according to the contour, shape, size, and textural features of the segmented region. Seed point selection is the first step in segmenting lesion regions, and if the seed point falls outside the lesion, the segmentation is wrong and the lesion is misclassified. To avoid this, the seed point is usually located manually. To remove this manual intervention, we propose a novel method that locates the seed point and segments the breast lesion region automatically. In this method, the image is processed with a tan function to distinguish breast lesions from normal tissue more effectively. A trained neural network then automatically places the seed point inside the lesion region, and from that seed the lesion region is grown and segmented automatically. Most past work on automatic lesion segmentation has concentrated on a single lesion region, whereas the proposed method can automatically segment multiple lesion regions in an image. The proposed method automatically detects and separates the lesion region in 90% to 97.5% of the images.
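As a rough illustration of the final step only, the sketch below grows a lesion mask from an already-placed seed point using simple intensity-based region growing in NumPy. The tolerance value, 4-connectivity, and function names are assumptions; the paper's tan-function preprocessing and neural-network seed placement are not reproduced here.

# Sketch of seed-based region growing (illustrative assumptions only).
import numpy as np
from collections import deque

def region_grow(image: np.ndarray, seed: tuple, tolerance: float = 0.1) -> np.ndarray:
    """Grow a binary region from `seed`, accepting neighbouring pixels whose
    intensity is within `tolerance` of the seed intensity. Returns a boolean mask."""
    mask = np.zeros(image.shape, dtype=bool)
    seed_value = float(image[seed])
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # 4-connected neighbours
            nr, nc = r + dr, c + dc
            if (0 <= nr < image.shape[0] and 0 <= nc < image.shape[1]
                    and not mask[nr, nc]
                    and abs(float(image[nr, nc]) - seed_value) <= tolerance):
                mask[nr, nc] = True
                queue.append((nr, nc))
    return mask

if __name__ == "__main__":
    # Toy "ultrasound" image: a dark lesion on a brighter background.
    img = np.full((64, 64), 0.8)
    img[20:40, 25:45] = 0.2
    lesion_mask = region_grow(img, seed=(30, 35), tolerance=0.15)
    print(lesion_mask.sum())  # number of pixels assigned to the lesion

Because the region is grown purely from the seed's intensity neighbourhood, placing the seed inside the lesion is critical, which is exactly why the abstract automates that step with a trained network.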


2014
Author(s): Shan Khazendar, Jessica Farren, Hisham Al-Assam, Ahmed Sayasneh, Hongbo Du, ...
