Deep learning based pectoral muscle segmentation on Mammographic Image Analysis Society (MIAS) mammograms

Author(s):  
Young Jae Kim ◽  
Eun Young Yoo ◽  
Kwang Gi Kim

2022 ◽  
Vol 15 (1) ◽  
pp. 1-14
Author(s):  
Divyashree B. V. ◽  
Amarnath R. ◽  
Naveen M. ◽  
Hemantha Kumar G.

In this paper, pectoral muscle segmentation was performed to study the presence of malignancy in the pectoral muscle region of mammograms. A combined approach involving granular computing and layering was employed to locate the pectoral muscle. Because the pectoral muscle is triangular in shape in most cases, the ant colony optimization algorithm was employed to accurately estimate its boundary. The proposed method works on the left mediolateral oblique (MLO) view of mammograms to avoid artifacts; for the right MLO view, the method automatically mirrors the image into the left MLO orientation. Performance was evaluated on the standard mini-MIAS (Mammographic Image Analysis Society) dataset. The algorithm was tested on 322 images, and the overall accuracy of the system was about 97.47%. The method is robust with respect to view, shape, and size, reduces processing time, and correctly handles images in which the pectoral muscle is completely absent.
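The right-to-left mirroring step described above is, in essence, a horizontal flip of the image array. A minimal sketch (not the authors' implementation; the function name and toy array are illustrative):

```python
import numpy as np

def mirror_to_left_mlo(image: np.ndarray) -> np.ndarray:
    """Flip a right-MLO mammogram horizontally so the chest wall
    (and hence the pectoral muscle) sits on the left edge, matching
    the left-MLO orientation assumed by the rest of the pipeline."""
    return np.fliplr(image)

# Toy 2x3 "image": the bright column (chest wall, value 9) is on the right.
img = np.array([[0, 1, 9],
                [0, 2, 9]])
mirrored = mirror_to_left_mlo(img)
# After mirroring, the bright column sits on the left edge.
```

In practice, whether a mammogram is a left or right MLO view would first have to be detected (e.g., by comparing intensity mass on each side), a step omitted here.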


Axioms ◽  
2021 ◽  
Vol 10 (3) ◽  
pp. 180
Author(s):  
Yoshio Rubio ◽  
Oscar Montiel

Breast segmentation plays a vital role in the automatic analysis of mammograms. Accurate segmentation of the breast region increases the probability of a correct diagnosis and reduces computational cost. Traditionally, model-based approaches dominated the landscape for breast segmentation, but recent studies seem to benefit from using robust deep learning models for this task. In this work, we present an extensive evaluation of deep learning architectures for semantic segmentation of mammograms, including segmentation metrics, memory requirements, and average inference time. We used several combinations of two-stage segmentation architectures composed of a feature extraction net (VGG16 and ResNet50) and a segmentation net (FCN-8, U-Net, and PSPNet). The training examples were taken from the mini Mammographic Image Analysis Society (mini-MIAS) database. Experimental results on the mini-MIAS database show that the best net scored a Dice similarity coefficient of 99.37% for breast boundary segmentation and 95.45% for pectoral muscle segmentation.
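The Dice similarity coefficient used to score these nets can be computed from two binary masks as 2*|A&B| / (|A| + |B|). A minimal NumPy sketch (illustrative only, not the paper's evaluation code):

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks:
    2 * |A & B| / (|A| + |B|). Returns 1.0 when both masks are empty."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    return 2.0 * intersection / total if total else 1.0

# Toy masks: prediction has 3 foreground pixels, ground truth has 2,
# and they overlap in 2 pixels -> Dice = 2*2 / (3+2) = 0.8.
a = np.array([[1, 1, 0], [0, 1, 0]])
b = np.array([[1, 1, 0], [0, 0, 0]])
score = dice_coefficient(a, b)  # 0.8
```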


Author(s):  
Alejandro Rodriguez-Ruiz ◽  
Jonas Teuwen ◽  
Kaman Chung ◽  
Nico Karssemeijer ◽  
Margarita Chevalier ◽  
...  

2011 ◽  
Vol 121-126 ◽  
pp. 4537-4541
Author(s):  
Chen Chung Liu ◽  
Shyr Shen Yu ◽  
Chung Yen Tsai ◽  
Ta Shan Tsui

The appearance of the pectoral muscle in mediolateral oblique (MLO) views of mammograms can increase false positives in computer-aided detection (CAD) of breast cancer. The pectoral muscle therefore has to be identified and segmented from the breast region in a mammogram before further analysis. The main goal of this paper is to propose an accurate and efficient algorithm for pectoral muscle extraction in MLO mammograms. The proposed algorithm exploits the positional characteristics of the pectoral muscle within the breast region, combining an iterative Otsu thresholding scheme with mathematical morphological processing to find a rough border of the pectoral muscle. Multiple regression analysis is then applied to the rough border to obtain an accurate segmentation of the pectoral muscle. The presented algorithm is tested on digital mammograms from the Mammographic Image Analysis Society (MIAS) database. Experimental results show that the pectoral muscle extracted by the presented algorithm approximately follows that extracted by an expert radiologist.
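The Otsu thresholding step named above can be illustrated with a plain histogram-based implementation that picks the intensity cut maximizing between-class variance. This is a single-pass sketch for intuition, not the authors' iterative scheme:

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the intensity threshold that maximizes between-class
    variance over an 8-bit grayscale image (Otsu's method)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()  # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * prob[:t]).sum() / w0   # background mean
        mu1 = (levels[t:] * prob[t:]).sum() / w1   # foreground mean
        between = w0 * w1 * (mu0 - mu1) ** 2       # between-class variance
        if between > best_var:
            best_var, best_t = between, t
    return best_t

# Toy bimodal "image": dark background (~10) vs. bright muscle (~200).
img = np.array([10, 12, 11, 200, 198, 201], dtype=np.uint8).reshape(2, 3)
t = otsu_threshold(img)
mask = img >= t  # rough foreground (pectoral muscle) mask
```

An iterative variant, as used in the paper, would re-apply thresholding within the foreground region to progressively refine the muscle boundary before the morphology and regression steps.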


Author(s):  
Dinesh Pothineni ◽  
Martin R. Oswald ◽  
Jan Poland ◽  
Marc Pollefeys

Plant Methods ◽  
2021 ◽  
Vol 17 (1) ◽  
Author(s):  
Shuo Zhou ◽  
Xiujuan Chai ◽  
Zixuan Yang ◽  
Hongwu Wang ◽  
Chenxue Yang ◽  
...  

Abstract
Background: Maize (Zea mays L.) is one of the most important food sources in the world and has been one of the main targets of plant genetics and phenotypic research for centuries. Observation and analysis of various morphological phenotypic traits during maize growth are essential for genetic and breeding studies. The typically huge number of samples produces an enormous amount of high-resolution image data. As high-throughput plant phenotyping platforms are increasingly used in maize breeding trials, there is a real need for software tools that can automatically identify visual phenotypic features of maize plants and perform batch processing on image datasets.
Results: On the boundary between computer vision and plant science, we utilize advanced deep learning methods based on convolutional neural networks to empower the workflow of maize phenotyping analysis. This paper presents Maize-IAS (Maize Image Analysis Software), an integrated application supporting one-click analysis of maize phenotype and embedding multiple functions: (I) Projection, (II) Color Analysis, (III) Internode Length, (IV) Height, (V) Stem Diameter, and (VI) Leaves Counting. Taking an RGB image of maize as input, the software provides a user-friendly graphical interface and rapid calculation of multiple important phenotypic characteristics, including leaf sheath point detection and leaf segmentation. For the Leaves Counting function, the mean and standard deviation of the difference between prediction and ground truth are 1.60 and 1.625, respectively.
Conclusion: Maize-IAS is easy to use and demands professional knowledge of neither computer vision nor deep learning. All functions support batch processing, enabling automated, labor-reduced recording, measurement, and quantitative analysis of maize growth traits on large datasets. We demonstrate the efficiency and potential of our techniques and software for image-based plant research, which also shows the feasibility of AI technology applied in agriculture and plant science.
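The Leaves Counting error statistics (mean and standard deviation of the difference between predicted and true leaf counts) can be computed as sketched below. The counts are made-up illustrative values, and since the abstract does not say whether signed or absolute differences are used, absolute differences are assumed here:

```python
import statistics

# Hypothetical per-plant leaf counts (illustrative values only).
predicted = [12, 9, 14, 11, 10]
actual    = [11, 9, 12, 13, 10]

# Absolute per-plant counting error (assumption; the paper may use
# signed differences instead).
diffs = [abs(p - a) for p, a in zip(predicted, actual)]

mean_err = statistics.mean(diffs)    # average counting error
std_err = statistics.pstdev(diffs)   # population standard deviation
```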

