Unsupervised segmentation of HRCT lung images using FDK clustering

Author(s):  
P.K. Singh
Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4818
Author(s):  
Nils Mandischer ◽  
Tobias Huhn ◽  
Mathias Hüsing ◽  
Burkhard Corves

In the EU project SHAREWORK, methods are developed that allow humans and robots to collaborate in an industrial environment. One of the major contributions is a framework for task planning coupled with automated item detection and localization. In this work, we present the methods used for detecting and classifying items on the shop floor. User-friendliness of the methodology is important in the context of SHAREWORK; we therefore forgo heavyweight learning-based methods in favor of unsupervised segmentation coupled with lightweight machine learning methods for classification. Our algorithm combines established methods, adjusted for fast and reliable item detection at ranges of up to eight meters. We present the full pipeline, from calibration through segmentation to item classification, in the industrial context. The pipeline is validated on a shop floor of 40 m² with up to nine different items and assemblies, reaching a mean accuracy of 84% at 0.85 Hz.
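The abstract does not give algorithmic details, so the following is only an illustrative sketch of the general pattern it describes: an unsupervised segmentation stage (here plain k-means, a stand-in) feeding a lightweight classifier (here nearest-prototype matching, also a stand-in, not the authors' pipeline).

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Unsupervised segmentation stand-in: plain k-means with a
    deterministic farthest-first initialisation."""
    centroids = [points[0]]
    for _ in range(k - 1):
        # next seed = point farthest from all current seeds
        d = np.min(np.linalg.norm(points[:, None, :]
                                  - np.array(centroids)[None, :, :], axis=2), axis=1)
        centroids.append(points[d.argmax()])
    centroids = np.array(centroids, dtype=float)
    for _ in range(iters):
        # assign every point to its nearest centroid, then re-estimate
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return labels, centroids

def classify_segments(features, prototypes):
    """Lightweight classification stand-in: nearest prototype in
    feature space, one prototype per known item class."""
    d = np.linalg.norm(features[:, None, :] - prototypes[None, :, :], axis=2)
    return d.argmin(axis=1)
```

The appeal of this pattern, as the abstract notes, is user-friendliness: neither stage requires large labelled training sets, only a handful of per-class prototype features.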


2017 ◽  
Vol 81 ◽  
pp. 223-243 ◽  
Author(s):  
Thaína A. Azevedo Tosta ◽  
Paulo Rogério Faria ◽  
Leandro Alves Neves ◽  
Marcelo Zanchetta do Nascimento

2014 ◽  
Vol 81 (2) ◽  
pp. 153-181 ◽  
Author(s):  
Klimis S. Ntalianis ◽  
Anastasios D. Doulamis ◽  
Nikolaos D. Doulamis ◽  
Nikos E. Mastorakis ◽  
Athanasios S. Drigas

2014 ◽  
Vol 28 (4) ◽  
pp. 499-514 ◽  
Author(s):  
Qaiser Mahmood ◽  
Artur Chodorowski ◽  
Andrew Mehnert ◽  
Johanna Gellermann ◽  
Mikael Persson

2018 ◽  
Vol 10 (8) ◽  
pp. 1193 ◽  
Author(s):  
Yongji Wang ◽  
Qingwen Qi ◽  
Ying Liu

Image segmentation is an important process and a prerequisite for object-based image analysis. Thus, evaluating the performance of segmentation algorithms is essential to identify effective segmentation methods and to optimize the segmentation scale. In this paper, we propose an unsupervised evaluation (UE) method using the area-weighted variance (WV) and Jeffries-Matusita (JM) distance to compare two image partitions and evaluate segmentation quality. The two measures were calculated based on local measure criteria, and the JM distance was improved by considering the contribution of the common border between adjacent segments and the area of each segment in the JM distance formula, which makes the heterogeneity measure more effective and objective. The two measures were then plotted as curves as the scale parameter varied from 8 to 20, reflecting segmentation quality under both over- and under-segmentation. Furthermore, the WV and JM distance measures were combined using three different strategies. The effectiveness of the combined indicators was illustrated through supervised evaluation (SE) methods to clearly reveal the segmentation quality and capture the trade-off between the two measures. In these experiments, the multiresolution segmentation (MRS) method was adopted for evaluation. The proposed UE method was compared with two existing UE methods to further confirm their capabilities. The visual and quantitative SE results demonstrated that the proposed UE method can improve the segmentation quality.
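As a sketch of the two base measures themselves (without the paper's border- and area-weighted refinement of the JM formula), assuming univariate Gaussian segment statistics:

```python
import numpy as np

def weighted_variance(areas, variances):
    """Area-weighted variance (WV): within-segment homogeneity,
    with larger segments contributing proportionally more."""
    areas = np.asarray(areas, dtype=float)
    return float((areas * np.asarray(variances, dtype=float)).sum() / areas.sum())

def jm_distance(m1, v1, m2, v2):
    """Jeffries-Matusita distance between two segments modelled as
    univariate Gaussians (mean m, variance v). Ranges from 0
    (identical) to 2 (fully separable)."""
    # Bhattacharyya distance for two univariate Gaussians
    b = ((m1 - m2) ** 2 / (4.0 * (v1 + v2))
         + 0.5 * np.log((v1 + v2) / (2.0 * np.sqrt(v1 * v2))))
    return float(2.0 * (1.0 - np.exp(-b)))
```

Low WV indicates homogeneous segments (guarding against under-segmentation) while high JM distance between neighbours indicates well-separated segments (guarding against over-segmentation), which is why the paper plots both across scales and combines them.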


2012 ◽  
Vol 532-533 ◽  
pp. 732-737
Author(s):  
Xi Jie Wang ◽  
Xiao Fan Zhao

This paper presents a new multi-resolution Markov random field model in the Contourlet domain for unsupervised texture image segmentation. In order to make full use of the merits of the Contourlet transform, we introduce the traditional MRMRF model into the Contourlet domain, through variable interaction between the two components of the traditional MRMRF model. Using this method, the new model can automatically estimate model parameters and produce accurate unsupervised segmentation results. The results obtained on synthetic texture images and remote sensing images demonstrate that our model achieves better segmentation than the traditional MRMRF model.
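The Contourlet-domain MRMRF model itself is not reproduced here; as a much simplified illustration of the MRF component alone, the following sketch refines pixel labels under a single-resolution Potts prior using Iterated Conditional Modes (ICM). The Gaussian data term, 4-neighbourhood, and the `beta` smoothing weight are assumptions for the sake of the example.

```python
import numpy as np

def icm_segment(image, means, beta=1.0, sweeps=5):
    """Refine labels under a Potts MRF prior via ICM: each pixel takes
    the label minimising a squared-error data term plus beta times the
    number of disagreeing 4-neighbours."""
    means = np.asarray(means, dtype=float)
    # initial labelling: nearest class mean per pixel
    labels = np.abs(image[..., None] - means).argmin(axis=-1)
    h, w = image.shape
    for _ in range(sweeps):
        for y in range(h):
            for x in range(w):
                best, best_e = labels[y, x], np.inf
                for k in range(len(means)):
                    e = (image[y, x] - means[k]) ** 2  # data term
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and labels[ny, nx] != k:
                            e += beta  # Potts smoothness penalty
                    if e < best_e:
                        best, best_e = k, e
                labels[y, x] = best
    return labels
```

A multi-resolution variant, as in the MRMRF family, would run this kind of relaxation coarse-to-fine over transform subbands, passing each coarse labelling down as the initialisation for the next finer level.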

