Segmenting radiographs of the hand and wrist using computer vision

1990 ◽  
Author(s):  
Γεώργιος Μάνος

"Bone age" age assessment is an important clinical tool in the area of paediatrics. The technique is based upon the appearance and growth of specific bones in a developing child. In particular most methods for "bone age" assessment are based on the examination of the growth of bones of the left hand and wrist on X-ray films. This assessment is useful in the treatment of growth disorders and also is used to predict adult height. One of the most reliable methods for "bone age" assessment is the TW2 method. The drawback of this method is that it is time consuming and therefore its automation is highly desirable. One of the most important aspects of the automation process is image segmentation i.e. the extraction of bones from soft-tissue and background. Over the past 10 years various attempts have been made at the segmentation of handwrist radiographs but with limited success. This can mainly be attributed to the characteristics of the scenes e.g. biological objects, penetrating nature of radiation, faint bone boundaries, uncertainty of scene content, and conjugation of bones. Experience in the field of radiographic image analysis has shown thatsegmentation of radiographic scenes is a difficult task requiring solutions which depend on the nature of the particular problem.There are two main approaches to image segmentation: edge based and region based. Most of the previous attempts at the segmentation of hand-wrist radiographs were edge based. Edge based methods usually require a w-ell defined model of the object boundaries in order to produce successful results. However, for this particular application it is difficult to derive such a model. Region based segmentation methods have produced promising results for scenes which exhibit uncertainty regarding their content and boundaries of objects in the image, as in the case, for example, of natural senes. This thesis presents a segmentation method based on the concept of regions. This method consists of region growing and region merging stages. A technique was developed for region merging which combines edge and region boundai^ information. A bone extraction stage follows which labels regions as either boneor background using heuristic rules based on the grey-level properties of the scene. Finally, a technique is proposed for the segmentation of bone outlines which helps in identifying conjugated bones. Experimental results have demonstrated that this method represents a significant improvement over existing segmentation methods for hand-wrist radiographs, particularly with regard to the segmentation of radiographs with varying degrees of bone maturity.

2019 ◽  
Author(s):  
Klara Maratova ◽  
Dana Zemkova ◽  
Jan Lebl ◽  
Ondrej Soucek ◽  
Stepanka Pruhova ◽  
...  

Diagnostics ◽  
2021 ◽  
Vol 11 (5) ◽  
pp. 765
Author(s):  
Mohd Asyraf Zulkifley ◽  
Nur Ayuni Mohamed ◽  
Siti Raihanah Abdani ◽  
Nor Azwan Mohamed Kamari ◽  
Asraf Mohamed Moubark ◽  
...  

Skeletal bone age assessment using X-ray images is a standard clinical procedure for detecting anomalies in bone growth among infants and children. The assessed bone age indicates the actual level of growth, whereby a large discrepancy between the assessed and chronological age might point to a growth disorder. Hence, skeletal bone age assessment is used to screen for possible growth abnormalities, genetic problems, and endocrine disorders. Usually, manual screening is performed on X-ray images of the non-dominant hand using the Greulich–Pyle (GP) or Tanner–Whitehouse (TW) approach. The GP approach uses a standard hand atlas as the reference for predicting a patient's bone age, while the TW approach uses a scoring mechanism over several regions of interest. However, both approaches depend heavily on individual domain knowledge and expertise, which makes them prone to bias in inter- and intra-observer results. Hence, an automated bone age assessment system, referred to as the Attention-Xception Network (AXNet), is proposed to predict bone age automatically and accurately. The proposed AXNet consists of two parts: an image normalization module and a bone age regression module. The image normalization module transforms each X-ray image into a standardized form so that the regressor network can be trained on better input images. This module first extracts the hand region from the background, which is then rotated to an upright position using the angle calculated from four key-points of interest. The masked and rotated hand image is then aligned so that it is positioned in the middle of the image. Both the masked and the rotated images are obtained through existing state-of-the-art deep learning methods. The last module then predicts the bone age through the Attention-Xception network, which incorporates multiple layers of spatial attention to emphasize the features that matter most for accurate bone age prediction. In the experiments, the proposed AXNet achieves the lowest mean absolute error and mean squared error of 7.699 months and 108.869 months², respectively. Therefore, with an error of less than one year, the proposed AXNet has demonstrated its potential for practical clinical use in assisting experts or radiologists in evaluating bone age objectively.
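As a rough illustration of the regression module, the sketch below attaches a single spatial-attention gate to an Xception backbone and regresses bone age in months with an MAE loss. It is a hedged sketch of the general idea, not the AXNet architecture: AXNet stacks multiple attention layers and uses a separate normalization module, both omitted here, and the 7x7 attention kernel and dense layer size are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model


def spatial_attention(features):
    """Simple spatial-attention gate: a one-channel sigmoid map that
    re-weights every spatial position of the feature tensor."""
    attn = layers.Conv2D(1, kernel_size=7, padding="same",
                         activation="sigmoid")(features)
    return layers.Multiply()([features, attn])


def build_bone_age_regressor(input_shape=(299, 299, 3)):
    """Xception backbone + spatial attention + regression head.
    Inputs are assumed to be preprocessed with
    tf.keras.applications.xception.preprocess_input."""
    inputs = layers.Input(shape=input_shape)
    backbone = tf.keras.applications.Xception(
        include_top=False, weights="imagenet", input_tensor=inputs)
    x = spatial_attention(backbone.output)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dense(256, activation="relu")(x)
    outputs = layers.Dense(1, name="bone_age_months")(x)
    model = Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mae")  # MAE reported in months
    return model


model = build_bone_age_regressor()
model.summary()
```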


2021 ◽  
pp. 036354652110329
Author(s):  
Cary S. Politzer ◽  
James D. Bomar ◽  
Hakan C. Pehlivan ◽  
Pradyumna Gurusamy ◽  
Eric W. Edmonds ◽  
...  

Background: In managing pediatric knee conditions, an accurate bone age assessment is often critical for diagnostic, prognostic, and treatment purposes. Historically, the Greulich and Pyle atlas (hand atlas) has been the gold standard bone age assessment tool. In 2013, a shorthand bone age assessment tool based on this atlas (hand shorthand) was devised as a simpler and more efficient alternative. Recently, a knee magnetic resonance imaging (MRI) bone age atlas (MRI atlas) was created to circumvent the need for a left-hand radiograph. Purpose: To create a shorthand version of the knee MRI atlas. Study Design: Cohort study (diagnosis); Level of evidence, 2. Methods: A shorthand bone age assessment method was created utilizing the previously published MRI atlas, which utilizes several criteria that are visualized across a series of images. The MRI shorthand draws on characteristic criteria for each age that are best observed on a single MRI scan. For validation, we performed a retrospective assessment of skeletally immature patients. One reader performed the bone age assessment using the MRI atlas and the MRI shorthand on 200 patients. Then, 4 readers performed the bone age assessment with the hand atlas, hand shorthand, MRI atlas, and MRI shorthand on a subset of 22 patients in a blinded fashion. All 22 patients had a knee MRI scan and a left-hand radiograph within 4 weeks of each other. Interobserver and intraobserver reliability, as well as variability among observers, were evaluated. Results: A total of 200 patients with a mean age of 13.5 years (range, 9.08-17.98 years) were included in this study. Also, 22 patients with a mean age of 13.3 years (range, 9.0-15.6 years) had a knee MRI scan and a left-hand radiograph within 4 weeks. The intraobserver and interobserver reliability of all 4 assessment tools were acceptable (intraclass correlation coefficient [ICC] ≥ 0.8; P < .001). When comparing the MRI shorthand with the MRI atlas, there was excellent agreement (ICC = 0.989), whereas the hand shorthand compared with the hand atlas had good agreement (ICC = 0.765). The MRI shorthand also had perfect agreement in 50% of readings among all 4 readers, and 95% of readings had agreement within 1 year, whereas the hand shorthand had perfect agreement in 32% of readings and 77% agreement within 1 year. Conclusion: The MRI shorthand is a simple and efficient means of assessing the skeletal maturity of adolescent patients with a knee MRI scan. This bone age assessment technique had interobserver and intraobserver reliability equivalent to or better than the standard method of utilizing a left-hand radiograph.
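For readers who want to reproduce the style of agreement analysis reported above, the snippet below computes the fraction of patients with perfect agreement among all readers and the fraction whose readings agree within one year. The data shown are hypothetical, and the intraclass correlation coefficient is left to a dedicated statistics package.

```python
import numpy as np


def agreement_rates(readings, tol_years=1.0):
    """Given a (patients x readers) array of bone age estimates in years,
    return (fraction with exact agreement among all readers,
            fraction whose readings all fall within `tol_years`)."""
    readings = np.asarray(readings, dtype=float)
    spread = readings.max(axis=1) - readings.min(axis=1)
    return float(np.mean(spread == 0)), float(np.mean(spread <= tol_years))


# Hypothetical readings: 3 patients, 4 readers (values are made up).
demo = [[13.0, 13.0, 13.0, 13.0],
        [12.5, 13.0, 12.5, 13.5],
        [11.0, 12.5, 11.5, 13.0]]
print(agreement_rates(demo))  # -> (0.333..., 0.666...)
```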


Author(s):  
Premal Naik ◽  
Dhren Ganjwala ◽  
Chhaya Bhatt ◽  
Kranti Suresh Vora

2015 ◽  
Vol 45 (7) ◽  
pp. 1007-1015 ◽  
Author(s):  
Monica Daneff ◽  
Claudia Casalis ◽  
Claudio H. Bruno ◽  
Didier A. Bruno

2018 ◽  
Vol 2018 ◽  
pp. 1-13 ◽  
Author(s):  
Chen Zhao ◽  
Jungang Han ◽  
Yang Jia ◽  
Lianghui Fan ◽  
Fan Gou

Deep learning has made a tremendous impact on medical image processing and analysis. Typically, a deep-learning-based medical image processing and analysis pipeline includes image segmentation, image enhancement, and classification or regression. A frequently mentioned challenge for supervised deep learning is the lack of annotated training data. In this paper, we aim to address the problem of training transferred deep neural networks with a limited amount of annotated data. We propose a versatile framework for medical image processing and analysis via deep active learning. The framework includes (1) a deep active learning approach that segments specific regions of interest (RoIs) from raw medical images using as little annotated data as possible; (2) a generative adversarial network employed to enhance the contrast, sharpness, and brightness of the segmented RoIs; and (3) a Paced Transfer Learning (PTL) strategy, which fine-tunes the layers of a deep neural network from top to bottom, step by step, to perform medical image classification or regression tasks. In addition, to understand the necessity of the deep-learning-based processing steps and to provide clues for clinical usage, class activation maps (CAMs) are employed in our framework to visualize the feature maps. To illustrate the effectiveness of the proposed framework, we apply it to the bone age assessment (BAA) task on the RSNA dataset and achieve state-of-the-art performance. Experimental results indicate that the proposed framework can be effectively applied to medical image analysis tasks.
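The paced (top-to-bottom, step-by-step) fine-tuning idea in step (3) can be sketched as follows. This is a minimal illustration under assumed choices, not the authors' implementation: the Xception backbone, four stages, MAE loss, learning-rate schedule, and `train_ds`/`val_ds` as tf.data pipelines of (image, age) pairs are all assumptions made for the sketch.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model


def build_transfer_model(num_outputs=1):
    """A pretrained backbone with a small regression head; all backbone
    layers start frozen so only the head is trained at first."""
    backbone = tf.keras.applications.Xception(
        include_top=False, weights="imagenet",
        input_shape=(299, 299, 3), pooling="avg")
    for layer in backbone.layers:
        layer.trainable = False
    x = layers.Dense(256, activation="relu")(backbone.output)
    outputs = layers.Dense(num_outputs)(x)
    return Model(backbone.input, outputs), backbone


def paced_fine_tune(model, backbone, train_ds, val_ds,
                    stages=4, epochs_per_stage=3):
    """Paced fine-tuning sketch: unfreeze the backbone in blocks, starting
    from the layers nearest the output ('top') and moving toward the
    input, recompiling with a smaller learning rate at each stage."""
    n = len(backbone.layers)
    block = max(1, n // stages)
    for stage in range(stages):
        # Unfreeze the top `block * (stage + 1)` backbone layers.
        for layer in backbone.layers[n - block * (stage + 1):]:
            layer.trainable = True
        model.compile(
            optimizer=tf.keras.optimizers.Adam(1e-4 / (stage + 1)),
            loss="mae")
        model.fit(train_ds, validation_data=val_ds,
                  epochs=epochs_per_stage)
    return model
```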

