Improved attenuation correction for freely moving animal brain PET studies using a virtual scanner geometry

2014 ◽  
Vol 59 (19) ◽  
pp. 5651-5666 ◽  
Author(s):  
Georgios I. Angelis ◽  
Andre Z. Kyme ◽  
William J. Ryder ◽  
Roger R. Fulton ◽  
Steven R. Meikle

Electronics ◽  
2021 ◽  
Vol 10 (15) ◽  
pp. 1836
Author(s):  
Bo-Hye Choi ◽  
Donghwi Hwang ◽  
Seung-Kwan Kang ◽  
Kyeong-Yun Kim ◽  
Hongyoon Choi ◽  
...  

The lack of physically measured attenuation maps (μ-maps) for attenuation and scatter correction is an important technical challenge in brain-dedicated stand-alone positron emission tomography (PET) scanners. The accuracy of calculated attenuation correction is limited by the nonuniformity of tissue composition due to pathologic conditions and the complex structure of facial bones. The aim of this study is to develop an accurate transmission-less attenuation correction method for amyloid-β (Aβ) brain PET studies. We investigated the validity of a deep convolutional neural network trained to produce a CT-derived μ-map (μ-CT) from the activity and attenuation maps simultaneously reconstructed with the MLAA (maximum likelihood reconstruction of activity and attenuation) algorithm for Aβ brain PET. The performance of three U-net architectures (2D, 2.5D, and 3D) was compared. The U-net models generated less noisy and more uniform μ-maps than the MLAA μ-maps. Among the three models, the patch-based 3D U-net reduced noise and cross-talk artifacts most effectively. The Dice similarity coefficients between the μ-map generated by the 3D U-net and μ-CT were 0.83 for bone and 0.67 for air segments. All three U-net models showed better voxel-wise correlation with μ-CT than MLAA, with the patch-based 3D U-net performing best. Whereas MLAA uptake values yielded percentage errors of 20% or more, the 3D U-net yielded the lowest percentage errors, within 5%. The proposed deep learning approach, which requires no transmission data, anatomic image, or atlas/template for PET attenuation correction, markedly improved the quantitative accuracy of the simultaneously estimated MLAA μ-maps for Aβ brain PET.
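
The Dice comparison quoted above can be reproduced with a simple overlap calculation. The sketch below (NumPy) shows one way bone and air segments might be extracted from two co-registered μ-maps by thresholding and then scored; the threshold values and the placeholder volumes are illustrative assumptions, not values taken from the study.

```python
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks."""
    intersection = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0

def segment_mu_map(mu: np.ndarray,
                   bone_thresh: float = 0.12,   # assumed threshold (cm^-1), illustrative only
                   air_thresh: float = 0.02):   # assumed threshold (cm^-1), illustrative only
    """Split a mu-map into bone and air masks by simple thresholding."""
    return mu > bone_thresh, mu < air_thresh

# Example: compare a network-predicted mu-map against the CT-derived reference.
mu_pred = np.random.rand(64, 64, 64) * 0.15   # placeholder volume; in practice a predicted mu-map (cm^-1)
mu_ct   = np.random.rand(64, 64, 64) * 0.15   # placeholder volume; in practice the co-registered mu-CT

bone_pred, air_pred = segment_mu_map(mu_pred)
bone_ct, air_ct = segment_mu_map(mu_ct)
print(f"Dice (bone): {dice(bone_pred, bone_ct):.2f}")
print(f"Dice (air):  {dice(air_pred, air_ct):.2f}")
```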


2009 ◽  
Vol 12 (3) ◽  
pp. 250-258 ◽  
Author(s):  
Jin Su Kim ◽  
Jae Sung Lee ◽  
Min-Hyun Park ◽  
Kyeong Min Kim ◽  
Seung-Ha Oh ◽  
...  

2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Jorge Cabello ◽  
Mihai Avram ◽  
Felix Brandl ◽  
Mona Mustafa ◽  
Martin Scherr ◽  
...  

2020 ◽  
Vol 33 (5) ◽  
pp. 1224-1241 ◽  
Author(s):  
Imene Mecheter ◽  
Lejla Alic ◽  
Maysam Abbod ◽  
Abbes Amira ◽  
Jim Ji

Abstract The recent emergence of hybrid positron emission tomography/magnetic resonance (PET/MR) imaging has generated a great need for accurate MR image-based PET attenuation correction. MR image segmentation, as a robust and simple method for PET attenuation correction, has been clinically adopted in commercial PET/MR scanners. The general approach is to segment the MR image into different tissue types and assign each a constant attenuation coefficient, as would be derived from an X-ray CT image. Machine learning techniques such as clustering, classification and deep networks are extensively used for brain MR image segmentation. However, only limited work has been reported on using deep learning for brain PET attenuation correction, and clinical evaluation of machine learning methods in this application is lacking. The aim of this review is to survey the use of machine learning methods for MR image segmentation and their application to attenuation correction in brain PET imaging. Furthermore, challenges and future opportunities in MR image-based PET attenuation correction are discussed.
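
As a rough illustration of the segmentation-based approach described above, the sketch below clusters MR voxel intensities into a few tissue classes and maps each class to a constant linear attenuation coefficient. The number of classes, the intensity-only clustering model, and the μ values (cm⁻¹ at 511 keV) are assumptions made for illustration, not methods or numbers from the review.

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative linear attenuation coefficients at 511 keV (cm^-1); assumed values.
MU_BY_CLASS = {"air": 0.0, "soft_tissue": 0.096, "bone": 0.151}

def segmentation_based_mu_map(mr_volume: np.ndarray) -> np.ndarray:
    """Cluster MR intensities into 3 classes and assign a constant mu per class.

    Classes are ordered by mean intensity and mapped to air / soft tissue / bone.
    This is a simplification: on standard MR sequences bone and air both appear
    dark, which is one reason more sophisticated (e.g. deep learning) methods
    are pursued in practice.
    """
    voxels = mr_volume.reshape(-1, 1).astype(np.float32)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(voxels)

    # Order clusters by mean intensity so the class-to-mu mapping is deterministic.
    order = np.argsort([voxels[labels == k].mean() for k in range(3)])
    mu_values = [MU_BY_CLASS["air"], MU_BY_CLASS["soft_tissue"], MU_BY_CLASS["bone"]]

    mu_map = np.zeros_like(voxels, dtype=np.float32)
    for rank, cluster in enumerate(order):
        mu_map[labels == cluster] = mu_values[rank]
    return mu_map.reshape(mr_volume.shape)

# Example with a synthetic MR volume standing in for real data.
mr = np.random.gamma(shape=2.0, scale=100.0, size=(32, 32, 32))
print(segmentation_based_mu_map(mr).shape)
```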


2016 ◽  
Vol 61 (19) ◽  
pp. 7074-7091 ◽  
Author(s):  
Matthew G Spangler-Bickell ◽  
Lin Zhou ◽  
Andre Z Kyme ◽  
Bart De Laat ◽  
Roger R Fulton ◽  
...  

2008 ◽  
Vol 53 (10) ◽  
pp. 2651-2666 ◽  
Author(s):  
A Z Kyme ◽  
V W Zhou ◽  
S R Meikle ◽  
R R Fulton
