INTRA- AND INTER-ITERATION 3D OSEM PET IMAGE RECONSTRUCTION

2007 ◽  
Vol 19 (04) ◽  
pp. 239-249
Author(s):  
Wei-Min Jeng ◽  
Yu-Liang Hsu

Most recent medical imaging modalities use noninvasive means to obtain activity information inside human organs, so that doctors may detect the initial symptoms of a disease as early as possible and give appropriate treatment. PET, in its clinical and research uses, relies on radioisotopes that emit positrons. A nuclear-medicine tracer, formed by labeling deoxidized glucose molecules with a radioelement, is injected into the patient; after the cells in the patient's body absorb it through metabolic functions, the detectors record the annihilation coincidence events produced by the emitted positrons. The most critical module of the modality is therefore the procedure for reconstructing good-quality images from the collected projection information. However, MLEM reconstruction involves massive data over a considerable number of iterations to yield accurate images, and its computation takes quite a long time. Ordered Subsets Expectation Maximization (OSEM) was proposed to accelerate the reconstruction process by expediting convergence while maintaining image quality equivalent to MLEM. Since then, the OSEM iterative algorithm has become the de facto reconstruction method adopted by most PET installations. To further improve image quality, both clinical and research data are now acquired in 3D mode on the majority of current systems, and the computational load of iterative reconstruction increases considerably with the 3D OSEM method. Owing to recent technological advancement, many high-performance parallel methods have been proposed to speed up the reconstruction process. These methods generally partition data into several sets before applying any parallel acceleration; they do not examine the nature of the OSEM method to identify its intrinsic data dependencies.
This project analyzes the iterative nature of the 3D OSEM method, particularly the intra- and inter-iteration aspects of the reconstruction, on the latest shared-memory parallel machine architectures. Experiments will be conducted to demonstrate its superior performance over existing methods.
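The ordered-subsets update at the heart of the method can be shown in a minimal sketch. All sizes, the random system matrix, and iteration counts below are hypothetical, and this is not the authors' parallel implementation; it only illustrates how each ordered subset of projection rows drives one multiplicative EM sub-iteration.

```python
import numpy as np

# Minimal OSEM sketch: A maps image voxels to projection bins, y holds
# the measured projections, and each subset of rows performs one
# sub-iteration of the multiplicative EM update.
rng = np.random.default_rng(0)
n_bins, n_voxels, n_subsets = 60, 30, 4
A = rng.random((n_bins, n_voxels))
x_true = rng.random(n_voxels)
y = A @ x_true                       # noiseless projections for the sketch

x = np.ones(n_voxels)                # uniform nonnegative initial image
subsets = np.array_split(np.arange(n_bins), n_subsets)
for _ in range(20):                  # full iterations
    for rows in subsets:             # one sub-iteration per ordered subset
        As = A[rows]
        ratio = y[rows] / np.maximum(As @ x, 1e-12)
        x *= (As.T @ ratio) / np.maximum(As.sum(axis=0), 1e-12)
```

Because every subset updates the whole image, each sub-iteration depends on the previous one; these intra- and inter-iteration dependencies are exactly what naive data partitioning ignores.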

2021 ◽  
Author(s):  
Karl Marrett ◽  
Muye Zhu ◽  
Yuze Chi ◽  
Zhe Chen ◽  
Chris Choi ◽  
...  

Interpreting the influx of microscopy and neuroimaging data is bottlenecked by neuronal reconstruction's long-standing issues in accuracy, automation, and scalability. Rapidly increasing data size is particularly concerning for modern computing infrastructure because of the memory-bandwidth wall, historically the slowest-advancing aspect of the hardware stack. Recut is an end-to-end reconstruction pipeline that takes raw large-volume light microscopy images and yields filtered or tuned automated neuronal reconstructions that require minimal proofreading and no other manual intervention. By leveraging adaptive grids and other methods, Recut also has a unified data representation with up to a 509× reduction in memory footprint, resulting in an 89.5× throughput increase and enabling an effective 64× increase in the scale of volumes that can be skeletonized on servers or resource-limited devices. Recut also employs coarse- and fine-grained parallelism to achieve speedup factors beyond the CPU core count in sparse settings when compared to the current fastest reconstruction method. By leveraging the sparsity of light microscopy datasets, full brains can be processed in memory, a property that may significantly shift the compute needs of the neuroimaging community. The scale and speed of Recut fundamentally change the reconstruction process, allowing an interactive yet deeply automated workflow.
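The footprint reductions come from exploiting sparsity. A minimal sketch, assuming hypothetical sizes and a simple coordinate-list representation (not Recut's adaptive-grid data structure), shows why storing only foreground voxels shrinks memory by orders of magnitude:

```python
import numpy as np

# Neuron signal occupies a tiny fraction of a light-microscopy volume,
# so storing foreground coordinates and values beats a dense array.
shape = (128, 128, 128)
rng = np.random.default_rng(1)
dense = np.zeros(shape, dtype=np.uint16)
n_fg = 2000                                   # ~0.1% foreground voxels
idx = rng.choice(dense.size, size=n_fg, replace=False)
dense.flat[idx] = 1000

coords = np.argwhere(dense > 0).astype(np.uint16)  # (n_fg, 3) voxel indices
values = dense[dense > 0]
sparse_bytes = coords.nbytes + values.nbytes
ratio = dense.nbytes / sparse_bytes           # dense-to-sparse footprint ratio
```

At 0.1% occupancy even this naive encoding is a few hundred times smaller than the dense volume, which is the same effect that lets whole brains fit in memory.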


2014 ◽  
Vol 599-601 ◽  
pp. 1411-1415
Author(s):  
Yan Hai Wu ◽  
Meng Xin Ma ◽  
Nan Wu ◽  
Jing Wang

The traditional reconstruction method of Compressive Sensing (CS) mostly depends on an L1-norm linear regression model. Here we propose Bayesian Compressive Sensing (BCS) to reconstruct the signal. It provides a posterior distribution of the parameters rather than a point estimate, so we can use the uncertainty of the estimate to adapt the reconstruction process. In this paper, we employ a hierarchical form of the Laplace prior and, to improve reconstruction efficiency, we segment the image into blocks, compress different kinds of blocks at various sample rates, and use a relevance vector machine (RVM) to sparsify the signal during reconstruction. Finally, we present experimental results on images and compare them with state-of-the-art CS algorithms, demonstrating the superior performance of the proposed approach.
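The authors' hierarchical-Laplace BCS with an RVM is more than a short sketch can show; as the conventional baseline it improves upon, here is plain greedy sparse recovery (orthogonal matching pursuit) of a k-sparse signal from m < n random measurements. All sizes below are hypothetical.

```python
import numpy as np

def omp(Phi, y, n_iter, tol=1e-10):
    """Repeatedly pick the atom most correlated with the residual,
    then re-fit the selected atoms by least squares."""
    residual, support = y.copy(), []
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x[support] = coef
    return x

rng = np.random.default_rng(2)
n, m, k = 128, 64, 4                     # signal length, measurements, sparsity
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = Phi @ x_true                         # compressed measurements
x_hat = omp(Phi, y, k + 3)               # a few extra iterations for safety
```

A point estimator like this gives no uncertainty; the Bayesian formulation's posterior is what allows the per-block adaptive sampling the abstract describes.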


2016 ◽  
Vol 39 (3) ◽  
pp. 172-188
Author(s):  
Naoki Sunaguchi ◽  
Yoshiki Yamakoshi ◽  
Takahito Nakajima

This study investigates shear wave phase map reconstruction using a limited number of color flow images (CFIs) acquired with a color Doppler ultrasound imaging instrument. We propose an efficient reconstruction method to considerably reduce the number of CFIs required for reconstruction and compare this method with Fourier analysis-based color Doppler shear wave imaging. The proposed method uses a two-step phase reconstruction process, including an initial phase map derived from four CFIs using an advanced iterative algorithm of optical interferometry. The second step reduces phase artifacts in the initial phase map using an iterative correction procedure that cycles between the Fourier and inverse Fourier domains while imposing directional filtering and total variation regularization. We demonstrate the efficacy of this method using synthetic and experimental data of a breast phantom and human breast tissue. Our results show that the proposed method maintains image quality and reduces the number of CFIs required to four; previous methods have required at least 32 CFIs to achieve equivalent image quality. The proposed method is applicable to real-time shear wave elastography using a continuous shear wave produced by a mechanical vibrator.
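The first step's initial phase map can be illustrated with the classical four-step relation from optical interferometry. This is a minimal sketch assuming ideal, noise-free frames with quarter-period shifts (the textbook four-bucket formula, not the authors' advanced iterative algorithm):

```python
import numpy as np

# Four frames I_k = A + B*cos(phi + k*pi/2), k = 0..3, determine the
# wrapped phase via atan2(I3 - I1, I0 - I2).
x = np.linspace(0, 4 * np.pi, 256)
phi_true = np.angle(np.exp(1j * x))      # wrapped ground-truth phase
A, B = 1.0, 0.5                          # offset and modulation amplitude
I0, I1, I2, I3 = (A + B * np.cos(phi_true + k * np.pi / 2) for k in range(4))
phi = np.arctan2(I3 - I1, I0 - I2)       # recovered wrapped phase map
```

With real CFI data the frames are noisy and incomplete, which is what the paper's second, artifact-reducing iterative step addresses.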


2016 ◽  
Vol 2016 ◽  
pp. 1-11 ◽  
Author(s):  
Ming Yin ◽  
Kai Yu ◽  
Zhi Wang

For low-power wireless systems, transmission data volume is a key property that influences the energy cost and time delay of transmission. In this paper, we apply compressive sensing to propose a compressed sampling and collaborative reconstruction framework that enables real-time direction-of-arrival (DoA) estimation for a wireless sensor array network. In the sampling part, random compressed sampling and 1-bit sampling are used to reduce the sample data volume while placing little extra requirement on hardware. In the reconstruction part, a collaborative reconstruction method is proposed that exploits the similar sparsity structure of the acoustic signals from nodes in the same array. Simulation results show that the proposed framework reaches performance similar to conventional DoA methods while requiring less than 15% of the transmission bandwidth. The proposed framework is also compared with several data compression algorithms; in addition to the simulation results showing its superior performance, field experiment data from a prototype system are presented to validate the results.
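The sampling side can be sketched in a few lines. The frame length, measurement count, and ADC width below are hypothetical, not the paper's system parameters; the sketch only shows why signs of random projections cost so little bandwidth:

```python
import numpy as np

# Random compressed sampling keeps m < n projections; 1-bit sampling
# keeps only their signs, so each sample costs one bit per transmission
# instead of a full ADC word.
rng = np.random.default_rng(3)
n, m = 512, 128                                      # frame length, samples
signal = np.sin(2 * np.pi * 7 * np.arange(n) / n)    # stand-in acoustic frame
Phi = rng.standard_normal((m, n))                    # measurement matrix
samples = Phi @ signal                               # compressed samples
bits = np.signbit(samples)                           # 1-bit sampling
payload_bits = bits.size                             # bits to transmit
raw_bits = n * 16                                    # e.g. a 16-bit ADC frame
reduction = payload_bits / raw_bits                  # fraction of raw bandwidth
```

Recovering the signal from sign-only measurements is what makes the collaborative, sparsity-sharing reconstruction at the fusion side necessary.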


2006 ◽  
Vol 3 (1) ◽  
pp. 49-60 ◽  
Author(s):  
F. E. Fish

In recent years, the biomimetic approach has been used as a mechanism for technological advancement in the field of robotics. However, there has not been a full appreciation of the successes and limitations of biomimetics. Similarities between natural and engineered systems appear as convergences, which reveal the environmental factors that impinge upon design, and as direct copying, which produces innovation through the integration of natural and artificial technologies. Limitations of this integration depend on the structural and mechanical differences between the two technologies and on the process by which each technology arises. The diversity of organisms that arose through evolutionary descent does not necessarily provide all possible optimal solutions. However, in instances where organisms exhibit performance superior to engineered systems, features of the organism can be targeted for technology transfer. In this regard, cooperation between biologists and engineers is paramount.


2009 ◽  
Author(s):  
Naotoshi Fujita ◽  
Asumi Yamazaki ◽  
Katsuhiro Ichikawa ◽  
Yoshie Kodera

2021 ◽  
Author(s):  
Vikrant Wagle ◽  
Abdullah Yami ◽  
Michael Onoriode ◽  
Jacques Butcher ◽  
Nivika Gupta

Abstract The present paper describes the formulation of an acid-soluble, low-ECD, organoclay-free invert emulsion drilling fluid prepared with acid-soluble manganese tetroxide and a specially designed bridging package. The paper also presents a short summary of field applications to date. The novel, non-damaging fluid has superior rheology resulting in lower ECD, excellent suspension properties for effective hole cleaning and barite-sag resistance, while also reducing the risk of stuck pipe in high-overbalance applications. A 95 pcf high-performance invert emulsion fluid (HPIEF) was formulated using an engineered bridging package comprising acid-soluble bridging agents and an acid-soluble weighting agent, viz. manganese tetroxide. The paper describes the filtration and rheological properties of the HPIEF after hot rolling at 300°F. Tests including contamination testing, sag-factor analysis, high-temperature high-pressure rheology measurements, and filter-cake breaking studies at 300°F were performed on the HPIEF. The 95 pcf fluid was also subjected to particle-plugging experiments to determine its invasion characteristics and non-damaging nature. The 95 pcf HPIEF exhibited optimal filtration properties at high-overbalance conditions. The low PV values and rheological profile support low ECDs while drilling. Static aging tests performed on the 95 pcf HPIEF gave a sag factor of less than 0.53, qualifying its inherent stability for the expected downhole conditions. The HPIEF demonstrated resilience to contamination with negligible change in properties. Filter-cake breaking experiments performed with a specially designed breaker fluid system gave high filter-cake breaking efficiency. Return permeability studies performed with the HPIEF against synthetic core material confirmed the non-damaging design of the fluid.
The paper thus demonstrates the superior performance of the HPIEF in achieving the desired lab and field performance.
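The sag-factor criterion above uses the standard static-aging definition, computed as follows. The density readings in this sketch are hypothetical examples, not the paper's measured values:

```python
# Sag factor from fluid densities at the top and bottom of the aging
# cell (any consistent units, e.g. pcf). A factor of 0.50 means no
# settling of the weighting material; values up to about 0.53 are
# commonly accepted as stable.
def sag_factor(density_top, density_bottom):
    return density_bottom / (density_top + density_bottom)

sf = sag_factor(93.0, 97.0)  # hypothetical pcf readings from the cell
```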


Sensors ◽  
2018 ◽  
Vol 18 (9) ◽  
pp. 3084 ◽  
Author(s):  
Kyoungsoo Bok ◽  
Daeyun Kim ◽  
Jaesoo Yoo

As large amounts of stream data are generated by sensors in the Internet of Things environment, studies on complex event processing have been conducted to detect, in real time, information required by users or specific applications. A complex event is made by combining primitive events through a number of operators. However, the existing complex event-processing methods take a long time because they do not consider the similarity and redundancy of operators. In this paper, we propose a new complex event-processing method that accounts for similar and redundant operations on real-time sensor stream data. In the proposed method, a similar operation on common events is converted into a virtual operator, and redundant operations on the same events are converted into a single operator. The event query tree for complex event detection is then reconstructed using the converted operators. This reduces the cost of comparing and inspecting similar and redundant operations, thereby decreasing the overall processing cost. To demonstrate the superior performance of the proposed method, it is evaluated in comparison with existing methods.
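The redundancy-elimination idea can be sketched minimally. The dict-based registry and node format here are hypothetical, not the paper's query-tree representation; the point is only that an operation applied to the same events is canonicalized to one shared node, so queries that repeat it are evaluated once:

```python
# One canonical operator node per (operator, events) combination.
registry = {}

def shared_op(name, *events):
    """Return the single shared node for this operation, creating it
    on first use and reusing it afterwards."""
    key = (name, events)
    if key not in registry:
        registry[key] = {"op": name, "events": events}
    return registry[key]

# Two queries both need SEQ(A, B): the node is created once and reused,
# while a different operation gets its own node.
q1 = shared_op("SEQ", "A", "B")
q2 = shared_op("SEQ", "A", "B")
q3 = shared_op("AND", "A", "C")
```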


2009 ◽  
Vol 419-420 ◽  
pp. 1-4 ◽  
Author(s):  
Ying Wei Yun ◽  
Ii Young Jang ◽  
Seong Kyum Kim ◽  
Seung Min Park

High-performance concrete (HPC), a promising construction material, has been widely used in infrastructure, high-rise buildings, and similar structures. However, its relatively high autogenous shrinkage (AS), especially at early age, is one of the key problems endangering the long-term durability of HPC structures. This paper presents early-age AS research on large-scale HPC column specimens using embedded Fiber Bragg-Grating (FBG) strain sensors. Temperature compensation of the FBG strain sensor by thermocouple was also attempted, and the results were reasonable and acceptable compared with those compensated by an FBG temperature sensor. The influences of reinforcement, size effect, and temperature on HPC AS were also analyzed.
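Thermocouple compensation works by removing the temperature-induced part of the measured Bragg-wavelength shift. A minimal sketch, using typical textbook coefficients for a silica FBG and hypothetical readings (not the paper's data):

```python
# The measured relative Bragg shift mixes strain and temperature:
#   d_lambda / lambda0 = (1 - p_e) * strain + (alpha + xi) * dT
lambda0 = 1550.0e-9      # nominal Bragg wavelength [m] (assumed)
p_e = 0.22               # effective photo-elastic coefficient (typical)
alpha = 0.55e-6          # thermal expansion of silica [1/degC] (typical)
xi = 6.7e-6              # thermo-optic coefficient [1/degC] (typical)

def compensated_strain(d_lambda, d_temp):
    """Subtract the thermocouple-measured temperature term, leaving strain."""
    thermal = (alpha + xi) * d_temp
    return (d_lambda / lambda0 - thermal) / (1 - p_e)

# A -120 pm shift with a 5 degC rise (hypothetical readings) yields a
# negative strain, i.e. shrinkage, as expected for early-age AS.
eps = compensated_strain(-120e-12, 5.0)
```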


2012 ◽  
Vol 155-156 ◽  
pp. 440-444
Author(s):  
He Yan ◽  
Xiu Feng Wang

The JPEG2000 algorithm was developed on the basis of DWT techniques, showing how results achieved in different areas of information technology can be applied to enhance performance. Wavelets have become a popular technology for information redistribution in high-performance image compression algorithms. Lossy compression algorithms sacrifice perfect image reconstruction in favor of improved compression rates while minimizing the loss in image quality.
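The trade-off can be shown with a toy one-level Haar transform plus coefficient thresholding. This illustrates only the keep-the-large-coefficients principle; JPEG2000 itself uses the CDF 9/7 biorthogonal wavelet together with quantization and entropy coding, all omitted here:

```python
import numpy as np

# One Haar level: pairwise averages (approximation) and half-differences
# (detail); discarding small details is the lossy step.
signal = np.array([4.0, 5.0, 4.0, 4.0, 8.0, 8.0, 2.0, 6.0])
avg = (signal[0::2] + signal[1::2]) / 2     # approximation band
det = (signal[0::2] - signal[1::2]) / 2     # detail band
det[np.abs(det) < 1.0] = 0.0                # drop small details (lossy)
rec = np.empty_like(signal)                 # inverse transform
rec[0::2] = avg + det
rec[1::2] = avg - det
```

Only the large detail survives, so the reconstruction is exact where the signal varied strongly and slightly smoothed elsewhere, at a reduced coefficient budget.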

