A parallel and automatically tuned algorithm for multispectral image deconvolution

2019 ◽  
Vol 490 (1) ◽  
pp. 37-49
Author(s):  
R Ammanouil ◽  
A Ferrari ◽  
D Mary ◽  
C Ferrari ◽  
F Loi

ABSTRACT In the era of big data, radio astronomical image reconstruction algorithms are challenged to estimate clean images given limited computing resources and time. This article is driven by the need for large-scale image reconstruction for the future Square Kilometre Array (SKA), which will become, in the coming decades, the world's largest low- and intermediate-frequency radio telescope. This work proposes a scalable wide-band deconvolution algorithm called MUFFIN, which stands for ‘MUlti Frequency image reconstruction For radio INterferometry’. MUFFIN estimates the sky images in the various frequency bands, given the corresponding dirty images and point spread functions. The reconstruction is achieved by minimizing a data fidelity term and joint spatial and spectral sparse analysis regularization terms; it is consequently non-parametric with respect to the spectral behaviour of radio sources. The MUFFIN algorithm features a parallel implementation and automatic tuning of the regularization parameters, making it scalable and well suited for big data applications such as the SKA. Comparisons between MUFFIN and a state-of-the-art wide-band reconstruction algorithm are provided.
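The objective described here (a per-band data-fidelity term plus joint spatial and spectral sparsity regularizers) can be illustrated with a toy proximal-gradient loop. The sketch below is not the authors' MUFFIN code: it substitutes simple image-domain soft-thresholding for the paper's sparse analysis priors, and the array shapes, step size, and regularization weights are assumptions made for illustration only.

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def conv2(img, psf):
    """Circular convolution via the FFT (PSF assumed same size as the image)."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))

def corr2(img, psf):
    """Adjoint of conv2 (circular correlation), used in the gradient."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(psf))))

def wideband_deconv(dirty, psfs, mu_spatial=1e-3, mu_spectral=1e-3, step=1e-2, n_iter=50):
    """dirty, psfs: arrays of shape (n_bands, H, W); returns a toy sky estimate."""
    x = np.zeros_like(dirty)
    for _ in range(n_iter):
        # gradient of the per-band data fidelity 0.5 * ||psf_l * x_l - dirty_l||^2
        grad = np.stack([corr2(conv2(x[l], psfs[l]) - dirty[l], psfs[l])
                         for l in range(dirty.shape[0])])
        x = x - step * grad
        # toy spatial sparsity prox, applied per band
        x = soft(x, step * mu_spatial)
        # toy spectral prox: shrink each pixel's deviation from its mean spectrum
        mean_band = x.mean(axis=0)
        x = soft(x - mean_band, step * mu_spectral) + mean_band
    return x
```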

2017 ◽  
Vol 2017 ◽  
pp. 1-10
Author(s):  
Hsuan-Ming Huang ◽  
Ing-Tsung Hsiao

Background and Objective. Over the past decade, image quality in low-dose computed tomography has been greatly improved by various compressive sensing (CS)-based reconstruction methods. However, these methods have some disadvantages, including high computational cost and slow convergence. Many different speed-up techniques for CS-based reconstruction algorithms have been developed. The purpose of this paper is to propose a fast reconstruction framework that combines a CS-based reconstruction algorithm with several speed-up techniques. Methods. First, total difference minimization (TDM) was implemented using soft-threshold filtering (STF). Second, we combined TDM-STF with the ordered subsets transmission (OSTR) algorithm to accelerate convergence. To further speed up convergence, we applied the power factor and the fast iterative shrinkage thresholding algorithm to OSTR and TDM-STF, respectively. Results. Results from simulation and phantom studies showed that many speed-up techniques can be combined to greatly improve the convergence speed of a CS-based reconstruction algorithm. More importantly, the added computation time (≤10%) was minor compared with the acceleration provided by the proposed method. Conclusions. We have presented a CS-based reconstruction framework that combines several acceleration techniques. Both simulation and phantom studies provide evidence that the proposed method has the potential to satisfy the requirement of fast image reconstruction in practical CT.
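Two of the generic acceleration ingredients named above, the soft-threshold operator at the heart of soft-threshold filtering and the FISTA momentum scheme, have standard forms; the sketch below shows them on a toy l1-regularized least-squares problem. It is not the authors' TDM-STF/OSTR implementation, and the gradient, proximal operator, step size, and iteration count are placeholders.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1 norm; the core of soft-threshold filtering."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(grad, prox, x0, step, n_iter=100):
    """Generic FISTA loop: gradient step, proximal step, then momentum extrapolation."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x_new = prox(y - step * grad(y))
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Toy usage on A x ≈ b with an l1 prior (A, b are random stand-ins):
A, b = np.random.randn(50, 100), np.random.randn(50)
x_hat = fista(grad=lambda x: A.T @ (A @ x - b),
              prox=lambda z: soft_threshold(z, 1e-2),
              x0=np.zeros(100),
              step=1.0 / np.linalg.norm(A, 2) ** 2)
```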


2022 ◽  
pp. 1-13
Author(s):  
Lei Shi ◽  
Gangrong Qu ◽  
Yunsong Zhao

BACKGROUND: The ultra-limited-angle image reconstruction problem, with a limited-angle scanning range less than or equal to π/2, is severely ill-posed. Owing to the considerably large condition number of the linear system for image reconstruction, it is extremely challenging to generate a valid reconstructed image with traditional iterative reconstruction algorithms. OBJECTIVE: To develop and test a valid ultra-limited-angle CT image reconstruction algorithm. METHODS: We propose a new optimized reconstruction model and a Reweighted Alternating Edge-preserving Diffusion and Smoothing (AEDS) algorithm, in which a reweighting method for improving the condition number is incorporated into the AEDS image reconstruction algorithm. The AEDS algorithm exploits image sparsity to partially improve the results. In experiments, the different algorithms (Pre-Landweber, AEDS, and ours) are used to reconstruct the Shepp-Logan phantom from simulated projection data with noise, and a flat object with a large length-to-width ratio from real projection data. PSNR and SSIM are used as quantitative indices to evaluate the quality of the reconstructed images. RESULTS: Experimental results showed that, for simulated projection data, our algorithm improves PSNR and SSIM from 22.46 dB to 39.38 dB and from 0.71 to 0.96, respectively. For real projection data, our algorithm yields the highest PSNR and SSIM of 30.89 dB and 0.88, giving a valid reconstructed result. CONCLUSIONS: Our algorithm successfully combines the merits of several image processing and reconstruction algorithms. It significantly outperforms the other two algorithms and is valid for ultra-limited-angle CT image reconstruction.
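For reference, the two quality indices quoted in this abstract are commonly computed with scikit-image as in the short snippet below; the phantom and reconstruction here are random placeholders rather than the paper's data.

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

reference = np.clip(np.random.rand(256, 256), 0.0, 1.0)       # stand-in for the true phantom
reconstruction = np.clip(reference + 0.05 * np.random.randn(256, 256), 0.0, 1.0)

psnr = peak_signal_noise_ratio(reference, reconstruction, data_range=1.0)
ssim = structural_similarity(reference, reconstruction, data_range=1.0)
print(f"PSNR = {psnr:.2f} dB, SSIM = {ssim:.3f}")
```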


Author(s):  
Joaquin Vanschoren ◽  
Ugo Vespier ◽  
Shengfa Miao ◽  
Marvin Meeng ◽  
Ricardo Cachucho ◽  
...  

Sensors are increasingly being used to monitor the world around us. They measure the movements of structures such as bridges, windmills, and plane wings, human vital signs, atmospheric conditions, and fluctuations in power and water networks. In many cases, this results in large networks with different types of sensors generating impressive amounts of data. As the volume and complexity of the data increase, their effective use becomes more challenging, and novel solutions are needed on both a technical and a scientific level. Founded on several real-world applications, this chapter discusses the challenges involved in large-scale sensor data analysis and describes practical solutions to address them. Due to the sheer size of the data and the large amount of computation involved, these are clearly “Big Data” applications.


Web Services ◽  
2019 ◽  
pp. 953-978
Author(s):  
Krishnan Umachandran ◽  
Debra Sharon Ferdinand-James

Continued technological advancements of the 21st century afford massive data generation in sectors of our economy, including the domains of agriculture, manufacturing, and education. However, harnessing such large-scale data with modern technologies for effective decision-making is an evolving science that requires knowledge of Big Data management and analytics. Big Data in agriculture, manufacturing, and education are varied, encompassing voluminous text, images, and graphs. Applying Big Data science techniques (e.g., functional algorithms) for extracting intelligence affords decision makers quick responses to productivity, market resilience, and student enrollment challenges in today's unpredictable markets. This chapter employs data science to explore potential solutions to Big Data applications in agriculture and manufacturing, and to a lesser extent education, using modern technological tools such as Hadoop, Hive, Sqoop, and MongoDB.


2016 ◽  
Vol 11 (2) ◽  
pp. 103-109
Author(s):  
Hongtu Zhao ◽  
Chong Chen ◽  
Chenxu Shi

As the most critical part of compressive sensing theory, the reconstruction algorithm affects both the quality and the speed of image reconstruction. After studying existing convex optimization and greedy algorithms, we find that convex optimization algorithms must incur higher computational complexity to achieve higher reconstruction quality, while the fixed number of atoms used in most greedy algorithms increases the complexity of reconstruction. In this context, we propose a novel algorithm, called variable atomic number matching pursuit, which improves both the accuracy and the speed of reconstruction. Simulation results show that variable atomic number matching pursuit is a fast and stable reconstruction algorithm that outperforms the other reconstruction algorithms under the same conditions.
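The idea of a variable rather than fixed number of atoms can be illustrated with a generic greedy pursuit; in the sketch below, every atom whose correlation with the residual is within a fraction of the strongest one is added in the same iteration. The selection rule and its parameters are assumptions for illustration, not the authors' exact algorithm.

```python
import numpy as np

def variable_atom_pursuit(A, y, max_iter=20, frac=0.7, tol=1e-6):
    """A: (m, n) dictionary; y: (m,) measurements; returns a sparse coefficient vector."""
    m, n = A.shape
    support = np.zeros(n, dtype=bool)
    x = np.zeros(n)
    residual = y.copy()
    for _ in range(max_iter):
        corr = np.abs(A.T @ residual)
        corr[support] = 0.0                      # ignore atoms already selected
        if corr.max() < tol:
            break
        # variable atomic number: take every atom within `frac` of the best correlation
        support |= corr >= frac * corr.max()
        # least-squares refit on the enlarged support
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coeffs
        residual = y - A @ x
        if np.linalg.norm(residual) < tol:
            break
    return x
```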


2020 ◽  
Author(s):  
Manuel Weber ◽  
Regina Hofferber ◽  
Ken Herrmann ◽  
Wolfgang Peter Fendler ◽  
Maurizio Conti ◽  
...  

Abstract Aim: 68Ga-PSMA PET/CT allows for superior detection of prostate cancer (PC) tissue, especially in the context of a low tumor burden. Digital PET/CT has the potential to reduce scan duration or administered tracer activity owing to, for instance, its higher sensitivity and improved coincidence time resolution. It might thereby expand the use of 68Ga-PSMA PET/CT, which is currently limited by 68Ge/68Ga generator yield. Our aim was to clinically evaluate the influence of a reduced scan duration, in combination with different image reconstruction algorithms, on diagnostic performance. Methods: Twenty PC patients (11 for biochemical recurrence, 5 for initial staging, 4 for metastatic disease) sequentially underwent 68Ga-PSMA PET/CT on a digital Siemens Biograph Vision. PET data were collected in continuous-bed-motion mode with a scan duration of approximately 17 min (reference acquisition protocol) and 5 min (reduced acquisition protocol). Four iterative reconstruction algorithms were applied, using a time-of-flight (TOF) approach alone or combined with point-spread-function (PSF) correction, each with 2 or 4 iterations. To evaluate diagnostic performance, the following metrics were chosen: (a) per-region detectability, (b) the tumor maximum and peak standardized uptake values (SUVmax and SUVpeak), and (c) image noise based on the liver's activity distribution. Results: Overall, 98% of regions (91% of affected regions) were correctly classified in the reduced acquisition protocol, independent of the image reconstruction algorithm. Two nodal lesions (each ≤ 4 mm) were not identified (leading to downstaging in 1/20 cases). The mean absolute percentage deviation of SUVmax (SUVpeak) was approximately 9% (6%) for each reconstruction algorithm. Mean image noise increased from 13% to 21% (4 iterations) and from 10% to 15% (2 iterations) for PSF + TOF and TOF images. Conclusions: High agreement in per-region detection (98% of regions) and image quantification (mean deviation ≤ 10%) was demonstrated at a 3.5-fold reduction in scan time; however, small lesions can be missed in about 10% of patients, leading to downstaging (T1N0M0 instead of T1N1M0) in 5% of patients. Our results suggest that a reduction in scan duration or administered 68Ga-PSMA activity can be considered in metastatic patients, where missing small lesions would not impact patient management.
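The quantification comparison reported above reduces to a mean absolute percentage deviation between lesion SUV values from the two protocols; a minimal sketch with hypothetical SUVmax values (not patient data) is shown below.

```python
import numpy as np

# Hypothetical lesion SUVmax values for illustration only
suv_reference = np.array([12.4, 8.1, 5.6, 21.0])   # 17-min reference protocol
suv_reduced = np.array([11.5, 8.7, 5.1, 22.3])     # 5-min reduced protocol

mapd = np.mean(np.abs(suv_reduced - suv_reference) / suv_reference) * 100.0
print(f"Mean absolute percentage deviation: {mapd:.1f}%")
```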


2019 ◽  
Author(s):  
Alex M. Ascension ◽  
Marcos J. Araúzo-Bravo

Abstract Big Data analysis is a discipline with a growing number of areas where huge amounts of data are extracted and analyzed. Parallelization in Python integrates the Message Passing Interface via the mpi4py module. Since mpi4py does not support the parallelization of objects larger than 2³¹ bytes, we developed BigMPI4py, a Python module that wraps mpi4py and supports object sizes beyond this boundary. BigMPI4py automatically determines the optimal object distribution strategy and also uses vectorized methods, achieving higher parallelization efficiency. BigMPI4py facilitates the implementation of Python for Big Data applications on multicore workstations and HPC systems. We validated BigMPI4py on whole-genome bisulfite sequencing (WGBS) DNA methylation ENCODE data of 59 samples from 27 human tissues. We categorized them into the three germ layers and developed a parallel implementation of the Kruskal-Wallis test to find CpGs with differential methylation across germ layers. We observed a differentiation of the germ layers, with one set of hypermethylated genes in ectoderm- and mesoderm-related tissues and another set in endoderm-related tissues. The parallel evaluation of the significance of 55 million CpGs achieved a 22x speedup with 25 cores. BigMPI4py is available at https://gitlab.com/alexmascension/bigmpi4py and the Jupyter Notebook with the WGBS analysis at https://gitlab.com/alexmascension/wgbs-analysis
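The core idea, splitting an object that would exceed MPI's 2³¹-byte limit into smaller pieces and transferring them one by one, can be sketched with plain mpi4py as below. This is a simplified illustration, not the BigMPI4py source; the chunk size, tags, and helper names are arbitrary choices for the sketch.

```python
import numpy as np
from mpi4py import MPI

CHUNK_BYTES = 2 ** 30  # stay well below the 2**31 boundary

def send_in_chunks(comm, array, dest, tag=0):
    """Send a large 1-D numpy array as several smaller chunks."""
    n_per_chunk = max(1, CHUNK_BYTES // array.itemsize)
    n_chunks = max(1, int(np.ceil(array.size / n_per_chunk)))
    pieces = np.array_split(array, n_chunks)
    comm.send(len(pieces), dest=dest, tag=tag)            # announce how many chunks follow
    for i, piece in enumerate(pieces):
        comm.Send(np.ascontiguousarray(piece), dest=dest, tag=tag + 1 + i)

def recv_in_chunks(comm, dtype, source, tag=0):
    """Receive the chunks sent by send_in_chunks and reassemble the array."""
    n_pieces = comm.recv(source=source, tag=tag)
    parts = []
    for i in range(n_pieces):
        status = MPI.Status()
        comm.Probe(source=source, tag=tag + 1 + i, status=status)
        n_items = status.Get_count(MPI.BYTE) // np.dtype(dtype).itemsize
        buf = np.empty(n_items, dtype=dtype)
        comm.Recv(buf, source=source, tag=tag + 1 + i)
        parts.append(buf)
    return np.concatenate(parts)

# Usage (run under mpiexec with at least 2 ranks):
# comm = MPI.COMM_WORLD
# if comm.rank == 0: send_in_chunks(comm, np.random.rand(10_000_000), dest=1)
# elif comm.rank == 1: data = recv_in_chunks(comm, np.float64, source=0)
```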


Author(s):  
Mohd Hafiz Fazalul Rahiman ◽  
Ruzairi Abdul Rahim ◽  
Herlina Abdul Rahim

This paper presents image reconstruction algorithms for use in ultrasonic tomography. Three types of reconstruction algorithms are considered: Linear Back Projection, Hybrid Reconstruction, and Hybrid Binary Reconstruction. The algorithms were evaluated on an ultrasonic tomography system using several known phantoms and real objects. The performance of the algorithms is analysed and discussed at the end of the paper. Key words: Reconstruction algorithm; ultrasonic tomography; image processing
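Of the three algorithms, Linear Back Projection is the simplest: each measurement is smeared back over its sensitivity map, the contributions are summed, and the result is normalised pixel by pixel. The sketch below assumes precomputed sensitivity maps and uses random placeholder data, not the paper's transducer geometry.

```python
import numpy as np

def linear_back_projection(sensitivity_maps, measurements):
    """sensitivity_maps: (n_meas, H, W); measurements: (n_meas,); returns an (H, W) image."""
    image = np.tensordot(measurements, sensitivity_maps, axes=1)  # weighted sum of maps
    norm = sensitivity_maps.sum(axis=0)                           # per-pixel normalisation
    return image / np.maximum(norm, 1e-12)

# Toy example: 32 transducer-pair sensitivity maps on a 64 x 64 grid
maps = np.random.rand(32, 64, 64)
meas = np.random.rand(32)
recon = linear_back_projection(maps, meas)
```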

