A Hardware–Software System for Tomographic Reconstruction

2003 ◽  
Vol 12 (02) ◽  
pp. 203-229 ◽  
Author(s):  
Jan Müller ◽  
Dirk Fimmel ◽  
Renate Merker ◽  
Rainer Schaffer

We present the design of a hardware–software system for the reconstruction of tomographic images. In a systematic approach we developed the parallel processor array, a reconfigurable hardware controller and processing kernel, and the software control up to the integration into a graphical user interface. The processor array, acting as a hardware accelerator, is constructed using theoretical results and methods of application-specific hardware design. The reconfigurability of the system allows a much wider range of algorithms to be used than the three reconstruction algorithms implemented so far. In the paper we discuss the system design at different levels, from algorithm transformations to board development.

Author(s):  
Wenbing Yun ◽  
Steve Wang ◽  
David Scott ◽  
Kenneth W. Nill ◽  
Waleed S. Haddad

A high-resolution table-sized X-ray nanotomography (XRMT) tool has been constructed that shows promise for nondestructively imaging the internal structure of a full IC stack with a spatial resolution better than 100 nm. Such a tool can be used to detect, localize, and characterize buried defects in the IC. By collecting a set of X-ray projections through the full IC (which may include tens of micrometers of silicon substrate and several layers of Cu interconnects) and applying tomographic reconstruction algorithms to these projections, a 3D volumetric reconstruction can be obtained and analyzed for defects using 3D visualization software. XRMT is a powerful technique that will find use in failure analysis and IC process development, and may facilitate or supplant investigations using SEM, TEM, and FIB tools, which generally require destructive sample preparation and a vacuum environment.
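
The projection-and-reconstruction workflow this abstract describes can be sketched compactly. The following Python example is a minimal stand-in rather than the authors' tool chain: it simulates parallel-beam projections of a 2D phantom slice and reconstructs it with filtered back-projection via scikit-image; the phantom, angle count, and filter choice are assumptions made for illustration.

```python
# Minimal parallel-beam tomography sketch (illustrative only):
# simulate projections of a 2D slice, then reconstruct it with
# filtered back-projection (FBP). Not the XRMT tool chain.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

slice_2d = rescale(shepp_logan_phantom(), 0.5)  # stand-in for one slice
angles = np.linspace(0.0, 180.0, 180, endpoint=False)

sinogram = radon(slice_2d, theta=angles)                    # projections
recon = iradon(sinogram, theta=angles, filter_name="ramp")  # FBP

error = np.sqrt(np.mean((recon - slice_2d) ** 2))
print(f"RMS reconstruction error: {error:.4f}")
```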


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Albert Cheu ◽  
Adam Smith ◽  
Jonathan Ullman

Local differential privacy is a widely studied restriction on distributed algorithms that collect aggregates about sensitive user data, and is now deployed in several large systems. We initiate a systematic study of a fundamental limitation of locally differentially private protocols: they are highly vulnerable to adversarial manipulation. While any algorithm can be manipulated by adversaries who lie about their inputs, we show that any noninteractive locally differentially private protocol can be manipulated to a much greater extent: when the privacy level is high, or the domain size is large, a small fraction of users in the protocol can completely obscure the distribution of the honest users' inputs. We also construct protocols that are optimally robust to manipulation for a variety of common tasks in local differential privacy. Finally, we give simple experiments validating our theoretical results and demonstrating that protocols that are optimal without manipulation can have dramatically different levels of robustness to manipulation. Our results suggest caution when deploying local differential privacy and reinforce the importance of efficient cryptographic techniques for the distributed emulation of centrally differentially private mechanisms.
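
A worked illustration of the vulnerability: in binary randomized response, the simplest noninteractive locally private protocol, the aggregator debiases reports by dividing by (p - q), a factor that shrinks toward zero as the privacy parameter ε decreases, so a few fabricated reports are greatly amplified. The sketch below simulates this; it is a generic demonstration, not the authors' construction, and the population sizes and ε are arbitrary assumptions.

```python
# Binary randomized response under manipulation (illustrative sketch).
# Honest users report truthfully with probability p = e^eps/(1+e^eps);
# the aggregator debiases by dividing by (p - q), which amplifies the
# shift introduced by users who simply report 1 regardless of input.
import numpy as np

rng = np.random.default_rng(0)
eps = 0.5                                # strong privacy (assumed value)
p = np.exp(eps) / (1.0 + np.exp(eps))    # prob. of reporting truthfully
q = 1.0 - p

n_honest, n_malicious = 10_000, 200      # 2% manipulators (assumed)
true_bits = rng.binomial(1, 0.3, size=n_honest)   # honest mean = 0.3

honest_reports = np.where(rng.random(n_honest) < p, true_bits, 1 - true_bits)
reports = np.concatenate([honest_reports, np.ones(n_malicious, dtype=int)])

# Standard unbiased estimator of the mean, applied to all reports.
estimate = (reports.mean() - q) / (p - q)
print(f"honest mean 0.300, estimate under manipulation: {estimate:.3f}")
```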


2017 ◽  
Vol 2017 ◽  
pp. 1-17 ◽  
Author(s):  
C. O. S. Sorzano ◽  
J. Vargas ◽  
J. Otón ◽  
J. M. de la Rosa-Trevín ◽  
J. L. Vilas ◽  
...  

One of the key steps in Electron Microscopy is the tomographic reconstruction of a three-dimensional (3D) map of the specimen being studied from a set of two-dimensional (2D) projections acquired at the microscope. This tomographic reconstruction may be performed with different reconstruction algorithms that can be grouped into several large families: direct Fourier inversion methods, back-projection methods, Radon methods, and iterative algorithms. In this review, we focus on the latter family of algorithms, explaining the mathematical rationale behind the different algorithms in this family as they have been introduced in the field of Electron Microscopy. We cover their use in Single Particle Analysis (SPA) as well as in Electron Tomography (ET).
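
As a concrete instance of the iterative family, the sketch below implements the classic ART (Kaczmarz) update, x ← x + λ (b_i − ⟨a_i, x⟩) a_i / ‖a_i‖², sweeping the rows of a toy linear system; the random matrix, problem sizes, and relaxation parameter are illustrative assumptions, not taken from the review.

```python
# ART (Kaczmarz) iteration on a toy tomography system Ax = b,
# sweeping one projection ray (matrix row) at a time.
import numpy as np

rng = np.random.default_rng(1)
n = 16                                  # unknowns (4x4 image, flattened)
m = 32                                  # measured rays
A = rng.random((m, n))                  # toy projection matrix (assumed)
x_true = rng.random(n)
b = A @ x_true                          # noiseless measurements

x = np.zeros(n)
lam = 1.0                               # relaxation parameter
for sweep in range(50):
    for i in range(m):                  # one row (ray) per update
        a_i = A[i]
        x += lam * (b[i] - a_i @ x) / (a_i @ a_i) * a_i

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```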


2019 ◽  
Vol 2019 ◽  
pp. 1-9 ◽  
Author(s):  
Shiyu Yan ◽  
Xiaohua Yang ◽  
Guodong Cheng ◽  
Hua Liu

In the verification testing of scientific computing programs, various comparison methods are commonly applied to check the correctness of the computations. However, it is often difficult to verify whether the test output is correct, because oracles that provide the expected output are not always available or are too costly to obtain. For this reason, the authors focus on using Richardson extrapolation to estimate the convergence of the numerical solution on different levels of mesh refinement. These numerical convergence properties can be applied to verification testing without the need for oracles. In the present study, the authors take the program test of the multigroup neutron diffusion equations as a case study and propose a Richardson extrapolation-based verification method. Three verification criteria are obtained from this approach. In addition, a test experiment is conducted demonstrating the validity of our theoretical results.
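
The core idea is that, for a discretization of order p, solutions on meshes of spacing h, h/2, and h/4 let one both estimate the observed order of convergence and extrapolate toward the exact value, giving a correctness check without an oracle. The sketch below applies this to a central-difference derivative; the target function and step sizes are assumptions made for illustration, not the paper's neutron-diffusion case.

```python
# Richardson extrapolation as an oracle-free check: estimate the
# observed order of convergence from three mesh levels and compare
# it with the scheme's theoretical order (2 for central differences).
import math

def central_diff(f, x, h):
    return (f(x + h) - f(x - h)) / (2.0 * h)

f, x = math.sin, 1.0
exact = math.cos(x)               # known here, unknown in practice

h = 0.1
f1 = central_diff(f, x, h)        # coarse mesh
f2 = central_diff(f, x, h / 2)    # medium mesh
f3 = central_diff(f, x, h / 4)    # fine mesh

# Observed order from successive differences (no exact solution needed).
p_obs = math.log(abs(f1 - f2) / abs(f2 - f3)) / math.log(2.0)

# Richardson-extrapolated value using the observed order.
extrap = f3 + (f3 - f2) / (2.0 ** p_obs - 1.0)

print(f"observed order: {p_obs:.3f} (theory: 2)")
print(f"extrapolated: {extrap:.10f}, exact: {exact:.10f}")
```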


Author(s):  
Robert C. Atwood ◽  
Andrew J. Bodey ◽  
Stephen W. T. Price ◽  
Mark Basham ◽  
Michael Drakopoulos

Tomographic datasets collected at synchrotrons are becoming very large and complex, and therefore need to be managed efficiently. Raw images may have high pixel counts, and each pixel can be multidimensional and associated with additional data such as those derived from spectroscopy. In time-resolved studies, hundreds of tomographic datasets can be collected in sequence, yielding terabytes of data. Users of tomographic beamlines are drawn from various scientific disciplines, and many are keen to use tomographic reconstruction software that does not require a deep understanding of reconstruction principles. We have developed Savu, a reconstruction pipeline that enables users to reconstruct data rapidly and consistently produce high-quality results. Savu is designed to work in an 'orthogonal' fashion, meaning that data can be converted between projection and sinogram space throughout the processing workflow as required. The Savu pipeline is modular and allows processing strategies to be optimized for users' purposes. In addition to the reconstruction algorithms themselves, it can include modules for the identification of experimental problems, artefact correction, general image processing and data quality assessment. Savu is open source, open licensed and 'facility-independent': it can run on standard cluster infrastructure at any institution.
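
The modular, 'orthogonal' design described here can be illustrated schematically. The Python sketch below is a generic stand-in, not Savu's actual API: each stage declares whether it wants data in projection or sinogram space, and the runner transposes between the two views (for parallel-beam data, a simple axis swap) as needed. All class and method names are hypothetical.

```python
# Generic plugin-chain sketch of an 'orthogonal' pipeline (hypothetical
# names; NOT Savu's real API). Parallel-beam data of shape
# (angles, rows, cols) is viewed as one projection per angle, or as
# one sinogram per detector row by swapping the first two axes.
import numpy as np

class Stage:
    space = "projection"                 # or "sinogram"
    def run(self, data: np.ndarray) -> np.ndarray:
        raise NotImplementedError

class NormalizeProjections(Stage):
    space = "projection"
    def run(self, data):
        return data / data.max()         # toy flat-field stand-in

class RemoveRingArtefacts(Stage):
    space = "sinogram"
    def run(self, data):
        # Toy ring suppression: remove each sinogram column's mean
        # over angles (rings appear as constant detector offsets).
        return data - data.mean(axis=1, keepdims=True)

def run_pipeline(data, stages):
    current = "projection"               # raw data arrives as projections
    for stage in stages:
        if stage.space != current:       # transpose between the two views
            data = np.swapaxes(data, 0, 1)
            current = stage.space
        data = stage.run(data)
    return data

out = run_pipeline(np.random.rand(180, 64, 64),
                   [NormalizeProjections(), RemoveRingArtefacts()])
print(out.shape)
```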


1981 ◽  
Vol 14 (2) ◽  
pp. 1715-1720
Author(s):  
S. Koyama ◽  
K. Makino ◽  
N. Miki ◽  
Y. Iino ◽  
Y. Iseki

2018 ◽  
Vol 25 (1) ◽  
pp. 248-256
Author(s):  
Camila de Lima ◽  
Elias Salomão Helou

Iterative methods for tomographic image reconstruction have the computational cost of each iteration dominated by the computation of the (back)projection operator, which takes roughly O(N^3) floating point operations (flops) for N×N pixel images. Furthermore, classical iterative algorithms may require too many iterations to achieve acceptable images, making these techniques impractical for high-resolution images. Techniques have been developed in the literature to reduce the computational cost of the (back)projection operator to O(N^2 log N) flops. Also, incremental algorithms have been devised that reduce by an order of magnitude the number of iterations required to achieve acceptable images. The present paper introduces an incremental algorithm with a cost of O(N^2 log N) flops per iteration and applies it to the reconstruction of very large tomographic images obtained from synchrotron-light-illuminated data.
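
Incremental (ordered-subsets) methods obtain their speedup by updating the image after each small block of projection angles instead of once per full sweep. The sketch below shows that structure using scikit-image's radon/iradon as a generic projector pair; it is a schematic of the incremental idea, not the paper's O(N^2 log N) algorithm, and the subset count and relaxation value are assumed.

```python
# Incremental (ordered-subsets) reconstruction skeleton: the image is
# updated after every small block of projection angles, so corrections
# arrive many times per sweep instead of once per full iteration.
# Schematic of the incremental idea only; not the paper's algorithm.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

image = rescale(shepp_logan_phantom(), 0.25)
angles = np.linspace(0.0, 180.0, 120, endpoint=False)
sinogram = radon(image, theta=angles)

n_subsets = 10                                  # assumed subset count
# Interleave angles so each subset spans the full angular range.
subsets = [np.arange(len(angles))[i::n_subsets] for i in range(n_subsets)]

x = np.zeros_like(image)
relax = 0.5                                     # assumed relaxation
for sweep in range(5):
    for idx in subsets:
        theta_s = angles[idx]
        residual = sinogram[:, idx] - radon(x, theta=theta_s)
        # Filtered back-projection of the subset residual as the update.
        x += relax * iradon(residual, theta=theta_s,
                            output_size=image.shape[0])

print("relative error:",
      np.linalg.norm(x - image) / np.linalg.norm(image))
```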

