Three Dimensional Deconvolution Of Microscope Data

1997 ◽  
Vol 5 (5) ◽  
pp. 16-17
Author(s):  
Michael Richardson

Advances in computer processing power have enabled new three-dimensional image enhancement and visualization techniques. Data storage capacities of a gigabyte or more are affordable and commonplace. Power Macintosh computers use RISC-based processors that perform graphical and computational tasks many times faster than the older 68000-based systems. These new systems are ideal platforms for three-dimensional deconvolution of microscope data sets.
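For context, three-dimensional deconvolution reassigns out-of-focus light in an image stack using the microscope's point-spread function (PSF). Below is a minimal sketch of the iterative Richardson-Lucy algorithm in Python/NumPy; it is illustrative only (not the method described in the article) and assumes the PSF array is already centred at the array origin.

```python
import numpy as np

def richardson_lucy_3d(stack, psf, num_iter=20):
    """Iterative Richardson-Lucy deconvolution of a 3D microscope stack.

    Assumes `stack` and `psf` are non-negative float arrays and that the
    PSF is centred at the array origin (apply np.fft.ifftshift if it is not).
    """
    psf = psf / psf.sum()
    otf = np.fft.rfftn(psf, stack.shape)              # optical transfer function
    estimate = np.full(stack.shape, stack.mean())
    for _ in range(num_iter):
        blurred = np.fft.irfftn(np.fft.rfftn(estimate) * otf, stack.shape)
        ratio = stack / (blurred + 1e-12)              # guard against division by zero
        # Correlation with the PSF (conjugate OTF) redistributes the correction
        estimate *= np.fft.irfftn(np.fft.rfftn(ratio) * np.conj(otf), stack.shape)
    return estimate
```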

2020 ◽  
pp. paper46-1-paper46-10
Author(s):  
Ilya Rylskiy

Over the past 25 years, laser scanning has evolved from an experimental method into a fully fledged family of Earth remote sensing methods. This group of methods now provides the most accurate and detailed spatial data sets, while the cost of the data keeps falling and the number of measuring instruments (laser scanners) keeps growing. The volumes of data that will be obtained during surveys in the coming decades will allow the creation of the first sub-global coverage of the planet. However, the flip side of high accuracy and detail is the need to store extremely large volumes of three-dimensional data without loss of accuracy. At the same time, the ability to work with these data in both 2D and 3D mode should be improved. Standard storage methods (flat files, geodatabases, archiving, etc.) solve the problem only partially. There are, however, alternative methods that can remove the current restrictions and lead to more flexible and functional spatial data infrastructures. Among the most flexible and promising approaches to storing and processing laser data are quadtree- and octree-based structures. These approaches are more complex than the flat file structures commonly used for LIDAR data storage, but they address several typical shortcomings of point datasets (processing speed, lack of topological structure, limited precision, etc.).
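As a minimal illustration of the octree approach mentioned above (a sketch, not the author's implementation), the structure below subdivides space recursively and caps the number of points stored per leaf; class and parameter names are chosen for the example.

```python
class OctreeNode:
    """Minimal point octree: each leaf holds at most `capacity` points."""

    def __init__(self, center, half_size, capacity=64, max_depth=12):
        self.center, self.half_size = center, half_size
        self.capacity, self.max_depth = capacity, max_depth
        self.points = []          # (x, y, z) tuples stored in this leaf
        self.children = None      # 8 child nodes once the leaf is split

    def insert(self, p, depth=0):
        if self.children is None:
            self.points.append(p)
            if len(self.points) > self.capacity and depth < self.max_depth:
                self._split(depth)
            return
        self.children[self._octant(p)].insert(p, depth + 1)

    def _octant(self, p):
        cx, cy, cz = self.center
        return (p[0] > cx) | ((p[1] > cy) << 1) | ((p[2] > cz) << 2)

    def _split(self, depth):
        cx, cy, cz = self.center
        h = self.half_size / 2.0
        self.children = [
            OctreeNode((cx + (dx * 2 - 1) * h,
                        cy + (dy * 2 - 1) * h,
                        cz + (dz * 2 - 1) * h), h,
                       self.capacity, self.max_depth)
            for dz in (0, 1) for dy in (0, 1) for dx in (0, 1)
        ]
        for q in self.points:
            self.children[self._octant(q)].insert(q, depth + 1)
        self.points = []
```

A production LIDAR store would additionally page nodes to disk and keep per-node metadata (bounds, point counts, level of detail), but the spatial subdivision logic is the same.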


2017 ◽  
Vol 50 (5) ◽  
pp. 1267-1279 ◽  
Author(s):  
Patrick G. Callahan ◽  
McLean P. Echlin ◽  
Jean Charles Stinville ◽  
Tresa M. Pollock ◽  
Saransh Singh ◽  
...  

This paper applies the three-dimensional visualization techniques explored theoretically by Callahan, Echlin, Pollock, Singh & De Graef [J. Appl. Cryst. (2017), 50, 430–440] to a series of experimentally acquired texture data sets, namely a sharp cube texture in a single-crystal Ni-based superalloy, a sharp Goss texture in single-crystal Nb, a random texture in a powder metallurgy polycrystalline René 88-DT alloy and a rolled plate texture in Ti-6Al-4V. Three-dimensional visualizations are shown (and made available as movies as supplementary material) using the Rodrigues, Euler and three-dimensional stereographic projection representations. In addition, it is shown that the true symmetry of Euler space, as derived from a mapping onto quaternion space, is described by the monoclinic color space group Pcc in the Opechowski and Guccione nomenclature.
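For readers less familiar with the orientation representations named above, the sketch below (Python/SciPy) converts Bunge-convention Euler angles to a unit quaternion and then to the Rodrigues vector used in Rodrigues space; the ZXZ intrinsic convention is an assumption made for this example and is not taken from the paper.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def euler_to_rodrigues(phi1, Phi, phi2):
    """Bunge Euler angles (radians) -> unit quaternion -> Rodrigues vector."""
    rot = Rotation.from_euler("ZXZ", [phi1, Phi, phi2])   # intrinsic ZXZ (Bunge)
    x, y, z, w = rot.as_quat()                            # scalar-last quaternion
    # Rodrigues vector r = tan(omega/2) * n = (vector part) / (scalar part).
    # Note: 180-degree rotations (w = 0) lie at infinity in Rodrigues space.
    return np.array([x, y, z]) / w

# Example: the cube orientation (all Euler angles zero) maps to r = (0, 0, 0)
print(euler_to_rodrigues(0.0, 0.0, 0.0))
```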


2018 ◽  
Vol 186 ◽  
pp. 02001 ◽  
Author(s):  
M. Buga ◽  
P. Fernique ◽  
C. Bot ◽  
M. G. Allen ◽  
F. Bonnarel ◽  
...  

High-speed Internet and the increasing cost-effectiveness of data storage have changed the way data are managed today. Large amounts of heterogeneous data can now be visualized easily and rapidly using interactive applications such as “Google Maps”. In this respect, the Hierarchical Progressive Survey (HiPS) method has been developed by the Centre de Données astronomiques de Strasbourg (CDS) since 2009. HiPS uses the hierarchical sky tessellation called HEALPix to describe and organize images, data cubes or source catalogs. These HiPS can be accessed and visualized using applications such as Aladin. We show that structuring the data using HiPS enables easy and quick access to large and complex sets of astronomical data. As with bibliographic and catalog data, full documentation and comprehensive metadata are absolutely required for pertinent usage of these data. Hence the role of documentalists in the process of producing HiPS is essential. We present the interaction between documentalists and other specialists who are all part of the CDS team and support this process. More precisely, we describe the tools used by the documentalists to generate HiPS or to update the Virtual Observatory standardized descriptive information (the “metadata”). We also present the challenges faced by the documentalists processing such heterogeneous data on scales from megabytes up to petabytes. On the one hand, documentalists at CDS manage small textual or numerical data sets for one or a few astronomical objects. On the other hand, they process large data sets such as big catalogs containing heterogeneous data like spectra, images or data cubes for millions of astronomical objects. Finally, by participating in the development of interactive visualization of images or three-dimensional data cubes using the HiPS method, documentalists contribute to the long-term management of complex, large astronomical data.
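To make the HEALPix-based organisation concrete, a HiPS tile at order k corresponds to a HEALPix pixel at nside = 2^k in the NESTED scheme. The sketch below (Python, healpy) locates the tile containing a given sky position, assuming the usual Norder/Dir/Npix directory layout; it is an illustration, not CDS production code.

```python
import numpy as np
import healpy as hp

def hips_tile_path(ra_deg, dec_deg, order, ext="fits"):
    """Return the relative path of the HiPS tile containing a sky position."""
    nside = hp.order2nside(order)          # nside = 2**order
    theta = np.radians(90.0 - dec_deg)     # colatitude in radians
    phi = np.radians(ra_deg)
    ipix = hp.ang2pix(nside, theta, phi, nest=True)
    # Tiles are grouped into directories of 10000 pixels each
    return f"Norder{order}/Dir{(ipix // 10000) * 10000}/Npix{ipix}.{ext}"

# Example: which order-3 tile covers RA = 83.6 deg, Dec = 22.0 deg?
print(hips_tile_path(83.6, 22.0, 3))
```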


2018 ◽  
Vol 7 (3.31) ◽  
pp. 59
Author(s):  
N Deshai ◽  
S Venkataramana ◽  
I Hemalatha ◽  
G P. S. Varma

The current tera- to zettabyte era has been created by the huge volumes of data continuously collected from social networks, machine-to-machine devices, Google, Yahoo, sensors and other sources, collectively called big data. Storage capacity, processing power, data availability and the overall volume of digital data continue to grow rapidly. Apache Hadoop is the leading tool for handling such huge data sets through its most popular components, HDFS and MapReduce, which provide efficient storage and efficient processing of massive data volumes. Designing effective scheduling algorithms and selecting appropriate nodes are key factors for optimizing and achieving high performance in big data systems. This paper provides a survey and overview of these scheduling algorithms and identifies their advantages and disadvantages.
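To make the MapReduce processing model mentioned above concrete, here is a minimal Hadoop Streaming word-count job written in Python (a generic illustration, not code from the paper): the mapper emits key/value lines, Hadoop sorts them by key, and the reducer aggregates per key.

```python
#!/usr/bin/env python3
# mapper.py -- emit "<word>\t1" for every word read from standard input
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
#!/usr/bin/env python3
# reducer.py -- sum the counts per word (input arrives sorted by key)
import sys

current, total = None, 0
for line in sys.stdin:
    if not line.strip():
        continue
    word, count = line.rsplit("\t", 1)
    if word != current:
        if current is not None:
            print(f"{current}\t{total}")
        current, total = word, 0
    total += int(count)
if current is not None:
    print(f"{current}\t{total}")
```

Submitted with a command along the lines of hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py -input /data/in -output /data/out (paths illustrative), HDFS provides the distributed storage while the job scheduler decides on which nodes the map and reduce tasks run; that placement decision is exactly what the surveyed scheduling algorithms try to optimize.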


Author(s):  
Mark Ellisman ◽  
Maryann Martone ◽  
Gabriel Soto ◽  
Eliezer Masliah ◽  
David Hessler ◽  
...  

Structurally oriented biologists examine cells, tissues, organelles and macromolecules in order to gain insight into cellular and molecular physiology by relating structure to function. The understanding of these structures can be greatly enhanced by the use of techniques for the visualization and quantitative analysis of three-dimensional structure. Three projects from current research activities will be presented in order to illustrate both the present capabilities of computer-aided techniques and their limitations and future possibilities. The first project concerns the three-dimensional reconstruction of the neuritic plaques found in the brains of patients with Alzheimer's disease. We have developed a software package, “Synu”, for the investigation of 3D data sets, which has been used in conjunction with laser confocal light microscopy to study the structure of the neuritic plaque. Tissue sections of autopsy samples from patients with Alzheimer's disease were double-labeled for tau, a cytoskeletal marker for abnormal neurites, and synaptophysin, a marker of presynaptic terminals.


Author(s):  
G. Jacobs ◽  
F. Theunissen

In order to understand how the algorithms underlying neural computation are implemented within any neural system, it is necessary to understand details of the anatomy, physiology and global organization of the neurons from which the system is constructed. Information is represented in neural systems by patterns of activity that vary in both their spatial extent and in the time domain. One of the great challenges to microscopists is to devise methods for imaging these patterns of activity and to correlate them with the underlying neuroanatomy and physiology. We have addressed this problem by using a combination of three-dimensional reconstruction, quantitative analysis and computer visualization techniques to build a probabilistic atlas of a neural map in an insect sensory system. The principal goal of this study was to derive a quantitative representation of the map, based on a uniform sample of afferents that was of sufficient size to allow statistically meaningful analyses of the relationships between structure and function.
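As a purely schematic illustration (not the authors' pipeline) of how a probabilistic map can be built from a sample of reconstructed afferents, one can accumulate each afferent's 3D points into a common grid and normalise so that each cell holds an occupancy probability; the function and parameter names below are invented for the example.

```python
import numpy as np

def probabilistic_atlas(afferent_point_sets, bins=(64, 64, 32), extent=None):
    """Stack many afferents' 3D points into one normalised density grid.

    `afferent_point_sets` is a list of (N_i, 3) arrays already registered to a
    common frame; `extent` is ((xmin, xmax), (ymin, ymax), (zmin, zmax)) or None.
    """
    density = np.zeros(bins)
    for pts in afferent_point_sets:
        hist, _ = np.histogramdd(pts, bins=bins, range=extent)
        density += hist / max(hist.sum(), 1.0)   # each afferent contributes equally
    return density / len(afferent_point_sets)    # probability-like occupancy map
```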


Author(s):  
Yuancheng Li ◽  
Yaqi Cui ◽  
Xiaolong Zhang

Background: Advanced Metering Infrastructure (AMI) for the smart grid is growing rapidly, which results in exponential growth of the data collected and transmitted by these devices. Clustering these data can give the electricity company a better understanding of the personalized and differentiated needs of its users. Objective: Existing clustering algorithms for processing such data generally suffer from problems such as insufficient data utilization, high computational complexity and low accuracy of behavior recognition. Methods: In order to improve clustering accuracy, this paper proposes a new clustering method based on the electrical behavior of the user. Starting from an analysis of user load characteristics, samples of user electricity data were constructed. The daily load characteristic curve was extracted through an improved extreme learning machine clustering algorithm and effective index criteria. Moreover, clustering analysis was carried out for different users from industrial, commercial and residential areas. The improved extreme learning machine algorithm, also called Unsupervised Extreme Learning Machine (US-ELM), is an extension and improvement of the original Extreme Learning Machine (ELM) that performs the unsupervised clustering task on the basis of the original ELM. Results: Experiments were carried out in MATLAB on four different data sets and the method was compared with other commonly used clustering algorithms. The experimental results show that the US-ELM algorithm has higher accuracy in processing power data. Conclusion: The unsupervised ELM algorithm can greatly reduce time consumption and improve the effectiveness of clustering.
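A simplified sketch of the unsupervised ELM idea described above: an untrained random hidden layer provides the ELM feature mapping, a graph-Laplacian-regularised generalised eigenproblem provides a low-dimensional embedding, and k-means clusters the embedded load curves. All parameter values are illustrative and this is not claimed to reproduce the authors' exact formulation.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.sparse.csgraph import laplacian
from sklearn.cluster import KMeans
from sklearn.neighbors import kneighbors_graph

def us_elm_cluster(X, n_clusters=4, n_hidden=200, n_embed=3, lam=0.1, k=10, seed=0):
    rng = np.random.default_rng(seed)
    # ELM-style random hidden layer: H = sigmoid(X W + b), weights never trained
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    # Graph Laplacian of a symmetrised k-nearest-neighbour graph over the samples
    A = kneighbors_graph(X, k, include_self=False)
    L = laplacian(0.5 * (A + A.T)).toarray()
    # Smallest generalised eigenvectors of (I + lam * H^T L H) v = gamma * (H^T H) v
    lhs = np.eye(n_hidden) + lam * (H.T @ L @ H)
    rhs = H.T @ H + 1e-6 * np.eye(n_hidden)       # small ridge for numerical stability
    _, vecs = eigh(lhs, rhs)
    embedding = H @ vecs[:, 1:n_embed + 1]         # drop the first (trivial) vector
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(embedding)

# Usage: labels = us_elm_cluster(daily_load_curves)  # one row per daily load profile
```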


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Jonas Albers ◽  
Angelika Svetlove ◽  
Justus Alves ◽  
Alexander Kraupner ◽  
Francesca di Lillo ◽  
...  

Although X-ray based 3D virtual histology is an emerging tool for the analysis of biological tissue, it falls short in terms of specificity when compared to conventional histology. Thus, the aim was to establish a novel approach that combines the 3D information provided by microCT with the high specificity that only (immuno-)histochemistry can offer. For this purpose, we developed a software frontend, which utilises an elastic transformation technique to accurately co-register various histological and immunohistochemical stainings with free-propagation phase-contrast synchrotron radiation microCT. We demonstrate that the precision of the overlay of both imaging modalities is significantly improved by performing our elastic registration workflow, as evidenced by calculation of the displacement index. To illustrate the need for an elastic co-registration approach we examined specimens from a mouse model of breast cancer with injected metal-based nanoparticles. Using the elastic transformation pipeline, we were able to co-localise the nanoparticles with specifically stained cells or tissue structures in their three-dimensional anatomical context. Additionally, we performed a semi-automated tissue structure and cell classification. This workflow provides new insights into histopathological analysis by combining CT-specific three-dimensional information with the cell- and tissue-specific information provided by classical histology.
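The registration frontend described above is the authors' own software; as a generic illustration of what elastic (B-spline) co-registration of a histology section to its matching virtual microCT slice can look like, here is a SimpleITK sketch with placeholder file names and parameters.

```python
import SimpleITK as sitk

# Fixed image: virtual microCT slice; moving image: stained histology section
fixed = sitk.ReadImage("microct_slice.tif", sitk.sitkFloat32)
moving = sitk.ReadImage("histology_slice.tif", sitk.sitkFloat32)

# Initialise a coarse B-spline (elastic) transform over the fixed-image domain
transform = sitk.BSplineTransformInitializer(fixed, [8, 8])

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsLBFGSB(gradientConvergenceTolerance=1e-5, numberOfIterations=100)
reg.SetInitialTransform(transform, inPlace=True)
reg.SetInterpolator(sitk.sitkLinear)

final_transform = reg.Execute(fixed, moving)

# Resample the histology into the microCT frame for overlay and co-localisation
warped = sitk.Resample(moving, fixed, final_transform, sitk.sitkLinear, 0.0)
sitk.WriteImage(warped, "histology_registered.tif")
```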


Atmosphere ◽  
2021 ◽  
Vol 12 (7) ◽  
pp. 906
Author(s):  
Ivan Bašták Ďurán ◽  
Martin Köhler ◽  
Astrid Eichhorn-Müller ◽  
Vera Maurer ◽  
Juerg Schmidli ◽  
...  

The single-column mode (SCM) of the ICON (ICOsahedral Nonhydrostatic) modeling framework is presented. The primary purpose of the ICON SCM is to serve as a tool for research, model evaluation and development. Thanks to the simplified geometry of the ICON SCM, various aspects of the ICON model, in particular the model physics, can be studied in a well-controlled environment. Additionally, the ICON SCM has a reduced computational cost and a low data storage demand. The ICON SCM can be utilized for idealized cases (several well-established cases are already included) or for semi-realistic cases based on analyses or model forecasts. As the case setup is defined by a single NetCDF file, new cases can be prepared easily by modifying this file. We demonstrate the usage of the ICON SCM for different idealized cases such as shallow convection, stratocumulus clouds, and radiative transfer. Additionally, the ICON SCM is tested for a semi-realistic case together with an equivalent three-dimensional setup and the large eddy simulation mode of ICON. Such consistent comparisons across the hierarchy of ICON configurations are very helpful for model development. The ICON SCM will be implemented into the operational ICON model and will serve as an additional tool for advancing the development of the ICON model.
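Because an SCM case is fully described by a single NetCDF file, a new case can be derived by editing that file programmatically. The sketch below uses the netCDF4 Python package; the variable name sfc_temp and the +2 K perturbation are hypothetical stand-ins, since the actual variable names in ICON SCM case files are not given here.

```python
from netCDF4 import Dataset

# Open an existing SCM case file in-place ("r+" leaves everything else untouched)
with Dataset("scm_case.nc", "r+") as case:
    print(case.variables.keys())          # inspect which forcing variables exist
    # Hypothetical example: perturb a surface-temperature forcing by +2 K
    if "sfc_temp" in case.variables:
        case.variables["sfc_temp"][:] = case.variables["sfc_temp"][:] + 2.0
```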

