D-buffer: irregular image data storage made practical

2013 ◽  
Vol 21 (1) ◽  
Author(s):  
J. Lipowski

Abstract: Modern hardware-accelerated graphics pipelines are designed to operate on data in a so-called streaming model. To process data in this model, one must impose restrictions on the memory structure of the input and output arguments (most frequently represented by a two-dimensional frame buffer). The regularity of the output data is obvious when we consider rasterizing hardware architectures, which draw 3D polygons using a depth buffer to resolve the visible-surface problem. Recently, however, users' needs have surpassed those restrictions with increasing frequency. In this work we formulate and present new methods of irregular frame buffer storage and ordering. The so-called deque buffer (or D-buffer) allows us to decrease both the amount of memory used for storage and the memory-latency cost by ordering pixel data. Our findings are confirmed by experimental results measuring processing time, which is up to four times shorter than in previous work by other authors. We also include a detailed description of the algorithms used for D-buffer construction on the last three consumer-grade graphics hardware architectures, as a guide for other researchers and a development aid for practitioners. The only theoretical requirement imposed by our method is a memory model with a linear address space.
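The abstract does not give the GPU construction details, but the core idea of storing variable-length per-pixel data in a linear address space can be sketched on the CPU with a counting pass and a prefix sum (all names here are illustrative, not the paper's API):

```python
# Hypothetical CPU sketch: pack irregular per-pixel fragment lists into one
# linear array via counting and prefix sums. The actual D-buffer construction
# is GPU-specific and described in the paper itself.

def pack_fragments(width, height, fragments):
    """fragments: list of (x, y, value) triples in arbitrary order."""
    n = width * height
    counts = [0] * n
    for x, y, _ in fragments:          # pass 1: count fragments per pixel
        counts[y * width + x] += 1
    # Exclusive prefix sum gives each pixel's start offset in linear memory.
    offsets = [0] * (n + 1)
    for i in range(n):
        offsets[i + 1] = offsets[i] + counts[i]
    packed = [None] * offsets[n]
    cursor = offsets[:n]               # per-pixel write cursors
    for x, y, value in fragments:      # pass 2: scatter into packed storage
        p = y * width + x
        packed[cursor[p]] = value
        cursor[p] += 1
    # Fragments for pixel p occupy packed[offsets[p]:offsets[p + 1]].
    return packed, offsets
```

The two-pass structure mirrors how irregular data is commonly laid out in a single linear allocation without per-pixel overallocation.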

Author(s):  
Richard S. Chemock

One of the most common tasks in a typical analysis lab is the recording of images. Many analytical techniques (TEM, SEM, and metallography, for example) produce images as their primary output. Until recently, the most common method of recording images was film. Current PS/2 systems offer very large capacity data-storage devices and high-resolution displays, making it practical to work with analytical images on PS/2s and thereby sidestep the traditional film and darkroom steps. This change in operational mode offers many benefits: cost savings, throughput, archiving and searching capabilities, as well as direct incorporation of the image data into reports. The conventional way to record images involves film, either sheet film (with its associated wet chemistry) for TEM or Polaroid film for SEM and light microscopy. Although film is inconvenient, it has the highest quality of all available image-recording techniques. The fine-grained film used for TEM has a resolution that would exceed a 4096×4096×16-bit digital image.


Author(s):  
Klaus-Ruediger Peters

Differential hysteresis processing is a new image-processing technology that provides a tool for displaying image data at any level of differential contrast resolution. This includes the maximum contrast resolution of the acquisition system, which may be 1,000 times higher than that of the visual system (16 bit versus 6 bit). All microscopes acquire high-precision contrasts at a level of <0.01–25% of the acquisition range in 16- to 8-bit data, but these contrasts are mostly invisible or only partially visible, even in conventionally enhanced images. The processing principle of the differential hysteresis tool is based on hysteresis properties of intensity variations within an image. Differential hysteresis image processing moves a cursor of selected intensity range (the hysteresis range) along lines through the image data, reading each successive pixel intensity. The midpoint of the cursor provides the output data. If the intensity value of the following pixel falls outside the current cursor endpoint values, the cursor follows the data with either its top or its bottom; but if the pixel's intensity value falls within the cursor range, the cursor maintains its intensity value.
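The cursor behaviour described above can be sketched directly for a single image line; this is a minimal illustration assuming the cursor starts at the first pixel's intensity (an assumption, since the abstract does not specify initialisation):

```python
def differential_hysteresis(line, h):
    """Sketch of the hysteresis cursor described in the abstract.
    line: sequence of pixel intensities; h: hysteresis range width.
    Returns the cursor midpoint for each pixel (the output data)."""
    lo = line[0]          # cursor bottom (initialisation is our assumption)
    hi = lo + h           # cursor top
    out = []
    for v in line:
        if v > hi:        # pixel above cursor: cursor follows with its top
            hi = v
            lo = v - h
        elif v < lo:      # pixel below cursor: cursor follows with its bottom
            lo = v
            hi = v + h
        # otherwise the pixel falls within the cursor range:
        # the cursor maintains its intensity value
        out.append((lo + hi) / 2)
    return out
```

Small intensity variations inside the hysteresis range are thus flattened, while larger excursions drag the cursor along, which is what isolates contrast at the selected differential level.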


2020 ◽  
Vol 12 (20) ◽  
pp. 3341
Author(s):  
Ryan L. Crumley ◽  
Ross T. Palomaki ◽  
Anne W. Nolin ◽  
Eric A. Sproles ◽  
Eugene J. Mar

Snow is a critical component of the climate system, provides fresh water for millions of people globally, and affects forest and wildlife ecology. Snowy regions are typically data sparse, especially in mountain environments. Remotely sensed snow cover data are available globally but are challenging to convert into accessible, actionable information. SnowCloudMetrics is a web portal for on-demand production and delivery of snow information, including snow cover frequency (SCF) and snow disappearance date (SDD), using Google Earth Engine (GEE). SCF and SDD are computed using the Moderate Resolution Imaging Spectroradiometer (MODIS) Snow Cover Binary 500 m (MOD10A1) product. The SCF and SDD metrics are assessed using 18 years of Snow Telemetry records at more than 750 stations across the Western U.S. SnowCloudMetrics provides users with the capacity to quickly and efficiently generate local-to-global scale snow information. It requires no user-side data storage or computing capacity, and needs little in the way of remote sensing expertise. SnowCloudMetrics allows users to subset by year, watershed, elevation range, political boundary, or user-defined region. Users can explore the snow information via a GEE map interface and, if desired, download scripts for access to tabular and image data in non-proprietary formats for additional analyses. We present global and hemispheric scale examples of SCF and SDD. We also provide a watershed example in the transboundary, snow-dominated Amu Darya Basin. Our approach represents a new, user-driven paradigm for access to snow information. SnowCloudMetrics benefits snow scientists, water resource managers, climate scientists, and snow-related industries by providing SCF and SDD information tailored to their needs, especially in data sparse regions.
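As a rough illustration (not the portal's actual GEE code), the two metrics can be computed from a daily binary snow-cover series for one pixel or station; defining SDD as the day after the last snow-covered observation is our assumption:

```python
from datetime import date, timedelta

def scf_and_sdd(start, daily_snow):
    """daily_snow: list of 0/1 flags (1 = snow-covered), one per day
    starting at `start`.
    SCF = fraction of observed days that are snow-covered.
    SDD = the date after the last snow-covered day (None if never snowy).
    This is an illustrative definition, not the paper's exact one."""
    scf = sum(daily_snow) / len(daily_snow)
    sdd = None
    for i, flag in enumerate(daily_snow):
        if flag:
            sdd = start + timedelta(days=i + 1)
    return scf, sdd
```

In the real system these reductions run per pixel over the MOD10A1 image stack inside Earth Engine, so no data ever needs to reach the user's machine.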


2019 ◽  
Vol 147 (2) ◽  
pp. 677-689 ◽  
Author(s):  
Peter D. Düben ◽  
Martin Leutbecher ◽  
Peter Bauer

Abstract: Data storage and data processing generate significant costs for weather and climate modeling centers. The volume of data that must be stored and disseminated to end users increases with increasing model resolution and the use of larger forecast ensembles. If the precision of the data is reduced, costs can be reduced accordingly. In this paper, three new methods that allow a reduction in precision with minimal loss of information are suggested and tested. Two of these methods rely on the similarities between ensemble members in ensemble forecasts. Precision is therefore high at the beginning of forecasts, when ensemble members are more similar and sufficient distinction must be provided, and decreases with increasing ensemble spread. Keeping precision high in predictable situations and low elsewhere appears to be a useful approach to optimizing data storage in weather forecasts. All methods are tested with data from operational weather forecasts of the European Centre for Medium-Range Weather Forecasts.
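A minimal sketch of spread-dependent precision, using a simple rule of our own (not the paper's formulas): keep roughly enough significand bits to resolve the ensemble mean relative to its spread, then round values to that many significant binary digits:

```python
import math

def required_bits(spread, mean, max_bits=16, min_bits=2):
    """Hypothetical rule: fewer significand bits as the ensemble spread
    grows relative to the ensemble mean."""
    if spread <= 0:
        return max_bits
    ratio = abs(mean) / spread if mean else 1.0
    bits = int(math.ceil(math.log2(ratio))) + min_bits
    return max(min_bits, min(max_bits, bits))

def quantize(x, bits):
    """Round x to `bits` significant binary digits."""
    if x == 0:
        return 0.0
    e = math.floor(math.log2(abs(x)))   # exponent of the leading bit
    scale = 2.0 ** (e - bits + 1)
    return round(x / scale) * scale
```

Early in the forecast the spread is small, so `required_bits` stays near the maximum; as members diverge, the stored precision, and hence the compressed data volume, drops.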


2015 ◽  
Vol 742 ◽  
pp. 721-725
Author(s):  
Xiao Qing Zhou ◽  
Jia Xiu Sun ◽  
Xing Xian Luo

With the rapid development and deep penetration of the Internet, the problem of mass image data storage has become prominent, exposing the low management efficiency, limited storage capacity, and high cost of traditional storage frameworks. The appearance of Hadoop provides a new approach. However, Hadoop itself is not suited to handling small files. This paper puts forward a storage framework for mass image files based on Hadoop, and relieves the NameNode memory bottleneck caused by excessive small files through a classification algorithm in the preprocessing module and the introduction of an efficient first-level index mechanism. Tests show that the system is safe, easy to maintain, and extensible, and that it achieves good results.
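The first-level index idea can be illustrated with a minimal in-memory sketch (hypothetical names; the paper's framework operates on HDFS): many small image files are merged into one container blob, and an index of (offset, length) pairs allows direct reads, so the NameNode need only track the single container file rather than millions of small ones:

```python
# Illustrative sketch of small-file merging with a first-level index.

def merge_small_files(files):
    """files: dict of name -> bytes. Returns (blob, index) where index
    maps each name to its (offset, length) within the container blob."""
    blob = bytearray()
    index = {}
    for name, data in files.items():
        index[name] = (len(blob), len(data))
        blob.extend(data)
    return bytes(blob), index

def read_file(blob, index, name):
    """Random access to one original file via the index."""
    offset, length = index[name]
    return blob[offset:offset + length]
```

This is the same principle behind Hadoop's own HAR and SequenceFile remedies: metadata pressure on the NameNode scales with the number of files, not the number of bytes.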


2012 ◽  
Vol 463-464 ◽  
pp. 1701-1705
Author(s):  
Yu Lan Wang ◽  
Jian Xiong Wang ◽  
Yao Hui Li

This paper designs a stable, efficient, and intelligent video surveillance system, based on a review and analysis of domestic and international research on intelligent security monitoring. The system uses a Web server and database server model and can detect moving objects in the monitored scene. First, the paper analyzes the structure of the server system, which is built with a B/S architecture, a database, and ActiveX technology. Second, it implements bulk storage, reading, updating, and maintenance of video image data using the database; BMP images are converted to JPG format to compress the image database, which is valuable for video image data storage and management. Finally, accuracy and response time are improved in moving-object detection, which is based on the background subtraction method.
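A minimal sketch of the background-subtraction step (illustrative only; the paper's implementation details are not given in the abstract): pixels whose absolute difference from a reference background exceeds a threshold are flagged as motion.

```python
def detect_motion(background, frame, threshold=25):
    """Minimal background subtraction: mark pixels whose absolute
    difference from the reference background exceeds `threshold`.
    background/frame: 2-D lists of grayscale intensities (0-255)."""
    return [[abs(f - b) > threshold for f, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]
```

Real systems additionally update the background model over time (e.g. running averages) to cope with lighting changes, which is where most of the accuracy and response-time tuning happens.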


2017 ◽  
Author(s):  
Susan D Shenkin ◽  
Cyril Pernet ◽  
Thomas E Nichols ◽  
Jean-Baptiste Poline ◽  
...  

Abstract: Brain imaging is now ubiquitous in clinical practice and research. The case for bringing together large amounts of image data from well-characterised healthy subjects and those with a range of common brain diseases across the life course is now compelling. This report follows a meeting of international experts from multiple disciplines, all interested in brain image biobanking. The meeting included neuroimaging experts (clinical and non-clinical), computer scientists, epidemiologists, clinicians, ethicists, and lawyers involved in creating brain image banks. The meeting followed a structured format to discuss current and emerging brain image banks; applications such as atlases; conceptual and statistical problems (e.g. defining ‘normality’); and legal, ethical, and technological issues (e.g. consent, potential for data linkage, data security, harmonisation, data storage, and enabling of research data sharing). We summarise the lessons learned from the experiences of a wide range of individual image banks and provide practical recommendations to enhance the creation, use, and reuse of neuroimaging data. Our aim is to maximise the benefit of the image data, provided voluntarily by research participants and funded by many organisations, for human health. Our ultimate vision is of a federated network of brain image biobanks accessible for large studies of brain structure and function.

