New developments in data storage

1982 ◽  
Vol 64 ◽  
pp. 79-83
Author(s):  
P.J. Grosbøl

Abstract It is estimated that up to 100 Gbytes of primary data from digital detectors have to be stored each year. The amount of reduced data is at least one order of magnitude smaller. Although the storage density for magnetic recording can be increased further, only optical techniques can provide a substantially denser medium than the photographic emulsion. It seems likely that optical read-only devices will be developed for archival storage of data in this decade. Magnetic recording of data will still be preferred whenever the ability to change the information is important.

MRS Bulletin ◽  
1990 ◽  
Vol 15 (3) ◽  
pp. 23-25 ◽  
Author(s):  
Ami Berkowitz

For more than 40 years, magnetic recording has been the dominant technology for electronic data storage. During this time, the areal storage density on disks has risen to >10⁸ bits/cm²; on tapes the corresponding figure is 0.2 × 10⁸ bits/cm². Thus each bit on a disk occupies an area of about 1.0 μm². These bits are written and read at data rates that require head-disk relative speeds of tens of meters per second and head-tape relative speeds of several meters per second. All this is accomplished at head-disk spacings of ≈0.2 μm and with contact recording for tapes. It is truly a wonder that the systems work as well as they do. In fact, for many features in magnetic recording systems it isn't certain why they work as well as they do. However, the demand for storage capacity is estimated to be increasing at about 40% per year, so it is natural to ask whether magnetic recording can maintain its present dominant position in the foreseeable future. The answer is "very likely, yes," but this prediction rests on the assumption that a number of formidable, fascinating problems will be solved in order to increase the areal bit density. The five articles in this special issue present the state of the art in those key areas of magnetic recording that involve materials science, and they define the problems involved in increasing storage density. James U. Lemke discusses the background and outlook for magnetic recording.
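As a quick sanity check on the quoted figures, the short sketch below (an illustrative calculation, not part of the article) converts the cited areal densities into the area occupied by a single bit.

```python
# Illustrative back-of-the-envelope check of the quoted areal densities
# (not from the article itself).

UM2_PER_CM2 = 1e8  # 1 cm^2 = 10^8 um^2


def bit_area_um2(bits_per_cm2: float) -> float:
    """Area occupied by one bit, in square micrometres."""
    return UM2_PER_CM2 / bits_per_cm2


disk_density = 1e8    # >10^8 bits/cm^2 on disks
tape_density = 0.2e8  # 0.2 x 10^8 bits/cm^2 on tapes

print(f"disk bit area ~ {bit_area_um2(disk_density):.1f} um^2")  # ~1.0 um^2
print(f"tape bit area ~ {bit_area_um2(tape_density):.1f} um^2")  # ~5.0 um^2
```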


GigaScience ◽  
2020 ◽  
Vol 9 (10) ◽  
Author(s):  
Daniel Arend ◽  
Patrick König ◽  
Astrid Junker ◽  
Uwe Scholz ◽  
Matthias Lange

Abstract Background The FAIR data principle as a commitment to support long-term research data management is widely accepted in the scientific community. Although the ELIXIR Core Data Resources and other established infrastructures provide comprehensive and long-term stable services and platforms for FAIR data management, a large quantity of research data is still hidden or at risk of getting lost. Currently, high-throughput plant genomics and phenomics technologies are producing research data in abundance, the storage of which is not covered by established core databases. This concerns the data volume, e.g., time series of images or high-resolution hyperspectral data; the quality of data formatting and annotation, e.g., with regard to structure and annotation specifications of core databases; uncovered data domains; or organizational constraints prohibiting primary data storage outside institutional boundaries. Results To share these potentially dark data in a FAIR way and to master these challenges, the ELIXIR Germany/de.NBI service Plant Genomic and Phenomics Research Data Repository (PGP) implements a "bring the infrastructure to the data" approach, which allows research data to be kept in place and wrapped in a FAIR-aware software infrastructure. This article presents new features of the e!DAL infrastructure software and the PGP repository as a best practice on how to easily set up FAIR-compliant and intuitive research data services. Furthermore, the integration of the ELIXIR Authentication and Authorization Infrastructure (AAI) and data discovery services is introduced as a means to lower technical barriers and to increase the visibility of research data. Conclusion The e!DAL software has matured into a powerful and FAIR-compliant infrastructure, while keeping the focus on flexible setup and integration into existing infrastructures and into the daily research process.
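The abstract describes the "bring the infrastructure to the data" approach at the infrastructure level without API detail. As a purely illustrative sketch (hypothetical field and file names, not the e!DAL/PGP data model or API), the snippet below shows the kind of minimal FAIR-oriented metadata and checksum bookkeeping such a repository could attach to a dataset that stays in place on institutional storage.

```python
# Purely illustrative: a minimal FAIR-style metadata record for a dataset
# kept in place on institutional storage. Field names are hypothetical and
# do not reflect the actual e!DAL/PGP implementation.
import hashlib
import json
from pathlib import Path


def describe_dataset(path: Path, title: str, creators: list[str], license_id: str) -> dict:
    """Build a findability/accessibility/interoperability/reusability-oriented record."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return {
        "title": title,                   # F: human-readable findability
        "creators": creators,             # F/R: attribution
        "license": license_id,            # R: explicit reuse terms
        "checksum_sha256": digest,        # A: integrity of the in-place file
        "location": str(path.resolve()),  # data stays where it was produced
        "identifier": None,               # a DOI/PID would be minted on publication
    }


# Hypothetical example dataset, created here only so the sketch runs end to end.
dataset = Path("phenotyping_run_042.zip")
dataset.write_bytes(b"example payload")

record = describe_dataset(dataset,
                          "Hyperspectral time series, barley trial 2020",
                          ["Doe, J."], "CC-BY-4.0")
print(json.dumps(record, indent=2))
```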


2019 ◽  
Vol 3 (25) ◽  
pp. 249-258 ◽  
Author(s):  
Dmitri Litvinov ◽  
Chunsheng E ◽  
Vishal Parekh ◽  
Darren Smith ◽  
James Rantschler ◽  
...  

2002 ◽  
Vol 721 ◽  
Author(s):  
Bo Cheng ◽  
Kun Yang ◽  
B. L. Justus ◽  
W. J. Yeh

Abstract In magnetic recording technology, the current longitudinal recording mode is approaching barriers based on fundamental physical limits on data density. However, demands for higher data storage density have escalated in recent years. Discrete perpendicular recording is a viable method to achieve 100 Gb per square inch and beyond. We report on the development of a novel technique to fabricate uniform arrays of nano-sized magnetic dots. Uniform arrays of nanometer-sized magnetic dots are obtained by magnetron sputtering deposition through a nanochannel glass replica mask. The platinum replica masks are fabricated by thin-film deposition on etched nanochannel glass and contain uniform hexagonally patterned voids with diameters as small as 50 nanometers. The magnetic dot density can be as high as 10¹¹ per square inch. Our method provides a simple yet effective way to create regularly arranged discrete magnetic media that can be used for perpendicular magnetic recording. The magnetic properties of the dots are studied with a vibrating sample magnetometer.
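To connect the quoted numbers, the short calculation below (illustrative only, not from the article) estimates the hexagonal-lattice pitch that corresponds to roughly 10¹¹ dots per square inch, which is compatible with 50 nm dot diameters.

```python
# Illustrative check (not from the article): hexagonal-lattice pitch needed
# for ~10^11 dots per square inch.
import math

NM_PER_INCH = 2.54e7
density_per_in2 = 1e11
density_per_nm2 = density_per_in2 / NM_PER_INCH**2       # dots per nm^2

# In a hexagonal lattice each dot occupies an area of (sqrt(3)/2) * pitch^2.
area_per_dot = 1.0 / density_per_nm2                     # ~6450 nm^2
pitch = math.sqrt(2.0 * area_per_dot / math.sqrt(3.0))   # ~86 nm

print(f"required pitch ~ {pitch:.0f} nm, leaving room for 50 nm dots")
```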


2005 ◽  
Vol 44 (02) ◽  
pp. 149-153 ◽  
Author(s):  
F. Estrella ◽  
C. del Frate ◽  
T. Hauer ◽  
M. Odeh ◽  
D. Rogulin ◽  
...  

Summary Objectives: The past decade has witnessed order-of-magnitude increases in computing power, data storage capacity, and network speed, giving birth to applications which may handle large data volumes of increased complexity, distributed over the internet. Methods: Medical image analysis is one of the areas for which this unique opportunity is likely to bring revolutionary advances both to the scientist’s research study and to the clinician’s everyday work. Grid computing [1] promises to resolve many of the difficulties in facilitating medical image analysis to allow radiologists to collaborate without having to co-locate. Results: The EU-funded MammoGrid project [2] aims to investigate the feasibility of developing a Grid-enabled European database of mammograms and to provide an information infrastructure which federates multiple mammogram databases. This will enable clinicians to develop new common, collaborative and co-operative approaches to the analysis of mammographic data. Conclusion: This paper focuses on one of the key requirements for large-scale distributed mammogram analysis: resolving queries across a grid-connected federation of images.
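The paper's stated focus is resolving queries across a grid-connected federation of image databases. The sketch below is an illustrative fan-out/merge pattern for such a federated query; the endpoints, query fields, and result handling are hypothetical and are not the MammoGrid implementation.

```python
# Illustrative only: a naive fan-out/merge over a federation of sites.
# Endpoints and result fields are hypothetical, not the MammoGrid API.
from concurrent.futures import ThreadPoolExecutor

SITES = ["https://site-a.example/query", "https://site-b.example/query"]


def query_site(url: str, criteria: dict) -> list[dict]:
    """Run one site-local query; a real grid deployment would route this
    through the grid middleware with proper authentication."""
    # Placeholder: a real implementation would POST `criteria` to `url`.
    return []


def federated_query(criteria: dict) -> list[dict]:
    """Run the same query on every site in parallel and merge the results."""
    with ThreadPoolExecutor() as pool:
        partials = pool.map(lambda url: query_site(url, criteria), SITES)
    return [row for part in partials for row in part]


results = federated_query({"view": "MLO", "age_min": 50, "age_max": 60})
print(f"{len(results)} matching mammograms across {len(SITES)} sites")
```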


Environments ◽  
2018 ◽  
Vol 5 (11) ◽  
pp. 113 ◽  
Author(s):  
Vasco Chiteculo ◽  
Bohdan Lojka ◽  
Peter Surový ◽  
Vladimir Verner ◽  
Dimitrios Panagiotidis ◽  
...  

Forest degradation and forest loss threaten the survival of many species and reduce the ability of forests to provide vital services. Clearing for agriculture in Angola is an important driver of forest degradation and deforestation. Charcoal production for urban consumption, as a driver of forest degradation, has had alarming impacts on natural forests as well as on the social and economic livelihood of the rural population. The charcoal impact on forest cover change is of the same order of magnitude as deforestation caused by agricultural expansion. However, there is a need to monitor the linkage between charcoal production and forest degradation. The aim of this paper is to investigate the sequence of the charcoal value chain as a systematic key to identify policies to reduce forest degradation in the province of Bié. It is a detailed study of the charcoal value chain that does not stop at the production and consumption sides. The primary data of this study came from 330 respondents and were obtained through different methods (a semi-structured questionnaire survey and market observation conducted from June to September in 2013 and 2014). A logistic regression (logit) model in IBM SPSS Statistics 24 (IBM Corp, Armonk, NY, USA) was used to analyze the factors influencing the decision of households to use charcoal for domestic purposes. The findings indicate that 21,000 to 27,000 hectares were degraded due to charcoal production. By describing the charcoal chain, it was possible to identify the driving factors for charcoal production and to obtain a first overview of the flow of charcoal from producers to consumers in Bié province. The demand for charcoal in this province is likely to remain strong unless government policies promote alternative sources of domestic energy.
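The abstract reports a logistic regression (fitted in SPSS) of households' decisions to use charcoal. As an illustrative sketch only, the snippet below shows an equivalent logit fit on synthetic data with hypothetical predictor names; it does not reproduce the study's variables or results.

```python
# Illustrative only: a logit model of the household charcoal-use decision,
# fitted on synthetic data with hypothetical predictors (the study itself
# used IBM SPSS Statistics 24 on its survey of 330 respondents).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 330
income = rng.normal(0, 1, n)          # standardized household income (hypothetical)
household_size = rng.poisson(5, n)    # persons per household (hypothetical)
urban = rng.integers(0, 2, n)         # 1 = urban, 0 = rural (hypothetical)

# Synthetic outcome: 1 = household uses charcoal for cooking.
logits = 0.8 * urban - 0.5 * income + 0.1 * household_size
uses_charcoal = rng.binomial(1, 1 / (1 + np.exp(-logits)))

X = sm.add_constant(np.column_stack([income, household_size, urban]))
model = sm.Logit(uses_charcoal, X).fit(disp=False)
print(model.summary(xname=["const", "income", "household_size", "urban"]))
```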

