The Research about No-Vibration Data Storage of Micro-Scale and Long-Term Ocean Turbulence Measurement

2013 ◽  
Vol 347-350 ◽  
pp. 803-807
Author(s):  
Xin Luan ◽  
Bing Xue ◽  
Feng Mei Sun ◽  
Qi Zhi Yan ◽  
Da Lei Song

Turbulence plays an important role in the evolution and exchange of energy in seawater. Multi-scale, long-term, fixed-point, continuous sampling is a new research direction in turbulence observation. This paper presents a high-capacity, no-vibration data storage solution for long-term, continuous turbulence observation. First, a multi-scale submerged-buoy observing platform is designed. Based on this platform, a multi-parameter data acquisition and no-vibration storage system is developed. The paper describes the hardware and software implementation of the large-capacity data storage array in detail, as well as the port of the FatFS file system, chosen for its readability and ease of use. Bench tests and a sea trial demonstrate that the design achieves large-capacity data storage for long-term observation of ocean turbulence from the submerged buoy.
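
The abstract credits FatFS for readability and easy porting; FatFS is ChaN's generic FAT file system module for small embedded systems, driven through a compact C API (f_mount, f_open, f_write, f_close). The sketch below is illustrative only, not the authors' firmware: the file name, buffer handling, and use of FA_OPEN_APPEND are assumptions, and the low-level disk I/O functions FatFS expects (disk_initialize, disk_read, disk_write) would have to be ported to the buoy's storage array.

    #include "ff.h"   /* FatFS: generic FAT filesystem module (ChaN) */

    /* Append one block of sampled turbulence data to a log file on the
       storage array.  Returns 0 on success, -1 on any FatFS error.   */
    int log_turbulence_block(const void *buf, UINT len)
    {
        static FATFS fs;          /* filesystem object, mounted once  */
        static int   mounted = 0;
        FIL  fil;
        UINT written;

        if (!mounted) {
            if (f_mount(&fs, "", 1) != FR_OK)   /* mount default drive */
                return -1;
            mounted = 1;
        }
        /* Open the log file, creating it if needed, positioned at EOF. */
        if (f_open(&fil, "TURB0001.BIN", FA_WRITE | FA_OPEN_APPEND) != FR_OK)
            return -1;
        if (f_write(&fil, buf, len, &written) != FR_OK || written != len) {
            f_close(&fil);
            return -1;
        }
        return (f_close(&fil) == FR_OK) ? 0 : -1;   /* flush to media */
    }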

2021 ◽  
Vol 1 ◽  
pp. 80
Author(s):  
Thijs Devriendt ◽  
Clemens Ammann ◽  
Folkert W. Asselbergs ◽  
Alexander Bernier ◽  
Rodrigo Costas ◽  
...  

Various data sharing platforms are being developed to enhance the sharing of cohort data by addressing the fragmented state of data storage and access systems. However, policy challenges in several domains remain unresolved. The euCanSHare workshop was organized to identify and discuss these challenges and to set the future research agenda. Concerns are outlined over the multiplicity and long-term sustainability of platforms, the lack of resources, access of commercial parties to medical data, credit and recognition mechanisms in academia, and the organization of data access committees. Within these areas, solutions need to be devised to ensure that platforms function optimally.


2017 ◽  
Vol 51 (4) ◽  
pp. 12-22 ◽  
Author(s):  
Xiuyan Liu ◽  
Xin Luan ◽  
Z. Daniel Deng ◽  
Dalei Song ◽  
Shengbo Zang ◽  
...  

An autonomous Moored Reciprocating Vertical Profiler (MRVP) has been developed and tested for measuring ocean turbulence. The MRVP is designed to combine the advantages of long-term moored measurements at specified depths with those of short-term ship-supported continuous profiling performed at high vertical resolution. The profiler is programmed to repeat vertical motions autonomously along the mooring cable using a buoyancy-driven mechanism. A sea trial was conducted in the South China Sea to evaluate the performance of the profiler. Shear probe data are unreliable when the flow past the sensors is not sufficiently greater than an estimate of the turbulent velocity. For 65% of the dataset, the turbulence measurements are of high quality, with dissipation rates down to O(10⁻¹⁰) W kg⁻¹. To minimize contamination induced by instrument vibration and to improve the estimation of turbulent kinetic energy terms, an advanced cross-spectrum algorithm is applied to the measured shear data. The corrected spectra agree well with the empirical Nasmyth spectrum, and the corrected dissipation rates decreased on average by a factor of 2, and by up to a factor of 8, relative to those from the raw spectra. The autonomous MRVP proves to be a stable platform, and the novel upward measurement provides a new perspective for obtaining long-term time series of turbulent mixing.
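
The abstract does not spell out the cross-spectrum algorithm. A common coherence-based formulation (in the spirit of Goodman-style corrections that use collocated accelerometer channels) removes the vibration-coherent fraction of the shear spectrum; the sketch below uses assumed notation (\Phi for spectra, \gamma^2 for squared coherence) and is not necessarily the authors' exact algorithm:

    \Phi_{\mathrm{clean}}(f) = \Phi_{ss}(f)\left(1 - \gamma_{sa}^{2}(f)\right),
    \qquad
    \gamma_{sa}^{2}(f) = \frac{\left|\Phi_{sa}(f)\right|^{2}}{\Phi_{ss}(f)\,\Phi_{aa}(f)}

Here \Phi_{ss} and \Phi_{aa} are the auto-spectra of the shear-probe and accelerometer signals, and \Phi_{sa} is their cross-spectrum. The dissipation rate then follows from the cleaned spectrum through the standard isotropic relation \epsilon = 7.5\,\nu \int \Phi_{\mathrm{clean}}(k)\,dk, with \nu the kinematic viscosity of seawater.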


Author(s):  
Richard S. Chemock

One of the most common tasks in a typical analysis lab is the recording of images. Many analytical techniques (TEM, SEM, and metallography, for example) produce images as their primary output. Until recently, the most common method of recording images was film. Current PS/2® systems offer very large-capacity data storage devices and high-resolution displays, making it practical to work with analytical images on PS/2s, thereby sidestepping the traditional film and darkroom steps. This change in operational mode offers many benefits: cost savings, throughput, archiving and searching capabilities, as well as direct incorporation of the image data into reports.

The conventional way to record images involves film, either sheet film (with its associated wet chemistry) for TEM or Polaroid® film for SEM and light microscopy. Although film is inconvenient, it has the highest quality of all available image recording techniques. The fine-grained film used for TEM has a resolution that exceeds that of a 4096×4096×16-bit digital image.
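
To put the article's 4096×4096×16-bit yardstick in storage terms, a back-of-the-envelope calculation (a sketch, not from the article; the 1 GiB drive size is an assumed round number) gives the uncompressed footprint of one such image:

    #include <stdio.h>

    int main(void)
    {
        /* One 4096 x 4096 image at 16 bits (2 bytes) per pixel. */
        const unsigned long long bytes = 4096ULL * 4096ULL * 2ULL;

        printf("One image: %llu bytes (%.0f MiB)\n",
               bytes, bytes / (1024.0 * 1024.0));    /* 32 MiB */
        printf("Images per 1 GiB drive: %llu\n",
               (1ULL << 30) / bytes);                /* 32     */
        return 0;
    }

At roughly 32 MiB per uncompressed image, even a modest imaging session quickly justifies the "very large capacity" storage devices the article mentions.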


Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 3281
Author(s):  
Xu He ◽  
Yong Yin

Recently, deep learning-based techniques have shown great power in image inpainting, especially for square holes. However, they fail to generate plausible results inside missing regions that are irregular and large, as there is a lack of understanding between the missing regions and their existing counterparts. To overcome this limitation, we combine two non-local mechanisms, a contextual attention module (CAM) and an implicit diversified Markov random fields (ID-MRF) loss, with a multi-scale architecture that uses several dense fusion blocks (DFBs) built on dense combinations of dilated convolutions, to guide the generative network in restoring discontinuous and continuous large masked areas. To prevent color discrepancies and grid-like artifacts, we apply the ID-MRF loss to improve visual appearance by comparing the similarities of long-distance feature patches. To further capture the long-term relationships among different regions within large missing areas, we introduce the CAM. Although the CAM can create plausible results by reconstructing refined features, it depends on the initially predicted results. Hence, we employ the DFB to obtain larger and more effective receptive fields, which helps predict more precise and fine-grained information for the CAM. Extensive experiments on two widely used datasets demonstrate that our proposed framework significantly outperforms state-of-the-art approaches both quantitatively and qualitatively.
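
A short worked equation shows why densely stacked dilated convolutions enlarge the receptive field so quickly, which is the property the DFB exploits; the kernel size and dilation rates below are illustrative assumptions, not the paper's reported configuration. For L stacked dilated convolutions with kernel size k and dilation rates d_1, ..., d_L (stride 1), the receptive field is

    R = 1 + (k - 1)\sum_{i=1}^{L} d_i

With k = 3 and rates (1, 2, 4, 8), R = 1 + 2(1 + 2 + 4 + 8) = 31 pixels after only four layers, versus R = 9 for four ordinary 3×3 convolutions; this rapid growth is what lets the DFB gather context from far outside a large hole before the CAM refines the prediction.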


GigaScience ◽  
2020 ◽  
Vol 9 (10) ◽  
Author(s):  
Daniel Arend ◽  
Patrick König ◽  
Astrid Junker ◽  
Uwe Scholz ◽  
Matthias Lange

Background: The FAIR data principle, as a commitment to support long-term research data management, is widely accepted in the scientific community. Although the ELIXIR Core Data Resources and other established infrastructures provide comprehensive and long-term stable services and platforms for FAIR data management, a large quantity of research data is still hidden or at risk of getting lost. Currently, high-throughput plant genomics and phenomics technologies are producing research data in abundance, the storage of which is not covered by established core databases. This concerns the data volume, e.g., time series of images or high-resolution hyperspectral data; the quality of data formatting and annotation, e.g., with regard to the structure and annotation specifications of core databases; uncovered data domains; or organizational constraints prohibiting primary data storage outside institutional boundaries.

Results: To share these potentially dark data in a FAIR way and to master these challenges, the ELIXIR Germany/de.NBI service Plant Genomics and Phenomics Research Data Repository (PGP) implements a "bring the infrastructure to the data" approach, which allows research data to be kept in place and wrapped in a FAIR-aware software infrastructure. This article presents new features of the e!DAL infrastructure software and the PGP repository as a best practice for easily setting up FAIR-compliant and intuitive research data services. Furthermore, the integration of the ELIXIR Authentication and Authorization Infrastructure (AAI) and data discovery services is introduced as a means to lower technical barriers and to increase the visibility of research data.

Conclusion: The e!DAL software has matured into a powerful and FAIR-compliant infrastructure, while keeping the focus on flexible setup and integration into existing infrastructures and into the daily research process.

