Remote audit for large data-based algebraic signature DCT cloud storage

Measurement ◽  
2019 ◽  
Vol 143 ◽  
pp. 22-26
Author(s):  
Zheng Rujia ◽  
Yu Ziya ◽  
Wang Zhenkai


2013 ◽
Vol 422 ◽  
pp. 260-265 ◽  
Author(s):  
Shu Li Chen ◽  
Lin Hua Zhou ◽  
Zhong Xiu Yang

Cloud storage represents a disruptive shift in technology: it can back up and store large volumes of data in the cloud and automatically synchronize that data across multiple devices, yet it also brings drawbacks in data security. This paper mainly presents an overview of cloud storage, highlights many of the security issues and vulnerabilities in cloud storage, lists various solutions, and then gives a summary.


Author(s):  
Hariyo Sasongko ◽  
T Yudi Hadiwandra

Computer systems and networks are now an essential part of human life, as can be seen from the number of computer users in offices, on campuses, and in schools, which has grown considerably. As the number of users accessing data over the network continues to increase, the choice of servers and the availability of large data storage media become critical. Currently, data is still often stored on physical media such as hard disks and flash drives, which are prone to viruses and easy to lose. Another problem is that physical storage is relatively expensive compared with cloud storage, which can be accessed anywhere when needed.


2015 ◽  
Vol 2015 ◽  
pp. 1-14 ◽  
Author(s):  
Klaithem Al Nuaimi ◽  
Nader Mohamed ◽  
Mariam Al Nuaimi ◽  
Jameela Al-Jaroodi

We present a novel approach to addressing cloud storage issues together with a fast load-balancing algorithm. Our approach is based on partitioning files and downloading them concurrently, in dual directions, from multiple cloud nodes. Partitions of the files, rather than the full files, are saved on the cloud, which substantially optimizes cloud storage usage. Only partial replication is used in this algorithm to ensure the reliability and availability of the data. Our focus is to improve performance and optimize storage usage while providing Data as a Service (DaaS) on the cloud. The algorithm avoids fully replicating large data sets, which consumes a great deal of precious space on the cloud nodes; reducing the space needed also reduces the cost of providing it. Moreover, performance increases because multiple cloud servers collaborate to deliver the data to cloud clients more quickly.
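
To make the dual-direction idea concrete, here is a minimal Python sketch, not the authors' implementation: the node names, chunk size, and in-memory replicas are assumptions. Two replicas hold the same partition; one worker fills the output buffer from the front while the other fills it from the back, and both stop where they meet.

```python
# Minimal sketch of concurrent dual-direction retrieval from two replicas.
# Hypothetical setup: node names, chunk size, and in-memory "nodes" are assumptions.
import threading

CHUNK = 4                                      # bytes fetched per request (tiny, for illustration)
data = b"ABCDEFGHIJKLMNOPQRSTUVWXYZ"           # the stored object
nodes = {"node-a": data, "node-b": data}       # two replicas holding the same partition

def fetch(node, start, end):
    """Simulate a ranged read [start, end) from a cloud node."""
    return nodes[node][start:end]

def dual_direction_download(size):
    buf = bytearray(size)
    lock = threading.Lock()
    state = {"front": 0, "back": size}         # next byte each direction will claim

    def forward(node):
        while True:
            with lock:
                if state["front"] >= state["back"]:
                    return
                start = state["front"]
                end = min(start + CHUNK, state["back"])
                state["front"] = end
            buf[start:end] = fetch(node, start, end)

    def backward(node):
        while True:
            with lock:
                if state["front"] >= state["back"]:
                    return
                end = state["back"]
                start = max(end - CHUNK, state["front"])
                state["back"] = start
            buf[start:end] = fetch(node, start, end)

    t1 = threading.Thread(target=forward, args=("node-a",))
    t2 = threading.Thread(target=backward, args=("node-b",))
    t1.start(); t2.start(); t1.join(); t2.join()
    return bytes(buf)

assert dual_direction_download(len(data)) == data
```

In the paper's setting the same pattern would run per partition against real storage nodes rather than in-memory dictionaries, with partial replication deciding which nodes hold which partitions.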


2018 ◽  
Vol 7 (2.8) ◽  
pp. 379
Author(s):  
D Sowmia ◽  
B Muruganantham

Distributed storage systems provide dependable access to data through redundancy spread over individually unreliable nodes. Application scenarios include data centers, distributed storage frameworks, and storage in wireless networks. This paper gives a study of the cloud storage model of networked online storage, where data is stored in virtualized pools of storage that are generally hosted by third parties. Hosting companies operate large data centers, and people who need their data hosted buy or lease storage capacity from them. The data center operators, behind the scenes, virtualize the resources according to the requirements of the customer and expose them as storage pools, which the customers can themselves use to store files or data objects. The data is stored across various locations, and when the user wants to retrieve it, this can be done using any of the encryption methods. Finally, in view of existing techniques, promising future research directions are suggested.
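
On the encryption point, the following is a hedged sketch rather than any scheme from the surveyed work: it assumes the third-party cryptography package and uses in-memory dictionaries as stand-ins for virtualized storage pools, encrypting objects on the client before they reach any host.

```python
# Minimal sketch (assumption, not the paper's scheme): client-side encryption
# before data is handed to third-party storage pools, and decryption on retrieval.
# Requires the third-party "cryptography" package.
from cryptography.fernet import Fernet

# In-memory stand-ins for virtualized storage pools at different locations.
pools = {"pool-eu": {}, "pool-us": {}}

key = Fernet.generate_key()      # kept by the data owner, never by the host
cipher = Fernet(key)

def put(object_id: str, payload: bytes) -> None:
    token = cipher.encrypt(payload)          # encrypt before it leaves the client
    for pool in pools.values():              # replicate across locations
        pool[object_id] = token

def get(object_id: str) -> bytes:
    token = next(iter(pools.values()))[object_id]   # fetch from any location
    return cipher.decrypt(token)

put("report.txt", b"quarterly figures")
assert get("report.txt") == b"quarterly figures"
```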


Author(s):  
John A. Hunt

Spectrum-imaging is a useful technique for comparing different processing methods on very large data sets which are identical for each method. This paper is concerned with comparing methods of electron energy-loss spectroscopy (EELS) quantitative analysis on the Al-Li system. The spectrum-image analyzed here was obtained from an Al-10at%Li foil aged to produce δ' precipitates that can span the foil thickness. Two 1024-channel EELS spectra offset in energy by 1 eV were recorded and stored at each pixel in the 80x80 spectrum-image (25 Mbytes). An energy range of 39-89 eV (20 channels/eV) is represented. During processing, the spectra are either subtracted to create an artifact-corrected difference spectrum, or the energy offset is numerically removed and the spectra are added to create a normal spectrum. The spectrum-images are processed into 2D floating-point images using methods and software described in [1].
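
The two processing routes described above can be sketched with NumPy as follows; the array shapes and synthetic counts are assumptions, and this is not the software described in [1].

```python
# Illustrative sketch (hypothetical data, not the software from [1]):
# combine the two energy-offset EELS spectra recorded at each pixel either as an
# artifact-corrected difference spectrum or, after removing the offset, as a normal spectrum.
import numpy as np

NX, NY, NCH = 80, 80, 1024        # spectrum-image size and channels per spectrum
OFFSET = 20                        # 1 eV offset at 20 channels/eV

rng = np.random.default_rng(0)
spec_a = rng.poisson(100.0, size=(NX, NY, NCH)).astype(float)   # stand-in acquisitions
spec_b = rng.poisson(100.0, size=(NX, NY, NCH)).astype(float)

# Artifact-corrected difference spectrum: subtract the offset acquisitions directly.
difference = spec_a - spec_b

# Normal spectrum: numerically remove the energy offset, then add the acquisitions.
normal = spec_a[..., :-OFFSET] + spec_b[..., OFFSET:]
print(difference.shape, normal.shape)   # (80, 80, 1024) (80, 80, 1004)
```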


Author(s):  
Thomas W. Shattuck ◽  
James R. Anderson ◽  
Neil W. Tindale ◽  
Peter R. Buseck

Individual particle analysis involves the study of tens of thousands of particles using automated scanning electron microscopy and elemental analysis by energy-dispersive x-ray emission spectroscopy (EDS). EDS produces large data sets that must be analyzed using multi-variate statistical techniques. A complete study uses cluster analysis, discriminant analysis, and factor or principal components analysis (PCA). The three techniques are used in the study of particles sampled during the FeLine cruise to the mid-Pacific ocean in the summer of 1990. The mid-Pacific aerosol provides information on long range particle transport, iron deposition, sea salt ageing, and halogen chemistry.

Aerosol particle data sets suffer from a number of difficulties for pattern recognition using cluster analysis. There is a great disparity in the number of observations per cluster and the range of the variables in each cluster. The variables are not normally distributed, they are subject to considerable experimental error, and many values are zero, because of finite detection limits. Many of the clusters show considerable overlap, because of natural variability, agglomeration, and chemical reactivity.
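
A minimal sketch of this kind of multivariate workflow, using synthetic data rather than the cruise data set and scikit-learn rather than the authors' software: standardize the element variables, compute principal-component scores, and cluster them.

```python
# Illustrative sketch (synthetic data; not the study's actual pipeline):
# standardize per-particle element signals, reduce with PCA, then cluster the scores.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Stand-in particle-by-element matrix (rows: particles, columns: element signals).
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(500, 6))
               for m in (0.0, 2.0, 4.0)])

Z = StandardScaler().fit_transform(X)          # put variables on comparable scales
scores = PCA(n_components=3).fit_transform(Z)  # principal-component scores
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(labels))                     # cluster sizes
```

Real aerosol data would need the extra care noted above (skewed variables, zeros at detection limits, very unequal cluster sizes) before a plain PCA/k-means pass is meaningful.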


Author(s):  
Hakan Ancin

This paper presents methods for performing detailed quantitative automated three dimensional (3-D) analysis of cell populations in thick tissue sections while preserving the relative 3-D locations of cells. Specifically, the method disambiguates overlapping clusters of cells, and accurately measures the volume, 3-D location, and shape parameters for each cell. Finally, the entire population of cells is analyzed to detect patterns and groupings with respect to various combinations of cell properties. All of the above is accomplished with zero subjective bias.

In this method, a laser-scanning confocal light microscope (LSCM) is used to collect optical sections through the entire thickness (100-500 μm) of fluorescently-labelled tissue slices. The acquired stack of optical slices is first subjected to axial deblurring using the expectation maximization (EM) algorithm. The resulting isotropic 3-D image is segmented using a spatially-adaptive Poisson-based image segmentation algorithm with region-dependent smoothing parameters. Extracting the voxels that were labelled as "foreground" into an active voxel data structure results in a large data reduction.
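
The final data-reduction step can be illustrated with a short NumPy sketch; the volume, threshold, and record layout are assumptions, not the authors' active-voxel implementation.

```python
# Illustrative sketch (synthetic volume; threshold and shapes are assumptions):
# keep only voxels labelled "foreground" as a sparse table of (z, y, x, value)
# records, the kind of reduction an active-voxel data structure provides.
import numpy as np

rng = np.random.default_rng(2)
volume = rng.random((64, 256, 256)).astype(np.float32)   # stand-in deblurred stack
foreground = volume > 0.99                                # stand-in segmentation mask

coords = np.argwhere(foreground)                          # (N, 3) voxel coordinates
values = volume[foreground]                               # (N,) intensities
active_voxels = np.column_stack([coords, values[:, None]])

print(f"kept {len(values)} of {volume.size} voxels "
      f"({active_voxels.nbytes / volume.nbytes:.1%} of the dense size)")
```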


1980 ◽  
Vol 19 (04) ◽  
pp. 187-194
Author(s):  
J.-Ph. Berney ◽  
R. Baud ◽  
J.-R. Scherrer

It is well known that Frame Selection Systems (FSS) have proved both popular and effective in physician-machine and patient-machine dialogue. A formal algorithm for the definition of a Frame Selection System for handling man-machine dialogue is presented here. It is also shown how natural medical language can be handled using the approach of a tree-branching logic. This logic appears to be based upon ordered series of selections which enclose a syntactic structure. The external specifications are discussed with regard to convenience and efficiency. Since all communication between the user and the application programmes is handled only by the FSS software, the FSS contributes to achieving modularity and, therefore, also maintainability in a transaction-oriented system with a large data base and concurrent accesses.
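
The tree-branching selection logic can be sketched as follows; the frames, prompts, and answers are hypothetical and this is not the 1980 FSS software.

```python
# Illustrative sketch only (hypothetical frames, not the original FSS implementation):
# a dialogue is driven by ordered selections that branch through a tree of frames.
frames = {
    "root":    {"prompt": "Main complaint?",
                "choices": {"chest pain": "pain", "cough": "cough"}},
    "pain":    {"prompt": "Pain radiates to the arm?",
                "choices": {"yes": "urgent", "no": "routine"}},
    "cough":   {"prompt": "Cough duration over 3 weeks?",
                "choices": {"yes": "routine", "no": "done"}},
    "urgent":  {"prompt": "Refer immediately.", "choices": {}},
    "routine": {"prompt": "Schedule follow-up.", "choices": {}},
    "done":    {"prompt": "No further frames.", "choices": {}},
}

def run_dialogue(answers):
    """Walk the frame tree, consuming one pre-recorded answer per selection."""
    frame_id, transcript = "root", []
    for answer in answers:
        frame = frames[frame_id]
        transcript.append((frame["prompt"], answer))
        if answer not in frame["choices"]:
            break
        frame_id = frame["choices"][answer]
    transcript.append((frames[frame_id]["prompt"], None))   # terminal frame
    return transcript

for prompt, answer in run_dialogue(["chest pain", "yes"]):
    print(prompt, "->", answer)
```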

