distributed cloud
Recently Published Documents


TOTAL DOCUMENTS

530
(FIVE YEARS 175)

H-INDEX

24
(FIVE YEARS 7)

2021 ◽  
Vol 2021 ◽  
pp. 1-5
Author(s):  
K. Mahalakshmi ◽  
K. Kousalya ◽  
Himanshu Shekhar ◽  
Aby K. Thomas ◽  
L. Bhagyalakshmi ◽  
...  

Cloud storage provides a potential solution for replacing physical disk drives as a prominent outsourcing service. A threat from an untrusted server affects the security and integrity of the data. However, data integrity and the cost of communication and computation are directly proportional to each other: stronger integrity guarantees come at a higher cost. It is therefore necessary to develop a model that provides a trade-off between data integrity and cost metrics in the cloud environment. In this paper, we develop an integrity verification mechanism that combines a cryptographic solution with algebraic signatures. The model uses the elliptic curve digital signature algorithm (ECDSA) to verify the outsourced data. The scheme further resists malicious attacks, including forgery, replacement, and replay attacks. Symmetric encryption guarantees the privacy of the data. A simulation is conducted to test the efficacy of the algorithm in maintaining data integrity at reduced cost. The performance of the entire model is tested against existing methods in terms of communication cost, computation cost, and overhead cost. The simulation results show that the proposed method achieves a 0.25% reduction in computation cost and a 0.21% reduction in communication cost compared with other public auditing schemes.
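To make the ECDSA-based verification step concrete, here is a minimal sketch using the third-party `cryptography` package. The helper names (`sign_block`, `verify_block`) are illustrative assumptions and do not reproduce the authors' full scheme, which also combines algebraic signatures and symmetric encryption.

```python
# Minimal sketch: owner signs a data block before outsourcing; an auditor later
# verifies that the block returned by the cloud server is unmodified.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

def sign_block(private_key: ec.EllipticCurvePrivateKey, block: bytes) -> bytes:
    """Data owner signs a block before uploading it to cloud storage."""
    return private_key.sign(block, ec.ECDSA(hashes.SHA256()))

def verify_block(public_key: ec.EllipticCurvePublicKey, block: bytes, signature: bytes) -> bool:
    """Auditor checks the integrity of the block returned by the server."""
    try:
        public_key.verify(signature, block, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

if __name__ == "__main__":
    key = ec.generate_private_key(ec.SECP256R1())
    block = b"outsourced data block"
    tag = sign_block(key, block)
    print(verify_block(key.public_key(), block, tag))          # True
    print(verify_block(key.public_key(), block + b"x", tag))   # False: tampering detected
```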


Author(s):  
Leila Helali ◽  
Mohamed Nazih Omri

Since its emergence, cloud computing has continued to evolve thanks to its ability to present computing as consumable services paid per use, and to the possibilities of resource scaling it offers according to clients' needs. Models and appropriate schemes for resource scaling through the consolidation service have been considerably investigated, mainly at the infrastructure level, to optimize costs and energy consumption. Consolidation efforts at the SaaS level remain very limited, especially when proprietary software is involved. In order to fill this gap and provide software licenses elastically with respect to economic and energy-aware considerations in the context of distributed cloud computing systems, this work deals with dynamic software consolidation in commercial cloud data centers (DS3C). Our solution is based on heuristic algorithms and reallocates software licenses at runtime by determining the optimal amount of resources required for their execution and freeing unused machines. Simulation results showed the efficiency of our solution, with energy savings of 68.85% and cost savings of 80.01%. It freed up to 75% of physical machines and 76.5% of virtual machines, and proved its scalability in terms of average execution time while varying the number of software products and the number of licenses alternately.
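As a rough illustration of the consolidation idea (packing license workloads onto as few machines as possible and freeing the rest), the following toy first-fit-decreasing heuristic is a sketch under assumed inputs; it is not the DS3C algorithm itself, and the names and data are invented for the example.

```python
# Toy greedy consolidation sketch: pack software-license workloads onto as few
# machines as possible and report which machines can be freed.
from typing import Dict, List

def consolidate(licenses: Dict[str, float], machines: Dict[str, float]) -> Dict[str, List[str]]:
    """licenses: name -> resource demand; machines: name -> capacity."""
    placement = {m: [] for m in machines}
    free = dict(machines)  # remaining capacity per machine
    # First-fit decreasing: place the largest demands first, on the fullest machine that fits.
    for lic, demand in sorted(licenses.items(), key=lambda kv: -kv[1]):
        candidates = [m for m, cap in free.items() if cap >= demand]
        if not candidates:
            raise RuntimeError(f"no machine can host {lic}")
        target = min(candidates, key=lambda m: free[m])  # keep machines tightly packed
        placement[target].append(lic)
        free[target] -= demand
    freed = [m for m, placed in placement.items() if not placed]
    return {"freed_machines": freed, **{m: p for m, p in placement.items() if p}}

print(consolidate({"cad": 4, "erp": 2, "ide": 1}, {"vm1": 8, "vm2": 4, "vm3": 4}))
# e.g. {'freed_machines': ['vm1'], 'vm2': ['cad'], 'vm3': ['erp', 'ide']}
```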


2021 ◽  
Author(s):  
Chen Chuqiao ◽  
S.B. Goyal

Modern data is collected using IoT, stored in distributed cloud storage, and used for data mining or for training artificial intelligence. These new digital technologies, integrated into the data middle platform, have facilitated the progress of industry and promoted the fourth industrial revolution. They have also created challenges in security and privacy preservation. A privacy data breach can happen in any phase of the big data life cycle, and the data middle platform faces similar situations. Preventing privacy leakage is therefore an urgent problem. Traditional privacy-preserving models are not enough; the help of machine learning and blockchain is needed. In this research, the researcher first reviews security and privacy preservation in big data, machine learning, blockchain, and other related work, and then identifies gaps between theory and actual practice. Based on these gaps, the study attempts to create a suitable framework to guide industry in protecting privacy when organizations build and operate their data middle platforms. Not only academicians but also industry practitioners, especially SMEs, will benefit from this research.


Author(s):  
Ding Yu ◽  
Yuan Shixiong ◽  
Deng Rui ◽  
Luo Chenxiang

Based on big data mining of petrophysical data, this paper studies the method and application of a BP neural network for establishing a nonlinear interpretation model in a distributed cloud computing environment. The nonlinear mapping relationship between the objective logging response and the actual formation components is established by extracting the data mining result model, which overcomes the deficiencies of the conventional logging interpretation procedure based on homogeneity theory, linear hypotheses, and the use of statistical experience to simplify models and parameters. The results show that the network prediction model is improved and provides a valuable reference for solving practical interpretation problems under complex geological conditions.
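The sketch below shows the general shape of such a nonlinear mapping with a BP (backpropagation) network, using scikit-learn and synthetic data; the feature names and target are illustrative assumptions, not the authors' dataset or model.

```python
# Minimal sketch: a small BP neural network mapping logging responses to a
# formation component, trained on synthetic data for illustration only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Synthetic "logging responses": e.g. gamma ray, resistivity, density, neutron porosity.
X = rng.normal(size=(500, 4))
# Synthetic "formation component" (e.g. shale volume) with a nonlinear dependence on the logs.
y = np.tanh(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=500)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
)
model.fit(X[:400], y[:400])
print("held-out R^2:", round(model.score(X[400:], y[400:]), 3))
```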


Author(s):  
Pradeep Nayak ◽  
Poornachandra S ◽  
Pawan J Acharya ◽  
Shravya ◽  
Shravani

Deduplication methods were designed to eliminate duplicate data so that only single copies of data are stored. Data deduplication reduces the disk space required to store backups, and tracks and eliminates second copies of data inside the storage unit. It allows only one instance of the data to be stored initially; subsequent occurrences are given a reference pointer to the original stored data. In a big data storage environment, a huge amount of data needs to be secured, so proper management, fraud detection, and analysis of data privacy are important topics to be considered. This paper examines and evaluates the common deduplication techniques, which are presented in plain form. In this review, it was observed that the confidentiality and security of data are compromised at many levels in common deduplication strategies. Although much research is being carried out in various areas of cloud computing, work related to this topic is still insufficient.
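A minimal sketch of the single-copy-plus-reference-pointer idea follows; the class and method names are illustrative, and real deduplication systems add chunking, encryption, and access control on top of this.

```python
# Minimal sketch of content-hash deduplication: only the first copy of a block
# is stored; later occurrences get a reference pointer to the stored copy.
import hashlib
from typing import Dict

class DedupStore:
    def __init__(self) -> None:
        self.blocks: Dict[str, bytes] = {}   # fingerprint -> single stored copy
        self.refs: Dict[str, int] = {}       # fingerprint -> reference count

    def put(self, data: bytes) -> str:
        fp = hashlib.sha256(data).hexdigest()
        if fp not in self.blocks:
            self.blocks[fp] = data                    # first occurrence: store the block
        self.refs[fp] = self.refs.get(fp, 0) + 1      # later occurrences: pointer only
        return fp

    def get(self, fp: str) -> bytes:
        return self.blocks[fp]

store = DedupStore()
a = store.put(b"backup chunk")
b = store.put(b"backup chunk")                 # duplicate: not stored a second time
print(a == b, len(store.blocks), store.refs[a])  # True 1 2
```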


2021 ◽  
Vol 12 (4) ◽  
pp. 0-0

Representing an algorithmic workflow as a state machine is a frequently used technique in distributed systems. Replicating a state machine in a fault-tolerant way is one of the main application areas in this context. When implementing a replicated state machine, a crucial problem is maintaining consistency among replicas that may handle different requests arriving at each replica. This requires maintaining a single consistent ordering of the distributed requests handled separately by the replicas. Basic consensus protocols such as two-phase commit (2PC) can be used to maintain consistency between replicas whenever a request is to be processed. In this study we modify the 2PC protocol to take advantage of basic properties of a state machine and detect possible write conflicts earlier. Our experiments on distributed cloud environments show that our modified 2PC protocol increases throughput and decreases wasted write operations by a significant amount.
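The following toy sketch illustrates the idea of detecting write conflicts during the prepare phase of 2PC, before any replica applies the write; it is an assumption-level illustration of the general mechanism, not the authors' exact protocol.

```python
# Toy 2PC over state-machine replicas with an early write-conflict check:
# a replica votes "no" in the prepare phase if any key is already locked,
# so conflicting requests are rejected before wasting write work.
from typing import Dict, List

class Replica:
    def __init__(self) -> None:
        self.state: Dict[str, int] = {}
        self.locks: set = set()              # keys with an uncommitted prepare

    def prepare(self, writes: Dict[str, int]) -> bool:
        if any(k in self.locks for k in writes):
            return False                     # conflict detected early
        self.locks.update(writes)
        return True

    def commit(self, writes: Dict[str, int]) -> None:
        self.state.update(writes)
        self.locks.difference_update(writes)

    def abort(self, writes: Dict[str, int]) -> None:
        self.locks.difference_update(writes)

def two_phase_commit(replicas: List[Replica], writes: Dict[str, int]) -> bool:
    votes = [r.prepare(writes) for r in replicas]
    if all(votes):
        for r in replicas:
            r.commit(writes)
        return True
    for r, voted in zip(replicas, votes):
        if voted:                            # release locks on replicas that prepared
            r.abort(writes)
    return False

replicas = [Replica(), Replica(), Replica()]
print(two_phase_commit(replicas, {"x": 1}))  # True: all replicas apply the write
```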

