data integrity
Recently Published Documents


TOTAL DOCUMENTS: 1413 (FIVE YEARS: 540)

H-INDEX: 30 (FIVE YEARS: 12)

2022 ◽  
Vol 2022 ◽  
pp. 1-8
Author(s):  
You Wu Liu ◽  
Syazwina Binti Alias ◽  
Ming-yue Liu ◽  
Bian-bian Jiao

This paper decomposes the routing process of an industrial robot network by applying the analytic hierarchy process (AHP) to the routing decision. The influence of four factors, namely path length, data integrity, energy consumption, and receiving delay, on routing performance is analyzed, and routes are then selected by weighing these factors simultaneously. Simulation results show that this method accounts for the factors affecting routing more comprehensively and is superior to existing methods in terms of energy consumption, data integrity, and transmission delay.
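The abstract does not give the paper's implementation; as a conceptual sketch, AHP-style route scoring can be illustrated in Python. A pairwise comparison matrix over the four factors yields weights (here via the column-normalization approximation of the principal eigenvector), and each candidate route is scored by a weighted sum of its normalized factor values. The matrix entries and route values below are invented for illustration.

```python
# AHP weighting sketch over the four routing factors:
# path length, data integrity, energy consumption, receiving delay.
def ahp_weights(pairwise):
    """Approximate the principal eigenvector by column normalization."""
    n = len(pairwise)
    col_sums = [sum(row[j] for row in pairwise) for j in range(n)]
    normalized = [[pairwise[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(normalized[i]) / n for i in range(n)]

# pairwise[i][j] = how much more important factor i is than factor j
# (Saaty scale); these entries are illustrative, not from the paper.
pairwise = [
    [1,     1 / 2, 2,     1],
    [2,     1,     3,     2],
    [1 / 2, 1 / 3, 1,     1 / 2],
    [1,     1 / 2, 2,     1],
]
weights = ahp_weights(pairwise)

def route_score(factor_values, weights):
    # Factor values are normalized to [0, 1]; higher is better for each.
    return sum(w * v for w, v in zip(weights, factor_values))

route_a = [0.8, 0.9, 0.6, 0.7]  # hypothetical candidate routes
route_b = [0.9, 0.5, 0.8, 0.6]
best = max([("A", route_a), ("B", route_b)],
           key=lambda r: route_score(r[1], weights))[0]
```

A full AHP implementation would also check the consistency ratio of the pairwise matrix before trusting the weights.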


2022 ◽  
pp. 119-140
Author(s):  
Tiziano Volpentesta ◽  
Mario Miozza ◽  
Abhijeet Satwekar

Biopharmaceutical companies and health authorities continuously exchange information to provide safe and effective therapeutics. The interactions between the two require transparency and extensive documentation exchange concerning the processes that extend from development through the manufacturing phase. The current processes rely on paper documentation, notebooks, and point-to-point electronic data interchange (EDI) for the storage of data, generating challenges of data integrity within internal siloed structures and of traceability of the medicinal products in the pursuit of avoiding counterfeiting. With Industry 4.0 and blockchain, the authors envisioned a reinvented workflow that helps to 1) manage data integrity with decentralized trust and 2) improve track-and-trace capabilities. Hence, biopharmaceutical companies can manage data in a more trustworthy manner while maintaining security and privacy, further enabling the external ecosystem with track and trace to ensure complete transparency until the therapeutics reach patients.


Author(s):  
Pallapu Himavanth Reddy

Abstract: Cloud computing provides customers with storage as a service, allowing data to be stored, managed, and cached remotely, and accessed online. A major concern for users is the integrity of the data stored in the cloud, as it is possible for external intruders or criminals to attack, tamper with, or destroy it. Data auditing is a trending concept that involves hiring a third-party auditor (TPA) to perform a data integrity test. The main purpose of this project is to provide a safe and effective auditing system that combines features such as data integrity, confidentiality, and privacy protection. In the proposed system, the cloud server is used only to store encrypted data blocks and is not subject to any additional verification computation; the TPA and the data owner are in charge of all the functions of the scheme. A variety of metrics are used to evaluate the proposed auditing process. The proposed solution meets all the requirements while minimizing the load on cloud servers. Data dynamics operations such as data modification, deletion, and insertion will be supported in the future. Keywords: cloud storage; third-party auditor; public auditing; privacy preserving; integrity
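The abstract does not specify the scheme's internals; a minimal stand-in for third-party auditing can be sketched with per-block SHA-256 digests: the owner keeps the digests, the cloud stores the (already encrypted) blocks, and the TPA challenges random block indices. All names below are hypothetical, and real schemes use homomorphic authenticators rather than plain digests so the TPA never needs the full blocks.

```python
import hashlib
import secrets

def digest(block: bytes) -> bytes:
    return hashlib.sha256(block).digest()

# Data owner splits the encrypted data into blocks and keeps the digests.
blocks = [secrets.token_bytes(32) for _ in range(8)]  # stand-in ciphertext blocks
owner_digests = [digest(b) for b in blocks]

# The cloud server stores only the blocks themselves.
cloud_store = list(blocks)

def tpa_audit(indices):
    """TPA challenges the cloud on selected block indices and
    compares recomputed digests against the owner's records."""
    return all(digest(cloud_store[i]) == owner_digests[i] for i in indices)

ok = tpa_audit([0, 3, 7])          # untampered storage: audit passes
cloud_store[3] = b"\x00" * 32      # simulate corruption on the server
bad = tpa_audit([0, 3, 7])         # corrupted block: audit fails
```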


2021 ◽  
Author(s):  
A. K. JAITHUNBI ◽  
S. SABENA ◽  
L. SAIRAMESH

Abstract Today, the internet world is moving to cloud computing to keep public data private and secure. In the cloud scenario, many security principles are implemented to maintain the secure transmission of data over the internet. Still, the main concern is maintaining the integrity of one's own data in the public cloud. Most research works concentrate on cryptographic techniques for secure sharing of data, but comparatively few address data integrity. In this paper, a data masking technique called obfuscation is implemented to protect the data from unwanted modification by data-breaching attacks. In this work, an enhanced Vigenere encryption is used to perform the obfuscation, which maintains the privacy of the user's data. The enhanced Vigenere encryption algorithm is combined with intelligent rules so that different rule sets yield dissimilar maskings of the data. This work mainly concentrates on data privacy with reduced time complexity for encryption and decryption.
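The paper's enhanced, rule-based variant is not detailed in the abstract; for reference, the classic Vigenere cipher on which it builds can be sketched as:

```python
def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    """Classic Vigenere cipher over the Latin alphabet;
    non-letters pass through unchanged and do not advance the key."""
    out, k = [], 0
    for ch in text:
        if ch.isalpha():
            shift = ord(key[k % len(key)].upper()) - ord("A")
            if decrypt:
                shift = -shift
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr((ord(ch) - base + shift) % 26 + base))
            k += 1
        else:
            out.append(ch)
    return "".join(out)

masked = vigenere("Sensitive cloud data", "LEMON")
restored = vigenere(masked, "LEMON", decrypt=True)
```

The classic cipher is trivially breakable by frequency analysis, which is presumably what the paper's added rule layer is meant to mitigate; here it serves only as masking, not as strong encryption.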


2021 ◽  
Vol 2021 ◽  
pp. 1-5
Author(s):  
K. Mahalakshmi ◽  
K. Kousalya ◽  
Himanshu Shekhar ◽  
Aby K. Thomas ◽  
L. Bhagyalakshmi ◽  
...  

Cloud storage provides a potential solution for replacing physical disk drives through prominent outsourcing services. A threat from an untrusted server affects the security and integrity of the data. However, data integrity guarantees and the cost of communication and computation are directly proportional to each other. It is hence necessary to develop a model that provides a trade-off between data integrity and cost metrics in the cloud environment. In this paper, we develop an integrity verification mechanism that combines a cryptographic solution with an algebraic signature. The model utilises the elliptic curve digital signature algorithm (ECDSA) to verify the outsourced data. The scheme further resists malicious attacks, including forgery attacks, replacing attacks, and replay attacks. Symmetric encryption guarantees the privacy of the data. A simulation is conducted to test the efficacy of the algorithm in maintaining data integrity at reduced cost. The performance of the entire model is tested against existing methods in terms of communication cost, computation cost, and overhead cost. The simulation results show that the proposed method reduces computation cost by 0.25% and communication cost by 0.21% compared with other public auditing schemes.
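The paper's combination of ECDSA with algebraic signatures is not spelled out in the abstract; the ECDSA sign/verify flow itself can be sketched in pure Python over secp256k1 (an educational, stdlib-only implementation: not constant-time, not the paper's scheme, and not for production use):

```python
# Minimal educational ECDSA over secp256k1 (stdlib only; NOT production crypto).
import hashlib
import secrets

# secp256k1 domain parameters
P = 2**256 - 2**32 - 977
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def inv(x, m):
    return pow(x, -1, m)  # modular inverse (Python 3.8+)

def add(p1, p2):
    """Elliptic-curve point addition; None is the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        lam = (3 * x1 * x1) * inv(2 * y1, P) % P
    else:
        lam = (y2 - y1) * inv(x2 - x1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def mul(k, point):
    """Double-and-add scalar multiplication."""
    result = None
    while k:
        if k & 1:
            result = add(result, point)
        point = add(point, point)
        k >>= 1
    return result

def hash_int(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % N

def sign(msg: bytes, d: int):
    z = hash_int(msg)
    while True:
        k = secrets.randbelow(N - 1) + 1   # fresh nonce per signature
        r = mul(k, G)[0] % N
        if r == 0:
            continue
        s = inv(k, N) * (z + r * d) % N
        if s:
            return (r, s)

def verify(msg: bytes, sig, Q) -> bool:
    r, s = sig
    if not (0 < r < N and 0 < s < N):
        return False
    z, w = hash_int(msg), inv(s, N)
    point = add(mul(z * w % N, G), mul(r * w % N, Q))
    return point is not None and point[0] % N == r

d = secrets.randbelow(N - 1) + 1   # data owner's private key
Q = mul(d, G)                      # corresponding public key
sig = sign(b"outsourced data block", d)
ok = verify(b"outsourced data block", sig, Q)
forged = verify(b"modified block", sig, Q)  # detects replacement/forgery
```

The failing verification on the modified block is exactly the property the scheme relies on to resist forgery and replacing attacks; replay resistance additionally needs freshness (e.g. nonces or timestamps) in the challenged message.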


PLoS ONE ◽  
2021 ◽  
Vol 16 (12) ◽  
pp. e0260697
Author(s):  
Bin Huang ◽  
You Tang

Background Along with the vigorous development of internet technology, the functions of various types of equipment keep increasing and network communication is easy and diverse; at the same time, the amount of data is huge, and under network bandwidth limitations, a piece of data often has to be cut into many fragments and transferred one by one, leading to information delay. Results Aiming at the problems of poor data integrity, low efficiency, and poor serialization performance of traditional data-storage information, this article introduces Protobuf to study the serialization of data storage information. The serpentine gap method is used to allocate intervals for the sequence nodes, so that the working state and the resting state always maintain a dynamic balance. According to the first-level rules, the stored data of the completed target node is obtained, the grammatical structure and semantics of the target data are analyzed, corresponding mappings are established, and the data storage information is serialized. To verify the effectiveness of Protobuf's serialization method, a comparative experiment is designed in which JSON data is serialized with three methods: HDVM, Redis, and Protobuf. The comparative analysis shows that HDVM has the longest processing time and Protobuf the shortest, with data integrity unaffected. The simulation data show that the Protobuf serialization method has short conversion time and high space utilization, and offers obvious advantages in correctness and integrity. It is very suitable for serializing JSON data when bandwidth is limited.
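Protobuf itself requires a .proto schema and generated classes, so it cannot be shown self-contained here; the size advantage of schema-driven binary encoding over JSON can, however, be illustrated with the standard library's struct module (the field names and values below are invented):

```python
import json
import struct

# A hypothetical sensor record.
record = {"device_id": 12345, "temperature": 21.5, "online": True}

# Self-describing text encoding: field names travel with every message.
json_bytes = json.dumps(record).encode("utf-8")

# Schema-driven binary encoding: with an agreed layout (as a .proto file
# provides), only the values travel: little-endian int32, float32, bool.
packed = struct.pack("<if?", record["device_id"],
                     record["temperature"], record["online"])

# The receiver decodes using the same shared schema.
unpacked_id, unpacked_temp, unpacked_online = struct.unpack("<if?", packed)
```

Here the binary form is 9 bytes versus several dozen for the JSON text, which is the effect the article exploits under limited bandwidth; Protobuf additionally uses varints and field tags, so its wire format differs from this fixed-width sketch.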


2021 ◽  
Author(s):  
Chung Yup Kim

Decentralised technology backed by blockchain has gained popularity in recent years, as it secures autonomous ecosystems without the need for a central authority. The blockchain concept originated in the financial domain using cryptocurrency but has been applied to a variety of industries over the last few years. In the era of Industry 4.0, most enterprises leverage automation by using Internet of Things (IoT) technology. Despite the numerous applications of blockchain across industries, significant latency in the consensus algorithm in blockchain hinders its adoption among businesses using IoT technology. A number of studies have addressed the obstacles of transaction processing performance and system scalability, mostly based on a public blockchain. However, the approaches still involve centralised components and thus fail to fully utilise decentralisation. Here, a private blockchain-based IoT data integration platform is proposed to achieve data integrity and system scalability. Along with a lightweight IoT gateway, instead of any other additional middleware, the process and the system configuration are streamlined. By using Hyperledger Fabric, the design is validated, and the proposed architecture outperforms other conventional models in IoT data processing. Thus, decentralisation in IoT environments is achieved.
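The abstract does not expose the platform's internals; the tamper-evidence property that a blockchain ledger adds to IoT data can be illustrated with a minimal hash chain (a deliberate simplification: single node, no consensus, no Fabric APIs, invented field names):

```python
import hashlib
import json

def block_hash(payload: dict) -> str:
    """Deterministic SHA-256 over a canonical JSON encoding."""
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, reading: dict) -> None:
    """Link each new IoT reading to the hash of the previous block."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev": prev, "data": reading}
    block["hash"] = block_hash({"prev": prev, "data": reading})
    chain.append(block)

def chain_valid(chain: list) -> bool:
    """Recompute every hash and link; any tampering breaks the chain."""
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        if block["prev"] != prev:
            return False
        if block["hash"] != block_hash({"prev": block["prev"], "data": block["data"]}):
            return False
    return True

chain = []
for reading in ({"sensor": "t1", "value": 20.1}, {"sensor": "t1", "value": 20.4}):
    append_block(chain, reading)

valid_before = chain_valid(chain)   # intact ledger
chain[0]["data"]["value"] = 99.9    # tamper with a stored reading
valid_after = chain_valid(chain)    # tampering is detected
```

In a permissioned network such as Hyperledger Fabric, this linkage is replicated across peers and agreed by an ordering service, so no single party can rewrite history undetected.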

