encrypt data
Recently Published Documents


TOTAL DOCUMENTS: 66 (FIVE YEARS: 42)
H-INDEX: 2 (FIVE YEARS: 1)

2022 ◽  
Vol 2022 ◽  
pp. 1-11
Author(s):  
Guofeng Zhang ◽  
Xiao Chen ◽  
Bin Feng ◽  
Xuchao Guo ◽  
Xia Hao ◽  
...  

Blockchain provides new technologies and ideas for the construction of agricultural product traceability systems (APTS). However, if data is stored, supervised, and distributed on a blockchain in which multiple parties participate as equals, it faces major security risks such as data privacy leakage, unauthorized access, and trust issues. How to protect the privacy of shared data has become a key factor restricting the adoption of this technology. We propose a secure and trusted agricultural product traceability system (BCST-APTS), supported by blockchain and ciphertext-policy attribute-based encryption (CP-ABE). It can set access control policies through data attributes and encrypt data on the blockchain. This not only ensures the confidentiality of the data stored in the blockchain, but also allows flexible access control policies to be set for the data. In addition, a whole-chain attribute management infrastructure has been constructed, which can provide personalized attribute encryption services. Furthermore, a re-encryption scheme based on ciphertext-policy attribute-based encryption (RE-CP-ABE) is proposed, which meets the needs of efficient supervision and sharing of ciphertext data. Finally, the system architecture of BCST-APTS is designed to solve the problems of mutual trust, privacy protection, and fine-grained, personalized access control between all parties.
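To make the access-control idea concrete, here is a minimal, hedged sketch (not the paper's scheme): a per-record symmetric key is released only when a requester's attribute set satisfies the record's policy. Real CP-ABE binds the policy into the ciphertext itself via pairing-based cryptography; the `policy_satisfied` helper, the attribute names, and the policy format below are illustrative assumptions.

```python
# Illustrative sketch only: gates a per-record symmetric key behind an attribute
# policy, mimicking the access-control intent of CP-ABE. Real CP-ABE enforces the
# policy cryptographically inside the ciphertext rather than with a gatekeeper.
from cryptography.fernet import Fernet  # pip install cryptography

def policy_satisfied(user_attrs: set, policy: dict) -> bool:
    """Policy = {"all_of": [...], "any_of": [...]} over attribute strings (assumed format)."""
    return (set(policy.get("all_of", [])) <= user_attrs
            and (not policy.get("any_of")
                 or bool(set(policy["any_of"]) & user_attrs)))

def encrypt_record(plaintext: bytes, policy: dict):
    key = Fernet.generate_key()                 # per-record data key
    token = Fernet(key).encrypt(plaintext)      # ciphertext kept on/alongside the chain
    return {"policy": policy, "ciphertext": token}, key  # key held by an attribute authority

def decrypt_record(record, key, user_attrs: set) -> bytes:
    if not policy_satisfied(user_attrs, record["policy"]):
        raise PermissionError("attributes do not satisfy access policy")
    return Fernet(key).decrypt(record["ciphertext"])

# Hypothetical usage: a regulator with matching attributes can read the trace record.
record, key = encrypt_record(b"batch#42 origin=farm-A harvest=2021-10-03",
                             {"all_of": ["supply-chain"], "any_of": ["regulator", "producer"]})
print(decrypt_record(record, key, {"supply-chain", "regulator"}))
```

A production system would replace this gate with a genuine CP-ABE construction so that no single trusted party has to hold the data key.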


Electronics ◽  
2021 ◽  
Vol 10 (23) ◽  
pp. 3036
Author(s):  
German Cano-Quiveu ◽  
Paulino Ruiz-de-clavijo-Vazquez ◽  
Manuel J. Bellido ◽  
Jorge Juan-Chico ◽  
Julian Viejo-Cortes ◽  
...  

The Internet of Things (IoT) security is one of the most important issues developers have to face. Data tampering must be prevented in IoT devices, and some or all of the confidentiality, integrity, and authenticity of sensitive data files must be assured in most practical IoT applications, especially when data are stored in removable devices such as microSD cards, which is very common. Software solutions are usually applied, but their effectiveness is limited by the reduced resources available in IoT systems. This paper introduces a hardware-based security framework for IoT devices (Embedded LUKS), similar to the Linux Unified Key Setup (LUKS) solution used in Linux systems to encrypt data partitions. Embedded LUKS (E-LUKS) extends the LUKS capabilities by adding integrity and authentication methods, in addition to the confidentiality already provided by LUKS. E-LUKS uses state-of-the-art encryption and hash algorithms such as PRESENT and SPONGENT. Both are recognized as adequate solutions for IoT devices, with PRESENT incorporated into ISO/IEC 29192-2:2019 for lightweight block ciphers. E-LUKS has been implemented on modern XC7Z020 FPGA chips, resulting in a hardware footprint of about 10% of previous LUKS hardware implementations, making E-LUKS a great alternative for providing Full Disk Encryption (FDE) alongside authentication to a wide range of IoT devices.
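As a rough software analogue of what E-LUKS adds on top of LUKS (confidentiality plus per-block integrity and authentication), the hedged sketch below seals a "sector" with encrypt-then-MAC. AES-CTR and HMAC-SHA256 stand in for the lightweight PRESENT and SPONGENT primitives used in the paper, and key handling is deliberately simplified.

```python
# Sketch of per-sector encrypt-then-MAC, a stand-in for E-LUKS's
# PRESENT/SPONGENT pipeline; AES-CTR + HMAC-SHA256 are placeholders.
import hmac, hashlib, os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def seal_sector(enc_key: bytes, mac_key: bytes, sector_no: int, data: bytes) -> bytes:
    nonce = sector_no.to_bytes(16, "big")          # sector number as CTR nonce (simplified)
    enc = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).encryptor()
    ct = enc.update(data) + enc.finalize()
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return ct + tag                                 # ciphertext || 32-byte authentication tag

def open_sector(enc_key: bytes, mac_key: bytes, sector_no: int, blob: bytes) -> bytes:
    ct, tag = blob[:-32], blob[-32:]
    nonce = sector_no.to_bytes(16, "big")
    if not hmac.compare_digest(tag, hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("sector failed authentication")  # tamper detected
    dec = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).decryptor()
    return dec.update(ct) + dec.finalize()

ek, mk = os.urandom(32), os.urandom(32)
blob = seal_sector(ek, mk, 7, b"512-byte sector payload...")
assert open_sector(ek, mk, 7, blob) == b"512-byte sector payload..."
```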


2021 ◽  
Author(s):  
Vinodhini Mani ◽  
Prakash M

Cloud computing poses a challenge to healthcare infrastructure, as it affects privacy, confidentiality, and security rules concerning large binary objects such as X-rays and CT scan reports. Health records are stored and accessed using an encryption hash kept in the InterPlanetary File System (IPFS), a peer-to-peer system. However, patients' data may be sold or shared for research purposes by their healthcare providers without their knowledge, which compromises their privacy and security. In the healthcare industry today, customers also face health records that lack interoperability, making it difficult to aggregate and examine patient data. The objective of this research is to develop cybersecurity measurement approaches that ensure patient information security by protecting against cyber threats using blockchain technology in healthcare IT. Consequently, this paper proposes an innovative solution to the problem, namely patient-centric healthcare data management (PCHDM). It is built using IPFS together with a permissioned distributed ledger based on Hyperledger Fabric, which stores health records only with the owner's permission. A cryptographic public-key encryption algorithm is used to encrypt IPFS data to build an electronic health record blockchain system. Our platform offers two types of solutions: (i) an on-chain solution that utilizes the Hyperledger Fabric database, and (ii) an off-chain solution that encrypts data and stores it securely off-chain in IPFS. A robust blockchain solution for PCHDM is created by encrypting the data stored in IPFS using appropriate public-key cryptographic algorithms. To determine which blocks should be incorporated into the blockchain, Byzantine Fault Tolerance is applied in the health chain architectural model. The system hosts application logic and smart contracts, known as "chaincode," via container technology. As part of this research, health record hashes were stored on the blockchain while the actual health data was stored off-chain in IPFS, a decentralized storage system that achieves scalability. Because healthcare records are encrypted and referenced by hash, the model prevents unauthorized access while keeping records scalable, interoperable, and reliable. Stakeholders are more confident in collaborating and sharing their medical records with this model.
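A hedged, minimal sketch of the off-chain/on-chain split described above: the health record is encrypted with the patient's public key before being placed in IPFS-like storage, and only the content hash is recorded on the ledger. The in-memory `ipfs_store` and `ledger` dictionaries are hypothetical stand-ins for a real IPFS node and Hyperledger Fabric world state.

```python
# Sketch: encrypt a small health record with the patient's public key, store the
# ciphertext "off-chain", and record only its hash "on-chain".
import hashlib
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

ipfs_store, ledger = {}, []          # stand-ins for IPFS and the Fabric ledger

patient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def store_record(record: bytes) -> str:
    ct = patient_key.public_key().encrypt(record, oaep)
    cid = hashlib.sha256(ct).hexdigest()        # content address, as IPFS would derive
    ipfs_store[cid] = ct                        # off-chain: ciphertext only
    ledger.append({"cid": cid})                 # on-chain: hash only, no health data
    return cid

def read_record(cid: str) -> bytes:            # only the key owner can decrypt
    return patient_key.decrypt(ipfs_store[cid], oaep)

cid = store_record(b"CT-scan report: no abnormality detected")
print(read_record(cid))
```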


Electronics ◽  
2021 ◽  
Vol 10 (19) ◽  
pp. 2359
Author(s):  
Yingwen Chen ◽  
Bowen Hu ◽  
Hujie Yu ◽  
Zhimin Duan ◽  
Junxin Huang

The IoT devices deployed in various application scenarios generate massive data of immeasurable value every day. These data often contain users' personal privacy information, so there is an imperative need to guarantee the reliability and security of IoT data sharing. We propose a new encrypted data storage and sharing architecture that combines proxy re-encryption with blockchain technology. The consensus mechanism based on threshold proxy re-encryption eliminates dependence on third-party central service providers. Multiple consensus nodes in the blockchain network act as proxy service nodes to re-encrypt data and combine the converted ciphertexts, and personal information is not disclosed at any point in the procedure. This removes the barriers to using a decentralized network to store and distribute private encrypted data safely. We carried out extensive simulation experiments to evaluate the performance of the proposed framework. The results show that the proposed architecture can meet large-scale data access demands with a tolerable increase in latency. Our scheme is one of the attempts to utilize threshold proxy re-encryption and a blockchain consensus algorithm to support IoT data sharing.
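A toy, hedged sketch of the proxy re-encryption idea (single proxy, no threshold): an ElGamal-style ciphertext encrypted under Alice's key is transformed by a proxy holding a re-encryption key so that Bob can decrypt it, without the proxy ever seeing the plaintext. The group parameters are tiny for readability, and the construction only follows the classic BBS-style scheme in spirit, not the paper's threshold variant run across consensus nodes.

```python
# Toy BBS-style proxy re-encryption over a tiny prime-order group (p = 2q + 1).
# Illustration only; real deployments use large groups/curves and a threshold
# of proxies, as in the paper.
import secrets

p, q, g = 467, 233, 4            # g generates the subgroup of prime order q

def keygen():
    sk = secrets.randbelow(q - 1) + 1
    return sk, pow(g, sk, p)     # (secret, public = g^sk)

def encrypt(pub, m):             # m is an integer in [1, p-1]
    k = secrets.randbelow(q - 1) + 1
    return (m * pow(g, k, p) % p, pow(pub, k, p))    # (m * g^k, pub^k)

def decrypt(sk, ct):
    c1, c2 = ct
    gk = pow(c2, pow(sk, -1, q), p)                  # recover g^k
    return c1 * pow(gk, -1, p) % p

def rekey(sk_from, sk_to):       # re-encryption key b/a mod q (toy: needs both secrets;
    return sk_to * pow(sk_from, -1, q) % q           # real PRE protocols avoid exposing them)

def reencrypt(rk, ct):
    c1, c2 = ct
    return (c1, pow(c2, rk, p))  # proxy transforms the ciphertext without learning m

a_sk, a_pk = keygen()            # Alice (data owner)
b_sk, b_pk = keygen()            # Bob (data consumer)
ct_a = encrypt(a_pk, 42)
ct_b = reencrypt(rekey(a_sk, b_sk), ct_a)
assert decrypt(b_sk, ct_b) == 42
```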


Author(s):  
Jutta Buschbom ◽  
Breda Zimkus ◽  
Andrew Bentley ◽  
Mariko Kageyama ◽  
Christopher Lyal ◽  
...  

Transdisciplinary and cross-cultural cooperation and collaboration are needed to build extended, densely interconnected information resources. These are the prerequisites for the successful implementation and execution of, for example, an ambitious monitoring framework accompanying the post-2020 Global Biodiversity Framework (GBF) of the Convention on Biological Diversity (CBD; SCBD 2021). Data infrastructures that meet the requirements and preferences of concerned communities can focus and attract community involvement, thereby promoting participatory decision making and the sharing of benefits. Community acceptance, in turn, drives the development of the data resources and data use. Earlier this year, the alliance for biodiversity knowledge (2021a) conducted forum-based consultations seeking community input on designing the next generation of digital specimen representations and consequently enhanced infrastructures. The multitude of connections that arise from extending the digital specimen representations through linkages in all "directions" will form a powerful network of information for research and application. Yet, with the power of an extended, accessible data network comes the responsibility to protect sensitive information (e.g., the locations of threatened populations, culturally context-sensitive traditional knowledge, or businesses' fundamental data and infrastructure assets). In addition, existing legislation regulates access and the fair and equitable sharing of benefits. Current negotiations on 'Digital Sequence Information' under the CBD suggest such obligations might increase and become more complex in the context of extensible information networks. For example, in the case of data and resources funded by taxpayers in the EU, access should follow the general principle of being "as open as possible; as closed as is legally necessary" (cp. EC 2016). At the same time, the international regulations of the CBD Nagoya Protocol (SCBD 2011) need to be taken into account. Summarizing the main outcomes from the consultation discussions in the forum thread "Meeting legal/regulatory, ethical and sensitive data obligations" (alliance for biodiversity knowledge 2021b), we propose a framework of ten guidelines and functionalities to achieve community building and drive application:
1. Substantially contribute to the conservation and protection of biodiversity (cp. EC 2020).
2. Use language that is CBD conformant.
3. Show the importance of the digital and extensible specimen infrastructure for the continuing design and implementation of the post-2020 GBF, as well as the mobilisation and aggregation of data for its monitoring elements and indicators.
4. Strive to openly publish as much data and metadata as possible online.
5. Establish a powerful and well-thought-out layer of user and data access management, ensuring security of 'sensitive data'.
6. Encrypt data and metadata where necessary at the level of an individual specimen or digital object; provide access via digital cryptographic keys (see the sketch following this abstract).
7. Link obligations, rights and cultural information regarding use to the digital key (e.g. CARE principles (Carroll et al. 2020), Local Context-labels (Local Contexts 2021), licenses, permits, use and loan agreements, etc.).
8. Implement a transactional system that records every transaction.
9. Amplify workforce capacity across the digital realm, its work areas and workflows.
10. Do no harm (EC 2020): Reduce the social and ecological footprint of the implementation, aiming for a long-term sustainable infrastructure across its life-cycle, including development, implementation and management stages.
Balancing the needs for open access, as well as protection, accountability and sustainability, the framework is designed to function as a robust interface between the (research) infrastructure implementing the extensible network of digital specimen representations and the myriad of applications and operations in the real world. With the legal, ethical and data protection layers of the framework in place, the infrastructure will provide legal clarity and security for data providers and users, specifically in the context of access and benefit sharing under the CBD and its Nagoya Protocol. Forming layers of protection, the characteristics and functionalities of the framework are envisioned to be flexible and finely grained, adjustable to fulfill the needs and preferences of a wide range of stakeholders and communities, while remaining focused on the protection and rights of the natural world. Respecting different value systems and national policies, the framework is expected to allow a divergence of views to coexist and balance differing interests. Thus, the infrastructure of the digital extensible specimen network is fair and equitable to many providers and users. This foundation has the capacity and potential to bring together the diverse global communities using, managing and protecting biodiversity.
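As a hypothetical illustration of guidelines 6 and 7, the sketch below encrypts each specimen record under its own key and issues access as a "digital key" that bundles the cryptographic key with the obligations attached to its use. All names and the grant structure are assumptions, not part of the proposed infrastructure.

```python
# Hypothetical sketch of guidelines 6-7: per-object encryption plus a "digital
# key" grant that couples the key with usage obligations (labels, permits).
from cryptography.fernet import Fernet

def publish_specimen(record: bytes):
    key = Fernet.generate_key()                       # one key per specimen/digital object
    return {"ciphertext": Fernet(key).encrypt(record)}, key

def grant_digital_key(key: bytes, obligations: dict, grantee: str):
    # A real system would also log this grant in a transactional record (guideline 8).
    return {"grantee": grantee, "key": key, "obligations": obligations}

stored, key = publish_specimen(b"locality: <restricted coordinates>")
grant = grant_digital_key(key,
                          {"labels": ["TK Attribution"], "permit": "ABS-2021-007"},  # assumed values
                          "partner-institution")
print(Fernet(grant["key"]).decrypt(stored["ciphertext"]))
```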


2021 ◽  
Vol 23 (09) ◽  
pp. 1-13
Author(s):  
Jibin Joy ◽  
◽  
Dr. S. Devaraju ◽  

Data deduplication is a crucial technique for compacting data and reducing duplication when transferring data. It is widely used in the cloud to limit storage usage and to save transmission capacity. Before outsourcing data, an encryption mechanism is used to ensure the integrity of sensitive data during the deduplication process. The SHA algorithm is used to hash data stored in text format, with padding appended to the text to generate the security bits. During deduplication, the hash (a hexadecimal value) is calculated over string and integer data. Hash-based deduplication applies whole-file hashing to the entire file, and the hash values of text data are treated as feature properties. In contrast to traditional deduplication solutions, clients that transfer data to the cloud verify duplication against the data already in the cloud. In virtualization, limited primary memory size and memory congestion are considered important bottlenecks. Memory deduplication identifies pages with the same content and merges them into a single copy, reducing memory usage and fragmentation and improving performance. In cloud storage, the MPT is used for deduplication so that a single copy of the same data serves different data owners. If any data user tries to upload the same data again, it is mapped to the archived data, so the duplicate is not stored. To ensure cloud data security, encryption techniques are used to encrypt data throughout the deduplication procedure and prior to outsourcing cloud data.
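One common way to reconcile whole-file, hash-based deduplication with encryption before outsourcing is convergent encryption, sketched below in a hedged form; the paper's exact mechanism may differ. The key and the deduplication fingerprint are both derived from the file content, so clients holding the same file map to the same stored ciphertext and can decrypt it, while the cloud keeps only one copy.

```python
# Sketch of whole-file dedup with a content-derived (convergent) key:
# identical plaintexts map to the same fingerprint, so only one copy is stored.
import base64, hashlib
from cryptography.fernet import Fernet

cloud_store = {}                                   # fingerprint -> ciphertext

def convergent_key(data: bytes) -> bytes:
    digest = hashlib.sha256(data).digest()         # key derived from the content itself
    return base64.urlsafe_b64encode(digest)        # 32-byte key in Fernet's expected format

def upload(data: bytes) -> str:
    key = convergent_key(data)
    fingerprint = hashlib.sha256(key).hexdigest()  # dedup tag; does not reveal the key
    if fingerprint not in cloud_store:             # duplicate? keep a single stored copy
        cloud_store[fingerprint] = Fernet(key).encrypt(data)
    return fingerprint

def download(fingerprint: str, data_key: bytes) -> bytes:
    return Fernet(data_key).decrypt(cloud_store[fingerprint])

tag_a = upload(b"quarterly report v1")
tag_b = upload(b"quarterly report v1")             # same content from another owner
assert tag_a == tag_b and len(cloud_store) == 1    # stored exactly once
assert download(tag_a, convergent_key(b"quarterly report v1")) == b"quarterly report v1"
```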


Author(s):  
B. Murali Krishna ◽  
Chella Santhosh ◽  
Shruti Suman ◽  
SK. Sadhiya Shireen

A highly secure communication method is essential for end users to exchange information that cannot be interpreted by an intruder. Cryptography plays a crucial role in the current and upcoming digital world for secure data transmission in wired and wireless networks. Asymmetric and symmetric cryptographic algorithms encrypt data against vulnerable attacks and transfer it to authenticated users. Steganography is a method for conveying secret information with the help of a carrier file (text, video, audio, image, etc.). This paper proposes a Deoxyribonucleic Acid (DNA)-based asymmetric algorithm used to encrypt the patient's secret information, and its performance is compared with the ElGamal, RSA and Diffie–Hellman (DH) cryptographic algorithms. The proposed asymmetric algorithm is applied to image steganography, encrypting and concealing the patient's secret information in a cover image. The proposed method consumes fewer hardware resources with improved latency. Dynamic Partial Reconfiguration (DPR) allows a selected area to be reconfigured rather than requiring a complete shutdown of the entire system during bitstream configuration. The cryptosystem with DPR is designed and synthesized in Xilinx Vivado and simulated in the Vivado simulator. The design targets Basys3, Nexys4 DDR and Zynq-7000 all-programmable SoC (AP SoC) architectures and is programmed with secure partial bit files to avoid vulnerable attacks in the channel.
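A hedged software sketch of the two stages mentioned above: a simple DNA-alphabet encoding of the secret (the paper's actual DNA-based asymmetric cipher is more involved) followed by least-significant-bit embedding of the resulting bases into cover-image bytes. The pixel buffer here is a plain byte array standing in for real image data.

```python
# Sketch: map data to a DNA alphabet (2 bits per base) and hide the base stream
# in the low bits of cover-image bytes. Illustrative only; the paper's cipher
# and FPGA pipeline are more involved.
BASES = "ACGT"                                       # 00, 01, 10, 11

def to_dna(data: bytes) -> str:
    return "".join(BASES[(byte >> shift) & 0b11]
                   for byte in data for shift in (6, 4, 2, 0))

def from_dna(dna: str) -> bytes:
    bits = [BASES.index(b) for b in dna]
    return bytes((bits[i] << 6) | (bits[i+1] << 4) | (bits[i+2] << 2) | bits[i+3]
                 for i in range(0, len(bits), 4))

def lsb_embed(cover: bytearray, dna: str) -> bytearray:
    stego = bytearray(cover)
    for i, base in enumerate(dna):                   # 2 bits hidden per cover byte
        stego[i] = (stego[i] & 0b11111100) | BASES.index(base)
    return stego

def lsb_extract(stego: bytes, n_bases: int) -> str:
    return "".join(BASES[stego[i] & 0b11] for i in range(n_bases))

secret = b"patient-id:1043"                          # would be ciphertext in practice
cover = bytearray(range(256)) * 4                    # stand-in for image pixel bytes
stego = lsb_embed(cover, to_dna(secret))
assert from_dna(lsb_extract(stego, len(secret) * 4)) == secret
```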


2021 ◽  
Vol 10 (1) ◽  
pp. 53-61
Author(s):  
Rusydi Umar ◽  
Imam Riadi ◽  
Ridho Surya Kusuma

Ransomware has become a dangerous threat that has grown rapidly in recent years. One of its variants is Conti ransomware, which can spread its infection and encrypt data simultaneously. Attacks become a severe threat and damage the system by encrypting data on the victim's computer, spreading to other computers on the same network, and demanding a ransom. This ransomware operates by utilizing Registry Queries, covering all forms of behavior in accessing, deleting, creating, and manipulating data, and communicating with C2 (Command and Control) servers. This study analyzes the Conti virus attack through a network forensic process based on network behavior logs. The research process consists of three stages: the first stage simulates attacks on the host computer, the second stage carries out network forensics using live forensics methods, and the third stage analyzes the malware using statistical and dynamic analysis. The results of this study provide forensic data and virus behavior when running in RAM and on computer networks, so the data obtained make it possible to identify ransomware traffic on the network and deal with zero-day threats, especially ransomware. This is possible because the analysis is an initial step in generating virus signatures based on network indicators.
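In a very simplified form, the network-indicator extraction described in the last stage could resemble the hedged sketch below: scanning a connection log for hosts that repeatedly contact the same external endpoint, a basic beaconing heuristic. The log format, addresses, and threshold are hypothetical and stand in for the live-forensics tooling actually used in the study.

```python
# Hypothetical sketch: flag hosts that repeatedly contact the same external
# endpoint, a simple beaconing heuristic usable as a network indicator.
from collections import Counter

def beaconing_candidates(conn_log, min_hits=20):
    """conn_log: iterable of (src_ip, dst_ip, dst_port) tuples from a capture."""
    counts = Counter((src, dst, port) for src, dst, port in conn_log)
    return [flow for flow, hits in counts.items() if hits >= min_hits]

# Assumed toy log: one workstation polling a single endpoint far more than usual.
log = [("10.0.0.5", "203.0.113.9", 443)] * 35 + [("10.0.0.7", "198.51.100.2", 80)] * 3
print(beaconing_candidates(log))   # [('10.0.0.5', '203.0.113.9', 443)]
```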


2021 ◽  
Vol 11 (3) ◽  
pp. 239-246
Author(s):  
Rusydi Umar ◽  
Imam Riadi ◽  
Ridho Surya Kusuma

The Sodinokibi ransomware virus has become a severe threat by targeting data encryption on a server, and the infection continues to spread to encrypt data on other computers. This study aims to mitigate such attacks by building an experimental prevention system based on computer network management. The mitigation process is carried out through static, dynamic, and Software-Defined Networking (SDN) analysis to prevent the impact of attacks through programmatic network management. The SDN implementation consists of two main components: the Ryu controller and Open vSwitch (OVS). Testing the mitigation system on infected networks shows that disabling TCP internet protocol access can reduce virus spread by 17.13% and suppress Sodinokibi traffic logs by up to 73.97%. Based on these percentages, the SDN-based mitigation in this study meets its objective of making it possible to mitigate ransomware attacks on computer network traffic.
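A hedged sketch of the mitigation idea in Ryu (the SDN controller named above): when a switch connects, install a high-priority flow rule that drops TCP traffic toward a blocklisted address. The blocklisted IP is a placeholder, and the rule layout is a minimal OpenFlow 1.3 example rather than the study's full implementation.

```python
# Minimal Ryu app sketch: on switch connection, install a drop rule for TCP
# traffic toward a (placeholder) blocklisted C2 address. OpenFlow 1.3.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3

BLOCKED_DST = "203.0.113.50"        # placeholder for an identified malicious host

class RansomwareBlocker(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def switch_features_handler(self, ev):
        dp = ev.msg.datapath
        parser = dp.ofproto_parser
        # Match IPv4/TCP traffic destined for the blocklisted address.
        match = parser.OFPMatch(eth_type=0x0800, ip_proto=6, ipv4_dst=BLOCKED_DST)
        # No instructions => packets matching this rule are dropped.
        dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=100,
                                      match=match, instructions=[]))
```

Run with `ryu-manager` against an Open vSwitch instance; a fuller app would also forward normal traffic and update the blocklist from detection results.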

