Access Management of User and Cyber-Physical Device in DBaaS according to Indian IT Laws using Blockchain

2020 ◽  
Vol 21 (3) ◽  
pp. 407-424
Author(s):  
Gaurav Deep ◽  
Jagpreet Sidhu ◽  
Rajni Mohana

Computing on the cloud has changed the way mankind works in every manner, from storing to fetching every piece of information on the cloud. To protect data on the cloud, various access procedures and policies are used, such as authentication and authorization. Authentication means that the intended user is the one accessing data on the cloud, and authorization means that the user is accessing only the data he is allowed to access. The intended user now also includes Cyber-Physical Devices, which share data among themselves and fetch data from the cloud. Cloud data is managed by the employees of cloud companies. The persons managing company data on the cloud have always been a point of doubt, as many insider attacks have happened in the past, affecting the company's image in the market. Data related to the cyber-physical space may also come under insider attack. Companies managing user data are liable to protect it from any type of attack under various sections of the Indian IT Act. This paper proposes blockchain as a possible solution to track the activities of the employees managing the cloud. Employee authentication and authorization are managed through the blockchain server, and user authentication data is stored in the blockchain. Authorization rules are written in any role/attribute-based access language; these rules store in the blockchain the data about which user requests were allowed access. The proposed work will help cloud companies gain better control over their employees' activities and thus help prevent insider attacks on users and Cyber-Physical Devices.
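
To make the idea concrete, the following is a minimal sketch, not the authors' implementation: an append-only, hash-chained ledger that records which employee accessed which user's data, together with a toy role-based authorization check. All identifiers, roles, and permissions are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): a hash-chained access ledger plus
# a toy role-based rule check. All names are illustrative.
import hashlib
import json
import time


class AccessLedger:
    def __init__(self):
        self.chain = [{"index": 0, "prev_hash": "0" * 64,
                       "event": "genesis", "timestamp": time.time()}]

    def _hash(self, block):
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def record(self, employee_id, action, resource):
        prev = self.chain[-1]
        block = {"index": prev["index"] + 1,
                 "prev_hash": self._hash(prev),
                 "event": {"employee": employee_id, "action": action,
                           "resource": resource},
                 "timestamp": time.time()}
        self.chain.append(block)
        return block

    def verify(self):
        # Any tampering with an earlier block breaks the hash links.
        return all(self.chain[i]["prev_hash"] == self._hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))


# Toy role-based authorization rule: these roles may only read, never export.
ROLE_PERMISSIONS = {"db_admin": {"read"}, "auditor": {"read"}}

def authorize(role, action):
    return action in ROLE_PERMISSIONS.get(role, set())


ledger = AccessLedger()
if authorize("db_admin", "read"):
    ledger.record("emp-42", "read", "user-123/profile")
print(ledger.verify())  # True while the chain is untampered
```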

Author(s):  
Narander Kumar ◽  
Jitendra Kumar Samriya

Background: Cloud computing is a service whose growth in the field of information technology has been accelerating in recent years. Privacy and security are challenging issues for cloud users and providers. Objective: This work aims at ensuring secure validation of users and protecting their data during transmission in a public IoT-cloud environment. Existing security measures, however, fail because of their single level of security and their limited adaptability to large amounts of data and limited reliability; this work seeks to overcome these issues and achieve a better solution for vulnerable data. Method: The suggested method uses secure transmission in the cloud based on key-policy attribute-based encryption (KP-ABE). Initially, user authentication is verified; then the user data is encrypted with the KP-ABE algorithm. Finally, data validation and privacy preservation are done with Burrows-Abadi-Needham (BAN) logic. This verification shows that the proposed encryption is correct, secure, and efficient in preventing unauthorized access and data leakage, so that there is less chance of data or identity theft of a user; access control itself is performed by KP-ABE. Results: The method attains a maximum of 88.35% validation accuracy with a minimum encryption time of 8.78 ms, which is better than existing methods. The proposed mechanism is implemented in MATLAB, and its performance is evaluated in terms of encryption and decryption time, execution time, and validation accuracy. Conclusion: The proposed approach thus attains high IoT-cloud data security and increases the speed of validation and transmission with high accuracy, and it can be used for cyber data science processing.
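
The key-policy idea can be illustrated with a toy sketch: the ciphertext is labelled with attributes, the key embeds a policy over those attributes, and decryption is permitted only when the policy is satisfied. This is not a real pairing-based KP-ABE construction, and the hash-derived keystream below is a stand-in with no security claims; all names and attributes are assumptions.

```python
# Toy illustration of the key-policy access structure behind KP-ABE.
# NOT real KP-ABE: the crypto is stubbed with a hash-derived keystream.
import hashlib


def _keystream(secret, length):
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(secret + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


def encrypt_with_attributes(master_secret, plaintext, attributes):
    stream = _keystream(master_secret, len(plaintext))
    return {"attributes": set(attributes),
            "blob": bytes(a ^ b for a, b in zip(plaintext, stream))}


def make_key(master_secret, policy):
    # The "policy" is a predicate over the ciphertext's attribute set.
    return {"secret": master_secret, "policy": policy}


def decrypt(key, ciphertext):
    if not key["policy"](ciphertext["attributes"]):
        raise PermissionError("attributes do not satisfy the key policy")
    stream = _keystream(key["secret"], len(ciphertext["blob"]))
    return bytes(a ^ b for a, b in zip(ciphertext["blob"], stream))


master = b"illustrative master secret"
ct = encrypt_with_attributes(master, b"sensor reading: 42",
                             {"iot", "ward-3", "2024"})
doctor_key = make_key(master, lambda attrs: {"iot", "ward-3"} <= attrs)
print(decrypt(doctor_key, ct))  # succeeds: the policy is satisfied
```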


Sensors ◽  
2019 ◽  
Vol 19 (20) ◽  
pp. 4444 ◽  
Author(s):  
Gaurav Deep ◽  
Rajni Mohana ◽  
Anand Nayyar ◽  
P. Sanjeevikumar ◽  
Eklas Hossain

Cloud computing has made the software development process fast and flexible, but on the other hand it has contributed to increasing security attacks. Cloud companies may face insider attacks from the employees who manage their data, affecting the companies' reputation, because these employees have the advantage of accessing user data by interacting with the authentication mechanism. The primary aim of this research paper is to provide a novel secure authentication mechanism for cloud databases using Blockchain technology. Blockchain makes it difficult for an insider to change user login credential details in the user authentication process, and the insider cannot access the user authentication data due to the distributed ledger-based authentication scheme; the insider's activity can be traced and cannot be altered. Both insider and outsider users are authenticated using individual IDs and signatures, and user access control on the cloud database is also authenticated. The algorithm and theorem of the proposed mechanism are given to demonstrate its applicability and correctness. The proposed mechanism is tested with the Scyther formal verification tool against denial-of-service, impersonation, offline guessing, and replay attacks; the Scyther results show that the proposed methodology is secure and robust.
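
A hedged sketch of the ID-plus-signature flow described above: user public keys are kept in an append-only registry (standing in for the distributed ledger), and authentication verifies a signature over a fresh challenge. It assumes the third-party Python cryptography package; the protocol details and identifiers are illustrative, not the authors' exact scheme.

```python
# Sketch of ID-plus-signature authentication (illustrative, not the paper's
# protocol). Requires the third-party "cryptography" package.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Registry of (user_id -> public key), standing in for the ledger that
# stores authentication data.
registry = {}

def register(user_id, private_key):
    registry[user_id] = private_key.public_key()

def authenticate(user_id, sign_fn):
    challenge = os.urandom(32)          # a fresh nonce defeats replay
    signature = sign_fn(challenge)
    try:
        registry[user_id].verify(signature, challenge)
        return True
    except (KeyError, InvalidSignature):
        return False

insider_key = Ed25519PrivateKey.generate()
register("dba-007", insider_key)
print(authenticate("dba-007", insider_key.sign))                    # True
print(authenticate("dba-007", Ed25519PrivateKey.generate().sign))   # False
```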


Author(s):  
P.L. RINI ◽  
Y.GOLD ANAND. N

A major feature of cloud services is that user data are processed remotely among machines. However, users' fear of losing control of their own data, particularly financial and health data, can become a significant barrier to the wide adoption of cloud services. To avoid this problem, we provide a novel approach, namely Cloud Information Accountability (CIA), for clients, so that only authorized clients can access the data in the cloud. The data owner stores data in JAR format, so a client can access the data only with the data owner's permission. To strengthen the user's control, a distributed auditing mechanism with push and pull modes is also provided. The Base64 encoding algorithm is used to encode the JAR file in order to protect it from attackers. A log is maintained and sent periodically to the data owner.
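
The encoding and logging steps can be sketched as follows, assuming nothing beyond the Python standard library; the artifact bytes, client IDs, and transport are illustrative and this is not the CIA framework's actual code.

```python
# Illustrative sketch: Base64-encode the packaged artifact, append an entry
# to the access log on every access, and periodically push the log to the
# data owner. All names and the transport are stand-ins.
import base64
import json
import time


def encode_artifact(data):
    # Base64-encode the packaged JAR bytes before distribution.
    return base64.b64encode(data)


class AccessLogger:
    def __init__(self):
        self.entries = []

    def log_access(self, client_id, action):
        self.entries.append({"client": client_id, "action": action,
                             "time": time.time()})

    def push_to_owner(self, send):
        # 'send' stands in for any transport back to the data owner.
        send(json.dumps(self.entries))
        self.entries = []


packaged = encode_artifact(b"dummy jar bytes")
logger = AccessLogger()
logger.log_access("client-9", "read")
logger.push_to_owner(print)   # here the "owner" is simply stdout
```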


Author(s):  
M. P. Chitra ◽  
R. S. Ponmagal ◽  
N. P. G. Bhavani ◽  
V. Srividhya

Cloud computing has become popular among users in organizations and companies. Security and efficiency are the two major problems facing cloud service providers and their customers. Cloud data allocation facilities that allow groups of users to work together on shared data are among the most standard and effective working styles in enterprises. So, despite its advantages of scalability and flexibility, cloud storage service comes with confidentiality and security concerns. A direct method to defend user data is to encrypt the data stored in the cloud. In this research work, a secure cloud model (SCM) that contains a user authentication and data scheduling approach is proposed. An innovative digital signature with chaotic secure hashing (DS-CS) is used for user authentication, followed by an enhanced work scheduling scheme based on an improved genetic algorithm to reduce the execution cost.
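
As an illustration of the scheduling half of the model, here is a small genetic algorithm that assigns tasks to machines so as to reduce overall execution cost; the task costs, population size, and cost model are assumptions, not values taken from the paper.

```python
# Tiny genetic algorithm for task-to-machine assignment (illustrative only).
import random

TASK_COST = [4, 2, 7, 3, 5, 1]        # cost units per task (assumed)
N_MACHINES = 3

def fitness(assignment):
    load = [0.0] * N_MACHINES
    for task, machine in enumerate(assignment):
        load[machine] += TASK_COST[task]
    return max(load)                   # makespan: lower is better

def evolve(pop_size=30, generations=100, mutation_rate=0.1):
    pop = [[random.randrange(N_MACHINES) for _ in TASK_COST]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(TASK_COST))
            child = a[:cut] + b[cut:]                    # one-point crossover
            if random.random() < mutation_rate:          # random reassignment
                child[random.randrange(len(child))] = random.randrange(N_MACHINES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```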


Author(s):  
Wolfgang Hommel ◽  
Michael Grabatin ◽  
Stefan Metzger ◽  
Daniela Pöhn

Accessing remote IT services through identity federations (IFs) is based on solid technical protocols such as the Security Assertion Markup Language (SAML) and OpenID Connect. However, reliable delegated user authentication and authorization also pose organizational challenges regarding the quality management of user data. Level of Assurance (LoA) concepts have been adapted and applied to IFs, but their inhomogeneous proliferation bears the risk of aggravating instead of simplifying the manual work steps. This risk is increased when IT services are provided for multiple or dynamically set up IFs. This article presents a novel LoA management approach that has been designed for a high degree of automation, adopts the approach of dynamic metadata exchange used by GÉANT-TrustBroker, and exemplifies its usage.
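
One way such an automated check might look is sketched below: LoA labels from different frameworks are mapped onto a common numeric scale and compared against a service's requirement when metadata is exchanged. The mapping table, labels, and thresholds are illustrative assumptions and are not taken from the article or from GÉANT-TrustBroker.

```python
# Illustrative LoA normalization and admission check during dynamic
# metadata exchange. The scale and labels are assumptions.
LOA_SCALE = {
    "refeds-cappuccino": 1,
    "refeds-espresso": 2,
    "kantara-al2": 2,
    "eidas-substantial": 3,
    "eidas-high": 4,
}

def normalize_loa(label):
    # Map a framework-specific LoA label onto a common numeric scale.
    return LOA_SCALE.get(label.lower(), 0)

def admit_idp(idp_metadata, required_loa):
    # Decide whether an identity provider's asserted LoA meets the
    # service provider's requirement.
    offered = max((normalize_loa(lab) for lab in idp_metadata.get("loa", [])),
                  default=0)
    return offered >= normalize_loa(required_loa)

idp = {"entity_id": "https://idp.example.org", "loa": ["kantara-al2"]}
print(admit_idp(idp, "refeds-cappuccino"))   # True: 2 >= 1
print(admit_idp(idp, "eidas-high"))          # False: 2 < 4
```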


Author(s):  
Steffen Kläbe ◽  
Kai-Uwe Sattler ◽  
Stephan Baumann

Cloud data warehouse systems lower the barrier to access data analytics. These applications often lack a database administrator and integrate data from various sources, potentially leading to data not satisfying strict constraints. Automatic schema optimization in self-managing databases is difficult in these environments without prior data cleaning steps. In this paper, we focus on constraint discovery as a subtask of schema optimization. Perfect constraints might not exist in these unclean datasets due to a small set of values violating the constraints. Therefore, we introduce the concept of a generic PatchIndex structure, which handles exceptions to given constraints and enables database systems to define these approximate constraints. We apply the concept to the environment of distributed databases, providing parallel index creation approaches and optimization techniques for parallel queries using PatchIndexes. Furthermore, we describe heuristics for automatic discovery of PatchIndex candidate columns and prove the performance benefit of using PatchIndexes in our evaluation.
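
The PatchIndex idea can be sketched as an approximate uniqueness constraint whose violating rows are kept in an explicit exception set; queries can use the fast "unique" path for clean rows and fall back only for the patched ones. Function names and the exception-ratio threshold below are illustrative, not the paper's API.

```python
# Sketch of an approximate uniqueness constraint with an exception set.
from collections import Counter

def build_patch_index(values, max_exception_ratio=0.01):
    counts = Counter(values)
    exceptions = {i for i, v in enumerate(values) if counts[v] > 1}
    if len(exceptions) > max_exception_ratio * len(values):
        return None                      # column is not "almost unique"
    return {"constraint": "unique", "exceptions": exceptions}

def lookup(values, patch_index, key):
    # Rows outside the exception set can be served by the unique fast path;
    # only the patched rows need the fallback scan.
    matches = [i for i, v in enumerate(values) if v == key]
    clean = [i for i in matches if i not in patch_index["exceptions"]]
    patched = [i for i in matches if i in patch_index["exceptions"]]
    return clean, patched

col = [10, 11, 12, 13, 11]               # one duplicated value (11)
idx = build_patch_index(col, max_exception_ratio=0.5)
print(idx["exceptions"])                  # {1, 4}
print(lookup(col, idx, 12))               # ([2], [])
print(lookup(col, idx, 11))               # ([], [1, 4])
```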


2017 ◽  
Vol 22 (S1) ◽  
pp. 1991-1999 ◽  
Author(s):  
Cheol-Joo Chae ◽  
Ki-Bong Kim ◽  
Han-Jin Cho

2017 ◽  
Vol 7 (1.1) ◽  
pp. 64 ◽  
Author(s):  
S. Renu ◽  
S.H. Krishna Veni

Cloud computing services and their security issues are growing exponentially with time. All CSPs provide the utmost security, but the issues still exist, and a number of technologies and methods have emerged and proven futile day by day. In order to overcome this situation, we have proposed a data storage security system using a binary tree approach. The entire services of the binary tree are provided by a Trusted Third Party (TTP); the TTP is a government or reputed organization which helps protect user data from unauthorized access and disclosure. The security services are designed and implemented by the TTP and are executed at the user side. Data classification, data encryption, and data storage are the three vital stages of the security services. An automated file classifier classifies unorganized files into four different categories: Sensitive, Private, Protected, and Public. Applied cryptographic techniques are used for data encryption. File splitting and multiple-cloud storage techniques are used for data outsourcing, which reduces security risks considerably. This technique offers file protection even when the CSPs are compromised.
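
Two of the three stages can be sketched as follows, assuming only the Python standard library; the classification keywords, number of splits, and storage layout are illustrative assumptions rather than the proposed system's actual rules.

```python
# Illustrative sketch: rule-based classification into the four categories,
# and splitting a file across several cloud stores so that no single
# provider holds a readable copy. Keywords and split counts are assumed.
SENSITIVE_WORDS = {"password", "aadhaar", "account"}
PRIVATE_WORDS = {"salary", "medical"}
PROTECTED_WORDS = {"internal", "draft"}

def classify(text):
    words = set(text.lower().split())
    if words & SENSITIVE_WORDS:
        return "Sensitive"
    if words & PRIVATE_WORDS:
        return "Private"
    if words & PROTECTED_WORDS:
        return "Protected"
    return "Public"

def split_across_clouds(blob, n_parts):
    # Round-robin byte split across n_parts cloud stores.
    parts = [bytearray() for _ in range(n_parts)]
    for i, byte in enumerate(blob):
        parts[i % n_parts].append(byte)
    return [bytes(p) for p in parts]

def reassemble(parts):
    total = sum(len(p) for p in parts)
    return bytes(parts[i % len(parts)][i // len(parts)] for i in range(total))

doc = "internal draft of salary revision"
print(classify(doc))                       # Private ("salary" outranks "internal")
chunks = split_across_clouds(doc.encode(), 3)
assert reassemble(chunks) == doc.encode()
```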


Author(s):  
Katarina Grolinger ◽  
Emna Mezghani ◽  
Miriam A. M. Capretz ◽  
Ernesto Exposito

Decision-making in disaster management requires information gathering, sharing, and integration by means of collaboration on a global scale and across governments, industries, and communities. A large volume of heterogeneous data is available; however, current data management solutions offer few or no integration capabilities and limited potential for collaboration. Moreover, recent advances in NoSQL, cloud computing, and Big Data open the door for new solutions in disaster data management. This chapter presents a Knowledge as a Service (KaaS) framework for disaster cloud data management (Disaster-CDM), with the objectives of facilitating information gathering and sharing, storing large amounts of disaster-related data, facilitating search, and supporting interoperability and integration. In the Disaster-CDM approach, NoSQL data stores provide storage reliability and scalability, while service-oriented architecture achieves flexibility and extensibility. The contribution of Disaster-CDM is demonstrated by its integration capabilities, using examples of full-text search and querying services.
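
A minimal sketch of the kind of service layer described above: heterogeneous disaster reports kept in a document-style (NoSQL-like) store behind a simple full-text search service. The in-memory store and all field names are illustrative stand-ins, not Disaster-CDM's implementation.

```python
# Illustrative document store with a tiny inverted index for full-text search.
import re
from collections import defaultdict

class DocumentStore:
    def __init__(self):
        self.docs = {}
        self.index = defaultdict(set)          # term -> document ids

    def put(self, doc_id, document):
        self.docs[doc_id] = document
        text = " ".join(str(v) for v in document.values())
        for term in re.findall(r"\w+", text.lower()):
            self.index[term].add(doc_id)

    def search(self, query):
        terms = [t.lower() for t in query.split()]
        ids = set.intersection(*(self.index.get(t, set()) for t in terms))
        return [self.docs[i] for i in ids]

store = DocumentStore()
store.put("rpt-1", {"source": "fire dept", "text": "Bridge closed after flood"})
store.put("rpt-2", {"source": "simulation", "text": "Flood model for river basin"})
print(store.search("flood bridge"))            # only rpt-1 matches both terms
```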

