Secure Data Dissemination

2008 ◽  
pp. 1839-1864
Author(s):  
Elisa Bertino ◽  
Barbara Carminati ◽  
Elena Ferrari

In this chapter, we present the main security issues related to selective dissemination of information (SDI). More precisely, after providing an overview of the work carried out in this field, we focus on the security properties that a secure SDI system (SSDI system) must satisfy and on some of the strategies and mechanisms that can be used to ensure them. Since XML is today the emerging standard for data exchange over the Web, we cast our attention on Secure and Selective XML data dissemination (SSXD). As a result, we present an SSXD system providing a comprehensive solution for XML documents. We also consider an innovative architecture for data dissemination, by suggesting an SSXD system exploiting the third-party architecture, since this architecture is receiving growing attention as a new paradigm for data dissemination over the Web. In a third-party architecture, there is a distinction between the Owner and the Publisher of information. The Owner is the producer of the information, whereas Publishers are responsible for managing (a portion of) the Owner's information and for answering user queries. A relevant issue in this architecture is how the Owner can ensure secure dissemination of its data, even if the data are managed by a third party. Such a scenario requires a redefinition of the dissemination mechanisms developed for traditional SSXD systems, since the traditional techniques cannot be exploited in a third-party setting. For instance, consider traditional digital signature techniques, used to ensure data integrity and authenticity. In a third-party scenario, that is, a scenario where a third party may prune some of the nodes of the original document based on user queries, the traditional digital signature is not applicable, since its correctness is based on the requirement that the signing and verification processes are performed on exactly the same bits.
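The point about pruning is easy to see in code. The minimal Python sketch below is an illustration only, not the chapter's SSXD construction; the toy XML document and the Ed25519 key are assumptions. It signs a complete document at the Owner, prunes one node at the Publisher, and shows that verification of the pruned view fails.

```python
# Minimal sketch (not the chapter's SSXD scheme): why an ordinary digital
# signature breaks once a third-party Publisher prunes nodes from the
# signed XML document. The document and key here are illustrative only.
import xml.etree.ElementTree as ET
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Owner signs the complete document.
owner_key = Ed25519PrivateKey.generate()
full_doc = b"<orders><order id='1'>public</order><order id='2'>restricted</order></orders>"
signature = owner_key.sign(full_doc)

# Publisher prunes a node the querying user is not allowed to see.
tree = ET.fromstring(full_doc)
tree.remove(tree.find("order[@id='2']"))
pruned_doc = ET.tostring(tree)

# The user verifies the pruned view against the Owner's signature: it fails,
# because signing and verification no longer run over exactly the same bits.
try:
    owner_key.public_key().verify(signature, pruned_doc)
    print("signature valid")
except InvalidSignature:
    print("signature invalid on the pruned document")
```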


Cloud computing usage has increased greatly in the past decades, as the cloud offers many features for effectively storing, organizing and processing data. A major concern in the cloud is that security is weak and users require a verification process for data integrity. The Third Party Auditing (TPA) technique is applied to verify the integrity of data, and various TPA methods have been proposed for effective performance. The existing TPA methods perform poorly in terms of communication overhead and execution time. In this research, an Elliptic Curve Digital Signature (ECDS) scheme is proposed to increase the efficiency of the TPA. A bilinear mapping technique is used for the verification process without retrieving the data, which helps to reduce the communication overhead. The performance of ECDS is measured and compared with the existing method.
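The elliptic-curve building block can be sketched briefly. In the Python example below, the block contents, the auditing routine, and the use of the `cryptography` package are assumptions for illustration; the paper's bilinear mapping step, which lets the auditor verify without retrieving the data, is not reproduced.

```python
# Illustrative sketch only: per-block ECDSA tags that a third-party auditor
# can verify with the owner's public key. The paper's ECDS scheme additionally
# uses bilinear mappings so the auditor need not retrieve the blocks themselves;
# that pairing step is omitted here.
import hashlib
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

owner_key = ec.generate_private_key(ec.SECP256R1())
public_key = owner_key.public_key()

# Owner: compute a digest per data block and sign it before outsourcing.
blocks = [b"block-0 contents", b"block-1 contents"]
tags = [owner_key.sign(hashlib.sha256(b).digest(), ec.ECDSA(hashes.SHA256()))
        for b in blocks]

# Auditor: challenge the cloud for a block digest plus its tag and verify it.
def audit(block_digest: bytes, tag: bytes) -> bool:
    try:
        public_key.verify(tag, block_digest, ec.ECDSA(hashes.SHA256()))
        return True
    except Exception:
        return False

print(audit(hashlib.sha256(blocks[0]).digest(), tags[0]))   # True
print(audit(hashlib.sha256(b"tampered").digest(), tags[0]))  # False
```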


2019 ◽  
Vol 11 (11) ◽  
pp. 225 ◽  
Author(s):  
Yuling Chen ◽  
Jinyi Guo ◽  
Changlou Li ◽  
Wei Ren

In the big data era, data are envisioned as critical resources with various values, e.g., business intelligence, management efficiency, and financial evaluations. Data sharing is always mandatory for value exchange and profit promotion. Currently, certain big data markets have been created to facilitate data dissemination and coordinate data transactions, but such centralized management of data sharing must be assumed trustworthy with respect to data privacy and sharing fairness, which very likely imposes limitations such as joining admission, sharing efficiency, and extra costly commissions. To avoid these weaknesses, in this paper we propose a blockchain-based fair data exchange scheme, called FaDe. FaDe enables decentralized data sharing in an autonomous manner, guaranteeing in particular trade fairness, sharing efficiency, data privacy, and exchange automation. A fairness protocol based on bit commitment is proposed. An algorithm based on a blockchain script architecture for a smart contract, e.g., on a bitcoin virtual machine, is also proposed and implemented. Extensive analysis justifies that the proposed scheme can guarantee fair, efficient, and automatic data exchange without a trusted third party.
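The bit-commitment primitive underlying the fairness protocol can be illustrated compactly. The Python sketch below is a generic hash-based commitment (commit, then reveal and verify); it is not FaDe's actual protocol or its blockchain-script implementation.

```python
# Minimal hash-based bit commitment sketch; FaDe's actual fairness protocol
# and its blockchain-script implementation are more involved.
import hashlib
import secrets

def commit(bit: int) -> tuple[bytes, bytes]:
    """Commit to a bit; publish the digest, keep the nonce secret until reveal."""
    nonce = secrets.token_bytes(32)
    digest = hashlib.sha256(nonce + bytes([bit])).digest()
    return digest, nonce

def verify(digest: bytes, bit: int, nonce: bytes) -> bool:
    """Check a revealed (bit, nonce) pair against the published commitment."""
    return hashlib.sha256(nonce + bytes([bit])).digest() == digest

digest, nonce = commit(1)        # commit phase: digest is sent to the counterparty
print(verify(digest, 1, nonce))  # reveal phase: True
print(verify(digest, 0, nonce))  # a different bit does not verify: False
```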


Symmetry ◽  
2019 ◽  
Vol 11 (8) ◽  
pp. 969 ◽  
Author(s):  
Haowen Tan ◽  
Yuanzhao Song ◽  
Shichang Xuan ◽  
Sungbum Pan ◽  
Ilyong Chung

Nowadays, with the rapid advancement of both the upcoming 5G architecture and emerging Internet of Things (IoT) scenarios, Device-to-Device (D2D) communication provides a novel paradigm for mobile networking. By facilitating continuous, high data rate services between physically proximate devices without interconnection with access points (AP) or the service network (SN), the spectral efficiency of the 5G network can be drastically increased. However, due to its inherently open wireless communication, security issues and privacy risks in D2D communication remain unsolved in spite of its benefits and prosperous future. Hence, proper D2D authentication mechanisms among the D2D entities are of great significance. Moreover, the increasing proliferation of smartphones enables seamless collection and processing of biometric sensor data, which closely correspond to the user's unique behavioral characteristics. Based on these considerations, in this paper we present a secure certificateless D2D authentication mechanism intended for extreme scenarios. Under the stated assumptions, the key updating mechanism requires only a small modification on the SN side, while the decryption information of the user equipment (UEs) remains constant once the UEs are validated. Note that a symmetric key mechanism is adopted for the subsequent data transmission. Additionally, user activity data from smartphone sensors are analyzed for continuous authentication, which is conducted periodically after the initial validation. Note that in the assumed scenario, most of the UEs are out of the effective range of cellular networks; in this case, the UEs are capable of conducting data exchange without a cellular connection. Security analysis demonstrates that the proposed scheme provides adequate security properties as well as resistance to various attacks. Furthermore, performance analysis shows that the proposed scheme is efficient compared with state-of-the-art D2D authentication schemes.
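The symmetric-key data transmission mentioned above can be sketched as follows. In this minimal Python example the shared key is generated directly for illustration; the paper's certificateless authentication and key establishment between the UEs are not shown.

```python
# Minimal sketch of symmetric-key data transmission between two authenticated
# D2D peers using AES-GCM. The paper's certificateless authentication and key
# establishment are not shown; the shared key below is generated directly
# for illustration.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

shared_key = AESGCM.generate_key(bit_length=256)  # assumed to result from the D2D handshake
aead = AESGCM(shared_key)

# Sender UE encrypts a message; the nonce must be unique per message.
nonce = os.urandom(12)
associated_data = b"ue-a -> ue-b, session 42"  # authenticated but not encrypted
ciphertext = aead.encrypt(nonce, b"sensor report payload", associated_data)

# Receiver UE decrypts and authenticates in one step.
plaintext = AESGCM(shared_key).decrypt(nonce, ciphertext, associated_data)
print(plaintext)  # b'sensor report payload'
```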


2008 ◽  
pp. 1321-1338
Author(s):  
Elisa Bertino ◽  
Barbara Carminati ◽  
Elena Ferrari

UDDI registries are today the standard way of publishing information on Web services. They can be thought of as a structured repository of information that can be queried by clients to find the Web services that best fit their needs. Even if, at the beginning, UDDI was mainly conceived as a public registry without specific facilities for security, today security issues are becoming more and more crucial, due to the fact that data published in UDDI registries may be highly strategic and sensitive. In this paper, we focus on authenticity issues by proposing a method based on Merkle hash trees, which does not require the party managing the UDDI to be trusted with respect to authenticity. In the paper, besides giving all the details of the proposed solution, we show its benefits with respect to standard digital signature techniques.
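The Merkle hash tree technique itself is compact enough to sketch. The Python example below builds a root over a small list of entries and verifies a membership proof against it; it illustrates the general mechanism, not the paper's exact construction over UDDI registry data. The idea is that only the root needs to be signed by the trusted party, while each query answer can be returned together with the sibling hashes needed to recompute that root.

```python
# Minimal Merkle hash tree sketch: build a root over a list of entries and
# verify a membership proof against it. This illustrates the general technique;
# the paper's exact construction over UDDI registry data is not reproduced here.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:               # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves: list[bytes], index: int) -> list[tuple[bytes, bool]]:
    """Sibling hashes bottom-up; the flag says whether the sibling is on the right."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index + 1 if index % 2 == 0 else index - 1
        proof.append((level[sibling], index % 2 == 0))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf: bytes, proof: list[tuple[bytes, bool]], root: bytes) -> bool:
    node = h(leaf)
    for sibling, sibling_is_right in proof:
        node = h(node + sibling) if sibling_is_right else h(sibling + node)
    return node == root

entries = [b"service-A", b"service-B", b"service-C"]
root = merkle_root(entries)                                   # signed once by the trusted owner
print(verify(b"service-B", merkle_proof(entries, 1), root))   # True
print(verify(b"tampered", merkle_proof(entries, 1), root))    # False
```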


Author(s):  
Elisa Bertino ◽  
Barbara Carminati ◽  
Elena Ferrari

A Web service is a software system designed to support interoperable application-to-application interactions over the Internet. Web services are based on a set of XML standards, such as the Web Services Description Language (WSDL), the Simple Object Access Protocol (SOAP) and Universal Description, Discovery and Integration (UDDI). A key role in the Web service architecture is played by UDDI registries, i.e., structured repositories of information that can be queried by clients to find the Web services that best fit their needs. Even if, at the beginning, UDDI was mainly conceived as a public registry without specific facilities for security, today security issues are becoming more and more crucial, due to the fact that data published in UDDI registries may be highly strategic and sensitive. In this chapter, we focus on authenticity issues by proposing a method based on Merkle hash trees, which does not require the party managing the UDDI to be trusted with respect to authenticity. In the chapter, besides giving all the details of the proposed solution, we show its benefits with respect to standard digital signature techniques.


2015 ◽  
Vol 23 (1) ◽  
pp. 73-101 ◽  
Author(s):  
Eugene Ferry ◽  
John O Raw ◽  
Kevin Curran

Purpose – The interoperability of cloud data between web applications and mobile devices has vastly improved over recent years. The popularity of social media, smartphones and cloud-based web services has contributed to the level of integration that can be achieved between applications. This paper investigates the potential security issues of OAuth, an authorisation framework for granting third-party applications revocable access to user data. OAuth has rapidly become an interim de facto standard for protecting access to web API data. Vendors implemented OAuth before the open standard was officially published. To evaluate whether the OAuth 2.0 specification is truly ready for industry application, an entire OAuth client-server environment was developed and validated against the specification's threat model. The research also included analysing the security features of several popular OAuth-integrated websites and comparing them to the threat model. High-impact exploits leading to account hijacking were identified on a number of major online publications. It is hypothesised that the OAuth 2.0 specification can be a secure authorisation mechanism when implemented correctly. Design/methodology/approach – To analyse the security of OAuth implementations in industry, a list of the 50 most popular websites in Ireland was retrieved from the statistical website Alexa (Noureddine and Bashroush, 2011). Each site was analysed to identify whether it utilised OAuth. Out of the 50 sites, 21 were identified with OAuth support. Each vulnerability in the threat model was then tested against each OAuth-enabled site. To test the robustness of the OAuth framework, an entire OAuth environment was required. The proposed solution comprised three parts: a client application, an authorisation server and a resource server. The client application needed to consume OAuth-enabled services. The authorisation server had to manage access to the resource server. The resource server had to expose data from the database based on the authorisation the user was given by the authorisation server. It was decided that the client application would consume emails from Google's Gmail API. The authorisation and resource servers were modelled around a basic task-tracking web application. The client application would also consume task data from the developed resource server. The client application would also support single sign-on for Google and Facebook, as well as a developed identity provider, "MyTasks". The authorisation server delegated authorisation to the client application and stored cryptographic information for each access grant. The resource server validated the supplied access token via public-key cryptography and returned the requested data. Findings – Two sites out of the 21 were found to be susceptible to some form of attack, meaning that 10.5 per cent were vulnerable. In total, 18 per cent of the world's 50 most popular sites were in the list of 21 OAuth-enabled sites. The OAuth 2.0 specification is still very much in its infancy, but when implemented correctly, it can provide a relatively secure and interoperable authentication delegation mechanism. The IETF is currently addressing issues and expansions in its working drafts. Once a strict level of conformity is achieved between vendors and vulnerabilities are mitigated, it is likely that the framework will change the way we access data on the web and other devices.
Originality/value – OAuth is flexible, in that it offers extensions to support varying situations and existing technologies. A disadvantage of this flexibility is that new extensions typically bring new security exploits. Members of the IETF OAuth Working Group are constantly refining the draft specifications and identifying new threats to the expanding functionality. OAuth provides a flexible authentication mechanism to protect and delegate access to APIs. It solves the problem of password re-use across multiple accounts and stops users from having to disclose their credentials to third parties. Filtering access to information by scope and giving users the option to revoke access at any point gives them control of their data. OAuth does raise security concerns, such as undermining phishing education, but there are always going to be security issues with any authentication technology. Although several high-impact vulnerabilities were identified in industry, the developed solution confirms the hypothesis that a secure OAuth environment can be built when implemented correctly. Developers must conform to the defined specification and are responsible for validating their implementation against the given threat model. OAuth is an evolving authorisation framework. It is still in its infancy, and much work needs to be done in the specification to achieve stricter validation and vendor conformity. Vendor implementations need to become better aligned in order to provide a rich and truly interoperable authorisation mechanism. Once these issues are resolved, OAuth will be on track to become the definitive authentication standard on the web.
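The resource-server-side token check described above can be sketched briefly. The Python example below assumes RS256-signed JWT access tokens and uses PyJWT; the token format, claims, audience value and key file are illustrative, not the authors' exact design.

```python
# Minimal sketch of resource-server-side access token validation, assuming the
# authorisation server issues RS256-signed JWT access tokens. The token format,
# claims and audience below are illustrative, not the authors' exact design.
import jwt  # PyJWT
from jwt import InvalidTokenError

AUTH_SERVER_PUBLIC_KEY = open("authz_server_public.pem").read()  # hypothetical key file

def authorise_request(access_token: str, required_scope: str) -> bool:
    """Verify signature and expiry, then check that the grant covers the scope."""
    try:
        claims = jwt.decode(
            access_token,
            AUTH_SERVER_PUBLIC_KEY,
            algorithms=["RS256"],
            audience="mytasks-resource-server",  # hypothetical audience value
        )
    except InvalidTokenError:
        return False
    return required_scope in claims.get("scope", "").split()

# The resource server would call this before exposing task data, e.g.:
# if authorise_request(bearer_token, "tasks.read"): ...
```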


Author(s):  
Zaki Ullah ◽  
Samiullah Khan

The world is advancing very rapidly in terms of technology. In the next-generation Internet, the existing architecture needs to be upgraded from the host-centric networking paradigm to an information-centric networking (ICN) architecture. The unique aspect of information-centric networking is in-network caching. Due to this system augmentation and the in-network caching technique, the new architecture needs extremely strong content security to ensure system integrity and maintenance. 5G networks may be supported by the Information-Centric Network due to its high data transmission rate. In order to handle serious security issues such as attacks on the confidentiality, authentication and integrity of content, a Digital Signature based Access Control mechanism for Information-Centric Networks (DSAC) is proposed to enhance the security of ICN. Briefly, the new scheme uses a digital signature, a hash function, a Trusted Third Party (TTP) and a proxy TTP. The client requests content; after receiving the request, the content provider generates the content, encrypts it using the digital signature and a hash function with a random value 'k', and sends it to the TTP. After the signing process, the TTP sends the encryption hash key to the proxy TTP. In the proposed scheme, the authentication, confidentiality and integrity aspects of content security are improved.
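The hash-then-sign step hinted at in the abstract can be illustrated generically. The Python sketch below keys a hash of the content with a fresh random value k and signs the digest; it is a loose illustration only, not the DSAC message flow among provider, TTP and proxy TTP.

```python
# Generic hash-then-sign illustration, loosely following the abstract's flow of
# hashing content with a random value k and signing it; this is not the exact
# DSAC protocol among provider, TTP and proxy TTP.
import hmac, hashlib, secrets
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

provider_key = Ed25519PrivateKey.generate()

# Content provider: keyed hash of the content with a fresh random value k,
# then a digital signature over the digest.
content = b"named data object"
k = secrets.token_bytes(32)
digest = hmac.new(k, content, hashlib.sha256).digest()
signature = provider_key.sign(digest)

# A consumer holding (content, k, signature) and the provider's public key can
# check the integrity and origin of a cached copy.
recomputed = hmac.new(k, content, hashlib.sha256).digest()
provider_key.public_key().verify(signature, recomputed)  # raises if tampered
print("content verified")
```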


2021 ◽  
Vol 2 ◽  
pp. 1-9
Author(s):  
Stanislav Dakov ◽  
Anna Malinova

E-commerce security is part of the Web security problems that arise in all business information systems that operate over the Internet. In e-commerce security, however, the dimensions of web security (secrecy, integrity, and availability) are focused on protecting the consumer's and the e-store site's assets from unauthorized access, use, alteration, or destruction. The paper presents an overview of recent security issues in e-commerce applications and the usual points an attacker can target, such as the client (data, session, identity); the client computer; the network connection between the client and the web server; the web server; and third-party software vendors. Effective approaches and tools used to address different e-commerce security threats are discussed. Special attention is paid to Cross-Site Scripting (XSS), Cross-Site Request Forgery (CSRF), phishing attacks, SQL injection, man-in-the-middle attacks, bots, denial-of-service, encryption, firewalls, SSL digital signatures, security certificates, and PCI compliance. The research outlines and suggests a number of security solutions and best practices.
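Of the threats listed, SQL injection has a particularly compact defence. The Python sketch below contrasts string concatenation with a parameterized query; sqlite3 and the customers table stand in for an e-store's real database.

```python
# Minimal illustration of the SQL injection defence mentioned in the overview:
# parameterized queries instead of string concatenation. sqlite3 and the
# "customers" table here are stand-ins for the shop's real database schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (email TEXT, name TEXT)")
conn.execute("INSERT INTO customers VALUES ('a@shop.test', 'Alice')")

user_input = "' OR '1'='1"  # attacker-controlled value from a login or search form

# Vulnerable: the input is spliced into the SQL text and alters the query logic.
vulnerable = f"SELECT * FROM customers WHERE email = '{user_input}'"
print(conn.execute(vulnerable).fetchall())   # returns every row

# Safe: a placeholder keeps the input as data, never as SQL.
safe = "SELECT * FROM customers WHERE email = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # returns nothing
```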


Symmetry ◽  
2020 ◽  
Vol 12 (8) ◽  
pp. 1344
Author(s):  
Md Arif Hassan ◽  
Zarina Shukur ◽  
Mohammad Kamrul Hasan ◽  
Ahmed Salih Al-Khaleefa

Modern technology is becoming an essential element of financial trade. This review focuses on research on e-wallets and online payment, which are elements of an electronic payment system, in order to identify the patterns of use of these services. The research presents a review of 131 research articles on electronic payment published between 2010 and 2020 and uses a qualitative method to answer the research questions (RQ): RQ1: "What are the major security issues regarding the use of electronic payments?" and RQ2: "What security properties need to be complied with for secure electronic payments?" With the systematic literature review approach, the results show that interest in e-wallets and online payment has grown significantly during this period, and it was found that, with the increasing use of electronic payments, researchers are more focused on security issues. The results show that, to close the key gaps, electronic payment must have certain protection properties, namely availability, authorization, integrity, non-repudiation, authentication, and confidentiality. Nowadays, security problems in electronic payment are usually more demanding than general security problems on the web. These findings can enable electronic transaction providers to strengthen their security methods by addressing their security gaps, as required for the relevant services.

