A Threat Model for Security Specification in Security Evaluation by ISO/IEC 19791

2013 ◽  
Vol 21 (4) ◽  
pp. 624-631
Author(s):  
Guillermo Horacio Ramirez Caceres ◽  
Yoshimi Teshigawara

Security evaluation of the OAuth 2.0 framework

2015 ◽
Vol 23 (1) ◽  
pp. 73-101 ◽  
Author(s):  
Eugene Ferry ◽  
John O Raw ◽  
Kevin Curran

Purpose – The interoperability of cloud data between web applications and mobile devices has vastly improved in recent years. The popularity of social media, smartphones and cloud-based web services has contributed to the level of integration that can be achieved between applications. This paper investigates the potential security issues of OAuth, an authorisation framework for granting third-party applications revocable access to user data. OAuth has rapidly become an interim de facto standard for protecting access to web API data; vendors had implemented OAuth before the open standard was officially published. To evaluate whether the OAuth 2.0 specification is truly ready for industry application, an entire OAuth client-server environment was developed and validated against the specification's threat model. The research also included analysing the security features of several popular OAuth-integrated websites and comparing them against the threat model. High-impact exploits leading to account hijacking were identified on a number of major online publications. It is hypothesised that the OAuth 2.0 specification can be a secure authorisation mechanism when implemented correctly.

Design/methodology/approach – To analyse the security of OAuth implementations in industry, a list of the 50 most popular websites in Ireland was retrieved from the statistical website Alexa (Noureddine and Bashroush, 2011). Each site was analysed to determine whether it utilised OAuth; of the 50 sites, 21 were found to support it. Each vulnerability in the threat model was then tested against each OAuth-enabled site. To test the robustness of the OAuth framework, an entire OAuth environment was required. The proposed solution was composed of three parts: a client application, an authorisation server and a resource server. The client application needed to consume OAuth-enabled services. The authorisation server had to manage access to the resource server. The resource server had to expose data from the database based on the authorisation the user was given by the authorisation server. It was decided that the client application would consume emails from Google’s Gmail API. The authorisation and resource servers were modelled around a basic task-tracking web application. The client application would also consume task data from the developed resource server and support Single Sign-On for Google and Facebook, as well as a developed identity provider, “MyTasks”. The authorisation server delegated authorisation to the client application and stored cryptographic information for each access grant. The resource server validated the supplied access token via public-key cryptography and returned the requested data (a minimal sketch of this validation step follows the Findings).

Findings – Two sites out of the 21 were found to be susceptible to some form of attack, meaning that 10.5 per cent were vulnerable. In total, 18 per cent of the world’s 50 most popular sites were in the list of 21 OAuth-enabled sites. The OAuth 2.0 specification is still very much in its infancy, but when implemented correctly, it can provide a relatively secure and interoperable authorisation delegation mechanism. The IETF is currently addressing issues and expansions in its working drafts. Once a strict level of conformity is achieved between vendors and vulnerabilities are mitigated, the framework is likely to change the way we access data on the web and other devices.
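The abstract attributes the resource server's token checking to public-key cryptography. A minimal sketch of that validation step, assuming the access tokens are RS256-signed JWTs and using the PyJWT library; the token format, audience and issuer values are illustrative assumptions, not details taken from the paper:

```python
import jwt  # PyJWT; assumes RS256-signed JWT access tokens (an assumption,
            # not a detail from the paper)

def validate_access_token(token: str, auth_server_public_key_pem: str) -> dict:
    """Verify the token's signature with the authorisation server's public key
    and return its claims; raises jwt.InvalidTokenError on any failure."""
    return jwt.decode(
        token,
        auth_server_public_key_pem,
        algorithms=["RS256"],                   # reject tokens signed any other way
        audience="mytasks-resource",            # hypothetical resource identifier
        issuer="https://auth.mytasks.example",  # hypothetical issuer URL
    )

# Usage on the resource server: only return task data for a valid token.
# claims = validate_access_token(bearer_token, AUTH_SERVER_PUBLIC_KEY)
```

Verifying the signature locally means the resource server need not call back to the authorisation server on every request, which matches the split between the two servers described above.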
Originality/value – OAuth is flexible, in that it offers extensions to support varying situations and existing technologies. A disadvantage of this flexibility is that new extensions typically bring new security exploits. Members of the IETF OAuth Working Group are constantly refining the draft specifications and identifying new threats to the expanding functionality. OAuth provides a flexible authorisation mechanism to protect and delegate access to APIs. It solves the problem of password re-use across multiple accounts and stops users from having to disclose their credentials to third parties. Filtering access to information by scope and giving users the option to revoke access at any point puts users in control of their data (see the sketch below). OAuth does raise security concerns, such as undermining phishing education, but there will always be security issues with any authentication technology. Although several high-impact vulnerabilities were identified in industry, the developed solution supports the hypothesis that a secure OAuth environment can be built when the specification is implemented correctly. Developers must conform to the defined specification and are responsible for validating their implementation against the given threat model. OAuth is an evolving authorisation framework. It is still in its infancy, and much work needs to be done in the specification to achieve stricter validation and vendor conformity. Vendor implementations need to become better aligned in order to provide a rich and truly interoperable authorisation mechanism. Once these issues are resolved, OAuth will be on track to become the definitive authorisation standard on the web.
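The scope filtering and user-initiated revocation highlighted above map naturally onto a per-grant record kept by the authorisation server. A minimal sketch, assuming an in-memory grant store; all names here (GrantStore, "tasks.read", etc.) are hypothetical, not identifiers from the paper:

```python
# Minimal sketch of scope filtering and user-initiated revocation on the
# authorisation-server side. The store and scope names are hypothetical.
from dataclasses import dataclass

@dataclass
class Grant:
    user: str
    client_id: str
    scopes: set[str]
    revoked: bool = False

class GrantStore:
    def __init__(self) -> None:
        self._grants: dict[str, Grant] = {}  # token id -> grant record

    def issue(self, token_id: str, user: str, client_id: str,
              scopes: set[str]) -> None:
        self._grants[token_id] = Grant(user, client_id, set(scopes))

    def allows(self, token_id: str, required_scope: str) -> bool:
        """Serve a request only if the grant exists, has not been revoked
        by the user, and carries the required scope."""
        g = self._grants.get(token_id)
        return g is not None and not g.revoked and required_scope in g.scopes

    def revoke(self, token_id: str) -> None:
        """User-initiated revocation: the grant stops working immediately."""
        if token_id in self._grants:
            self._grants[token_id].revoked = True
```

This is the property the abstract emphasises: access is filtered by scope, and the user can withdraw it at any point without changing a password.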


Author(s):  
Jin HOKI ◽  
Kosei SAKAMOTO ◽  
Fukang LIU ◽  
Kazuhiko MINEMATSU ◽  
Takanori ISOBE

2018 ◽  
Author(s):  
Tuba Kiyan ◽  
Heiko Lohrke ◽  
Christian Boit

Abstract: This paper compares the three major semi-invasive optical approaches, Photon Emission (PE), Thermal Laser Stimulation (TLS) and Electro-Optical Frequency Mapping (EOFM), for contactless static random access memory (SRAM) content read-out on a commercial microcontroller. The advantages and disadvantages of these techniques are evaluated by applying them to a 1 KB SRAM in an MSP430 microcontroller. It is demonstrated that successful read-out depends strongly on the core voltage parameters for each technique. For PE, a better signal-to-noise ratio (SNR) and a shorter integration time are achieved by using the highest nominal core voltage. In TLS measurements, the core voltage needs to be applied externally via a current amplifier with a bias voltage slightly above nominal. EOFM can again use nominal core voltages; however, a modulation needs to be applied, and the amplitude of the modulated supply voltage signal has a strong effect on signal quality. Semi-invasive read-out of the memory content is necessary in order to remotely understand the organization of memory, which finds applications in hardware and software security evaluation, reverse engineering, defect localization, failure analysis, chip testing and debugging.
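As a back-of-the-envelope illustration of the PE trade-off above, assume shot-noise-limited photon counting, where the accumulated SNR grows as the square root of (emission rate × integration time); a higher core voltage raises the emission rate, so the same SNR is reached sooner. The rates and voltages below are made-up placeholders, not measured values from the paper:

```python
# Sketch: shot-noise-limited photon emission, SNR = sqrt(rate * t).
# Rearranged: the integration time needed for a target SNR is SNR^2 / rate.
# The emission rates and core voltages are hypothetical placeholders.

def integration_time_for_snr(target_snr: float, photon_rate_hz: float) -> float:
    return target_snr ** 2 / photon_rate_hz

for core_voltage, rate_hz in [(1.8, 50.0), (3.0, 400.0)]:  # hypothetical values
    t = integration_time_for_snr(target_snr=10.0, photon_rate_hz=rate_hz)
    print(f"Vcore = {core_voltage} V: ~{t:.2f} s to reach SNR 10")
```

The point is qualitative only: raising the core voltage within its nominal range shortens the integration time needed for a readable emission image.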


2010 ◽  
Vol 12 (2) ◽  
pp. 301-308 ◽  
Author(s):  
Caige SUN ◽  
Kaiwen ZHONG ◽  
Xulong LIU ◽  
Liang XIE
