Security-Focused Prototyping: A Natural Precursor to Secure Development

Author(s):  
Sam Attwood ◽  
Nana Onumah ◽  
Katie Paxton-Fear ◽  
Rupak Kharel

Secure development is a proactive approach to cyber security. Rather than building a technological solution and then securing it in retrospect, secure development strives to embed good security practices throughout the development process and thereby reduce risk. Unfortunately, evidence suggests secure development is complex, costly, and limited in practice. This article therefore introduces security-focused prototyping as a natural precursor to secure development: it embeds security at the beginning of the development process, can be used to discover domain-specific security requirements, and can help organisations navigate the complexity of secure development so that the resources and commitment it requires are better understood. Two case studies, one considering the creation of a bespoke web platform and the other the application layer of an Internet of Things system, verify the potential of the approach and, in particular, its ability to discover domain-specific security requirements. Future work could build on these results by conducting further case studies to verify the potential of security-focused prototyping, and by investigating its capacity to serve as a tool for reducing a broader, socio-technical, kind of risk.

Author(s):  
Basel Katt ◽  
Nishu Prasher

Security assurance is the confidence that a system meets its security requirements and is resilient against security vulnerabilities and failures. Existing approaches can be characterized as (1) qualitative in nature, (2) largely manual, (3) very costly, (4) development-process oriented, and (5) treating all security requirements within a domain equally for all applications, regardless of context. In this chapter, the authors propose a security assurance framework and its assurance evaluation process. The framework and process depend on quantitative security assurance metrics, which were also developed. The proposed metric considers both security requirements and vulnerabilities. Weights are introduced into the security requirement metric to capture the importance of the security requirements that need to be fulfilled. The framework, with the proposed quantitative assurance metrics, is evaluated and validated using two field case studies of operational REST APIs that belong to and are used by Statistics Norway.
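The abstract does not reproduce the metric itself. As a rough illustration of the general shape of such a score, the Python sketch below combines weighted requirement fulfilment with a vulnerability penalty; the 0-1 fulfilment scale, the example weights, and the combination rule are illustrative assumptions, not the formulas defined by Katt and Prasher.

```python
# Illustrative sketch of a weighted, quantitative assurance score.
# The fulfilment scale, weights, and penalty rule are assumptions
# for demonstration, not the metric from the chapter.

def assurance_score(requirements, vulnerabilities, max_severity=10.0):
    """requirements: list of (weight, fulfilment), fulfilment in [0, 1].
    vulnerabilities: list of severity scores, e.g. CVSS base scores (0-10).
    Returns a score in [0, 1]: weighted requirement fulfilment,
    discounted by the average vulnerability severity."""
    total_weight = sum(w for w, _ in requirements)
    req_score = sum(w * f for w, f in requirements) / total_weight
    if vulnerabilities:
        penalty = sum(vulnerabilities) / (len(vulnerabilities) * max_severity)
    else:
        penalty = 0.0
    return req_score * (1.0 - penalty)

# Example: three requirements of unequal importance, two known findings.
reqs = [(5.0, 1.0),   # authentication: critical, fully met
        (3.0, 0.5),   # audit logging: important, partially met
        (1.0, 0.0)]   # rate limiting: minor, unmet
vulns = [7.5, 4.3]    # e.g. CVSS base scores of open findings
print(f"assurance score: {assurance_score(reqs, vulns):.2f}")  # 0.30
```

Weighting the requirements this way lets a critical but unmet control drag the score down far more than a minor one, which is the behaviour the authors' importance weights are intended to capture.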


Author(s):  
Keith M. Martin

This chapter considers eight applications of cryptography. These essentially act as case studies relating to all the previous material. For each application, we identify the security requirements, the application constraints, the choice of cryptography used, and the ways in which the keys are managed. We begin with the SSL/TLS protocols used to secure Internet communications. We then examine the cryptography used in Wi-Fi networks, showing that early cryptographic design mistakes have subsequently been corrected. We then examine the evolving cryptography used to secure mobile telecommunications. This is followed by a discussion of the cryptography that underpins the security of payment card transactions. We look at the cryptography of video broadcasting and identity cards. We then examine the cryptography behind the Tor project, which uses cryptography to support anonymous communication on the Internet. Finally, we examine the clever cryptographic design of Bitcoin, showing how the use of cryptography can facilitate digital currency.
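As a concrete taste of the kind of primitive such applications rely on, the sketch below encrypts and authenticates a record with AES-GCM, the style of authenticated-encryption cipher negotiated by modern TLS cipher suites. It uses the pyca/cryptography package and is a generic illustration, not an example taken from the chapter.

```python
# AES-GCM authenticated encryption, the style of symmetric cipher
# negotiated by TLS 1.2/1.3 suites. Requires the pyca/cryptography
# package (pip install cryptography). Generic illustration only.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)  # in TLS, derived via key exchange
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # 96-bit nonce; must never repeat per key

plaintext = b"GET / HTTP/1.1"
associated_data = b"record-header"         # authenticated but not encrypted

ciphertext = aesgcm.encrypt(nonce, plaintext, associated_data)
# Decryption verifies the authentication tag; tampering raises InvalidTag.
assert aesgcm.decrypt(nonce, ciphertext, associated_data) == plaintext
```

The single call provides both confidentiality and integrity, which is why authenticated encryption has displaced the separate encrypt-then-MAC constructions used in earlier protocol versions.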


2021 ◽  
Vol 17 (2) ◽  
pp. 1-27
Author(s):  
Morteza Hosseini ◽  
Tinoosh Mohsenin

This article presents a low-power, programmable, domain-specific manycore accelerator, the Binarized neural Network Manycore Accelerator (BiNMAC), which adopts and efficiently executes binary precision weight/activation neural network models. Such networks have compact models in which weights are constrained to a single bit, so several can be packed into one memory entry, minimizing the memory footprint. Packing weights also facilitates single-instruction, multiple-data execution with simple circuitry, maximizing performance and efficiency. The proposed BiNMAC has lightweight cores that support domain-specific instructions, and a router-based memory access architecture that helps with efficient implementation of layers in binary precision weight/activation neural networks of proper size. With only 3.73% area and 1.98% average power overhead, respectively, novel instructions such as Combined Population-Count-XNOR, Patch-Select, and Bit-based Accumulation are added to the instruction set architecture of the BiNMAC, each replacing with 1 clock cycle the execution of a frequently used function that would otherwise have taken 54, 4, and 3 clock cycles, respectively. Additionally, customized logic is added to every core to transpose 16×16-bit blocks of memory on a bit-level basis, which expedites reshaping intermediate data to be well aligned for bitwise operations. A 64-cluster architecture of the BiNMAC is fully placed and routed in 65-nm TSMC CMOS technology, where a single cluster occupies an area of 0.53 mm² with an average power of 232 mW at a 1-GHz clock frequency and 1.1 V. The 64-cluster architecture takes 36.5 mm² of area and, if fully exploited, consumes a total power of 16.4 W and can perform 1,360 Giga Operations Per Second (GOPS) while providing full programmability. To demonstrate its scalability, four binarized case studies, including ResNet-20 and LeNet-5 for high-performance image classification as well as a ConvNet and a multilayer perceptron for low-power physiological applications, were implemented on BiNMAC. The implementation results indicate that the population-count instruction alone can expedite performance by approximately 5×. When the other new instructions are added to a RISC machine with an existing population-count instruction, performance increases by 58% on average. To compare the performance of the BiNMAC with other commercial off-the-shelf platforms, the case studies with their double-precision floating-point models were also implemented on the NVIDIA Jetson TX2 SoC (CPU+GPU). The results indicate that, within a margin of ∼2.1%–9.5% accuracy loss, BiNMAC on average outperforms the TX2 GPU by approximately 1.9× (or 7.5× with fabrication technology scaled) in energy consumption for image classification applications. In low-power settings, and within a margin of ∼3.7%–5.5% accuracy loss compared to an ARM Cortex-A57 CPU implementation, BiNMAC is roughly 9.7×–17.2× (or 38.8×–68.8× with fabrication technology scaled) more energy efficient for physiological applications while meeting the application deadline.
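The arithmetic trick that makes instructions like Combined Population-Count-XNOR so valuable is that a dot product of ±1-valued vectors packed into machine words collapses to an XNOR followed by a population count. The Python sketch below demonstrates that equivalence; the 16-bit word width and the packing convention (bit 1 for +1, bit 0 for -1) are illustrative assumptions, not BiNMAC's actual datapath.

```python
# XNOR-popcount dot product, the core primitive a binarized accelerator
# such as BiNMAC executes in a single combined instruction.
import random

WIDTH = 16
MASK = (1 << WIDTH) - 1

def pack(values):
    """Pack a list of +/-1 values into an integer bit mask (1 -> +1)."""
    bits = 0
    for i, v in enumerate(values):
        if v == 1:
            bits |= 1 << i
    return bits

def xnor_popcount_dot(a_bits, b_bits, n=WIDTH):
    """Dot product of two packed +/-1 vectors of length n."""
    matches = bin(~(a_bits ^ b_bits) & MASK).count("1")  # XNOR then popcount
    return 2 * matches - n  # each match contributes +1, each mismatch -1

a = [random.choice((-1, 1)) for _ in range(WIDTH)]
b = [random.choice((-1, 1)) for _ in range(WIDTH)]
assert xnor_popcount_dot(pack(a), pack(b)) == sum(x * y for x, y in zip(a, b))
```

On hardware, the XNOR and popcount of a whole word finish in one cycle, which is why fusing them into a single instruction replaces the 54-cycle software sequence reported above.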


Author(s):  
MIN DENG ◽  
R. E. K. STIREWALT ◽  
BETTY H. C. CHENG

Recently, there has been growing interest in formalizing UML, thereby enabling rigorous analysis of its many graphical diagrams. Two obstacles currently limit the adoption and use of UML formalizations in practice. The first is the need to verify the consistency of artifacts under formalization. The second is the need to validate formalization approaches against domain-specific requirements. Techniques from the emerging field of requirements traceability hold promise for addressing these obstacles. This paper contributes a technique called retrieval by construction (RBC), which establishes traceability links between a UML model and a target model intended to denote its semantics under formalization. RBC provides an approach for structuring and representing the complex one-to-many links that are common between UML and target models under formalization. RBC also uses the notion of value identity in a novel way that enables the specification of link-retrieval criteria using generative procedures. These procedures are a natural means of specifying UML formalizations. We have validated the RBC technique in a tool framework called UBanyan, written in C++. We applied the tool to three case studies, one of which was obtained from industry. We have also assessed our results using two well-known traceability metrics: precision and recall. Preliminary investigations suggest that RBC can be a useful traceability technique for validating and verifying UML formalizations.
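For readers unfamiliar with how traceability recovery is scored, the short sketch below computes the two metrics named above over a set of recovered links versus a ground-truth set; the (source, target) pair encoding of a link is an illustrative assumption, not the paper's representation.

```python
# Precision and recall as used to assess a traceability technique:
#   precision = |retrieved ∩ correct| / |retrieved|
#   recall    = |retrieved ∩ correct| / |correct|

def precision_recall(retrieved, correct):
    retrieved, correct = set(retrieved), set(correct)
    hits = retrieved & correct
    precision = len(hits) / len(retrieved) if retrieved else 1.0
    recall = len(hits) / len(correct) if correct else 1.0
    return precision, recall

# Hypothetical links from UML model elements to target-model constructs.
ground_truth = {("ClassA", "procA"), ("ClassB", "procB"), ("ClassC", "procC")}
recovered = {("ClassA", "procA"), ("ClassB", "procX")}
p, r = precision_recall(recovered, ground_truth)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.50 recall=0.33
```

High precision with low recall means the retrieved links are trustworthy but incomplete; the one-to-many links RBC targets make recall the harder of the two to achieve.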


Signals ◽  
2021 ◽  
Vol 2 (4) ◽  
pp. 803-819
Author(s):  
Nabin Chowdhury

As digital instrumentation in Nuclear Power Plants (NPPs) becomes increasingly complex, both attack vectors and defensive strategies are evolving based on new technologies and vulnerabilities. Continued efforts have been made to develop a variety of measures for the cyber defense of these infrastructures, which often consist of adapting security measures previously developed for other critical infrastructure sectors to the requirements of NPPs. That said, because these solutions are so recent, there is a lack of agreement or standardization regarding their adoption at an industrial level. To better understand the state of the art in NPP Cyber-Security (CS) measures, in this work we conduct a Systematic Literature Review (SLR) to identify scientific papers discussing CS frameworks, standards, guidelines, best practices, and any additional CS protection measures for NPPs. Our literature analysis showed that protecting the digital space in NPPs involves three main steps: (i) identification of critical digital assets; (ii) risk assessment and threat analysis; and (iii) establishment of measures for NPP protection based on the defense-in-depth model. To ensure the CS protection of these infrastructures, a holistic defense-in-depth approach is suggested in order to avoid excessive granularity and a lack of compatibility between different layers of protection. Additional research is needed to ensure that such a model is developed effectively and that it is based on the interdependencies of all security requirements of NPPs.


2012 ◽  
Vol 7 (5) ◽  
pp. 255-265
Author(s):  
Soo-Youl Park ◽  
Wook-Jin Choi ◽  
Bo-Heung Chung ◽  
Jeong-Nyeo Kim ◽  
Joo-Man Kim

Author(s):  
Sara de Freitas ◽  
Steve Jarvis

This chapter reviews some of the key research supporting the use of serious games for training in work contexts. The review indicates why serious games should be used to support training requirements, and in particular identifies “attitudinal change” in training as a key objective for deployment of serious games demonstrators. The chapter outlines a development approach for serious games and how it is being evaluated. Demonstrating this, the chapter proposes a game-based learning approach that integrates the use of a “four-dimensional framework”, outlines some key games principles, presents tools and techniques for supporting data collection and analysis, and considers a six-stage development process. The approach is then outlined in relation to a serious game for clinical staff concerned with infection control in hospitals and ambulances, which is being developed in a current research and development project. Survey findings from the target user group are presented and the use of tools and techniques explained in the context of the development process. The chapter proposes areas for future work and concludes that it is essential to use a specific development approach for supporting consistent game design, evaluation and efficacy for particular user groups.

