RETRIEVAL BY CONSTRUCTION: A TRACEABILITY TECHNIQUE TO SUPPORT VERIFICATION AND VALIDATION OF UML FORMALIZATIONS

Author(s):  
Min Deng
R. E. K. Stirewalt
Betty H. C. Cheng

Recently, there has been growing interest in formalizing UML, thereby enabling rigorous analysis of its many graphical diagrams. Two obstacles currently limit the adoption and use of UML formalizations in practice. The first is the need to verify the consistency of artifacts under formalization; the second is the need to validate formalization approaches against domain-specific requirements. Techniques from the emerging field of requirements traceability hold promise for addressing these obstacles. This paper contributes a technique called retrieval by construction (RBC), which establishes traceability links between a UML model and a target model intended to denote its semantics under formalization. RBC provides an approach for structuring and representing the complex one-to-many links that are common between UML and target models under formalization. RBC also uses the notion of value identity in a novel way that enables link-retrieval criteria to be specified as generative procedures, which are a natural means of specifying UML formalizations. We validated the RBC technique in a tool framework called UBanyan, written in C++, and applied the tool to three case studies, one of which was obtained from industry. We assessed our results using two well-known traceability metrics: precision and recall. Preliminary investigations suggest that RBC can be a useful traceability technique for verifying and validating UML formalizations.
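To make the link structure and the metrics concrete: a one-to-many traceability link maps a single UML element to the set of target-model elements it induces, and a retrieved link set can then be scored against a hand-validated answer set using the standard precision and recall formulas. The sketch below is a minimal illustration of that idea only; it is not the UBanyan implementation, and all identifiers in it are hypothetical.

```cpp
#include <iostream>
#include <map>
#include <set>
#include <string>

// Hypothetical one-to-many traceability link: one UML element id
// maps to the set of target-model element ids it generates.
using LinkMap = std::map<std::string, std::set<std::string>>;

// precision = |retrieved ∩ correct| / |retrieved|
// recall    = |retrieved ∩ correct| / |correct|
static void score(const std::set<std::string>& retrieved,
                  const std::set<std::string>& correct) {
    std::size_t hits = 0;
    for (const auto& id : retrieved)
        if (correct.count(id)) ++hits;
    double precision = retrieved.empty() ? 0.0
                     : static_cast<double>(hits) / retrieved.size();
    double recall = correct.empty() ? 0.0
                  : static_cast<double>(hits) / correct.size();
    std::cout << "precision=" << precision << " recall=" << recall << '\n';
}

int main() {
    LinkMap links;
    // One UML class induces several target-model constructs (invented names).
    links["Class:Account"] = {"proc:acct_init", "chan:acct_ops", "var:acct_state"};

    // Compare the retrieved links against a hand-validated answer set.
    score(links["Class:Account"], {"proc:acct_init", "chan:acct_ops"});
}
```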

2021
Vol 17 (2)
pp. 1-27
Author(s):  
Morteza Hosseini
Tinoosh Mohsenin

This article presents a low-power, programmable, domain-specific manycore accelerator, the Binarized Neural Network Manycore Accelerator (BiNMAC), which adopts and efficiently executes binary-precision weight/activation neural network models. Such networks have compact models in which weights are constrained to a single bit, so several weights can be packed into one memory entry, minimizing the memory footprint. Packing weights also facilitates single-instruction, multiple-data execution with simple circuitry, maximizing performance and efficiency. BiNMAC has lightweight cores that support domain-specific instructions, and a router-based memory-access architecture that enables efficient implementation of layers in binary-precision weight/activation neural networks of appropriate size. With only 3.73% area and 1.98% average power overhead, respectively, novel instructions such as Combined Population-Count-XNOR, Patch-Select, and Bit-based Accumulation are added to the BiNMAC instruction set architecture; each replaces a frequently used function with a single clock cycle that would otherwise have taken 54, 4, and 3 clock cycles, respectively. Additionally, customized logic is added to every core to transpose 16×16-bit blocks of memory at the bit level, which expedites reshaping intermediate data so that it is well aligned for bitwise operations. A 64-cluster BiNMAC architecture is fully placed and routed in 65-nm TSMC CMOS technology, where a single cluster occupies 0.53 mm² with an average power of 232 mW at a 1-GHz clock frequency and 1.1 V. The 64-cluster architecture occupies 36.5 mm² and, if fully exploited, consumes a total power of 16.4 W and can perform 1,360 giga-operations per second (GOPS) while providing full programmability. To demonstrate its scalability, four binarized case studies, including ResNet-20 and LeNet-5 for high-performance image classification as well as a ConvNet and a multilayer perceptron for low-power physiological applications, were implemented on BiNMAC. The implementation results indicate that the population-count instruction alone speeds up performance by approximately 5×. When the other new instructions are added to a RISC machine that already has a population-count instruction, performance increases by 58% on average. To compare BiNMAC with commercial off-the-shelf platforms, the case studies were also implemented with their double-precision floating-point models on the NVIDIA Jetson TX2 SoC (CPU+GPU). The results indicate that, within a margin of ~2.1%–9.5% accuracy loss, BiNMAC on average outperforms the TX2 GPU by approximately 1.9× (or 7.5× with fabrication technology scaled) in energy consumption for image classification applications. In low-power settings, and within a margin of ~3.7%–5.5% accuracy loss relative to an ARM Cortex-A57 CPU implementation, BiNMAC is roughly 9.7×–17.2× (or 38.8×–68.8× with fabrication technology scaled) more energy efficient for physiological applications while meeting the application deadline.
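The payoff of packed binary weights is that a multiply-accumulate over 32 weight/activation bit pairs collapses into one XNOR and one population count, which is the operation a fused population-count-XNOR instruction accelerates. Below is a minimal software analogue of that technique, not BiNMAC code; it assumes a GCC/Clang-style __builtin_popcount intrinsic, and the function names are illustrative.

```cpp
#include <cstdint>
#include <iostream>

// Binarized dot product over nwords 32-bit words of packed {-1,+1}
// values encoded as {0,1} bits. XNOR marks matching bit pairs
// (product = +1); popcount tallies them.
// dot = matches - mismatches = 2*matches - total_bits.
int binary_dot(const uint32_t* w, const uint32_t* a, int nwords) {
    int matches = 0;
    for (int i = 0; i < nwords; ++i)
        matches += __builtin_popcount(~(w[i] ^ a[i])); // popcount of XNOR
    int total_bits = 32 * nwords;
    return 2 * matches - total_bits;
}

int main() {
    uint32_t w[2] = {0xF0F0F0F0u, 0x12345678u};
    uint32_t a[2] = {0xF0F0F0F0u, 0x12345678u};
    // Identical vectors: all 64 bit pairs match, so the dot product is +64.
    std::cout << binary_dot(w, a, 2) << '\n';
}
```

In hardware, fusing the XNOR and the population count into one instruction removes the instruction-fetch and register-traffic overhead of issuing them separately, which is where the reported cycle savings come from.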


Author(s):  
Anne Brüggemann-Klein
Tamer Demirel
Dennis Pagano
Andreas Tai

We report in this paper on a technique that we call reverse modeling. Reverse modeling starts with a conceptual model formulated in one or more generic modeling technologies such as UML or XML Schema. It abstracts from that model a custom, domain-specific meta-model and re-formulates the original model as an instance of the new meta-model. We demonstrate the value of reverse modeling with two case studies: one domain-specific meta-model underpins the design and user interface of a so-called instance generator for broadcast production metadata; another structures the translation of XML-encoded printer data for invoices into semantic XML. In a further section of this paper, we take a more general view and survey patterns that have evolved in the conceptual modeling of documents and data and that implicitly suggest sound meta-modeling constructs. Taken together, the two case studies and the survey of patterns bring us one step closer to our overarching goal of developing a meta-meta-modeling facility whose instances are custom meta-models for conceptual document and data models. The research presented in this paper puts forward a core set of elementary constructors that a meta-meta-modeling facility should provide.
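Re-formulating a model as an instance of a freshly abstracted meta-model is, at its core, a two-layer instance-of relationship. The fragment below sketches that layering as plain data structures; the Programme meta-class, its attributes, and the sample element are invented for illustration and are not drawn from the paper's case studies.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Meta-model layer: the constructs the custom, domain-specific
// meta-model offers (a hypothetical, minimal rendition).
struct MetaClass {
    std::string name;
    std::vector<std::string> attributes;
};

// Model layer: an element of the original conceptual model,
// re-expressed as an instance of the custom meta-model rather
// than as raw UML or XML Schema.
struct ModelElement {
    const MetaClass* type;            // the "instance-of" link
    std::vector<std::string> values;  // one value per attribute
};

int main() {
    // Custom meta-model abstracted from the original model (illustrative).
    MetaClass programme{"Programme", {"title", "broadcastDate"}};

    // The original model element, now typed by the new meta-model.
    ModelElement item{&programme, {"Evening News", "2013-05-01"}};

    std::cout << item.type->name << ": " << item.values[0] << '\n';
}
```

A meta-meta-modeling facility, in these terms, would standardize the constructors from which MetaClass-like definitions are built, so that each domain gets its own meta-model without reinventing the layering.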


1997
Vol 26 (522)
Author(s):  
Kjeld Høyer Mortensen

The thesis consists of six individual papers; the present paper contains the mandatory overview, while the remaining five papers are found separately from the overview. The five papers can roughly be divided into three areas of research: case studies, education, and extensions to the CPN method.

The primary purpose of the PhD thesis is to study the pragmatics, practical aspects, and intuition of CP-nets viewed as a formal method for describing and reasoning about concurrent systems. The perspective of pragmatics is our leitmotif, but in the context of CP-nets it also serves as a kind of hypothesis of this thesis. This overview paper summarises the research conducted as an investigation of that hypothesis in the three areas of case studies, education, and extensions.

The provocative claim of pragmatics should not be underestimated. In the present overview of the thesis, the CPN method is compared with a representative selection of formal methods. The graphics and simplicity of semantics, combined with the generality and expressiveness of the language constructs, make CP-nets a viable and attractive alternative to other formal methods. Similar graphical formal methods, such as SDL and Statecharts, typically have significantly more complicated semantics or are domain-specific languages.

The research conducted in this thesis opens a new complex of problems. Firstly, to gain wider acceptance of CP-nets in industry, it is important to identify fruitful areas for the effective introduction of the CPN method. Secondly, it would be useful to identify a few extensions to the CPN method, inspired by specific domains, for easier adoption in industry. Thirdly, which analysis methods will future systems make use of?


Author(s):  
Sam Attwood
Nana Onumah
Katie Paxton-Fear
Rupak Kharel

Secure development is a proactive approach to cyber security. Rather than building a technological solution and securing it in retrospect, secure development strives to embed good security practices throughout the development process and thereby reduce risk. Unfortunately, evidence suggests that secure development is complex, costly, and limited in practice. This article therefore introduces security-focused prototyping as a natural precursor to secure development: it embeds security at the beginning of the development process, can be used to discover domain-specific security requirements, and can help organisations navigate the complexity of secure development so that the resources and commitment it requires are better understood. Two case studies, one considering the creation of a bespoke web platform and the other considering the application layer of an Internet of Things system, verify the potential of the approach and, in particular, its ability to discover domain-specific security requirements. Future work could build on these results by conducting further case studies to verify the potential of security-focused prototyping and by investigating its capacity to reduce a broader, socio-technical kind of risk.


2019
Vol 6
pp. 12-41
Author(s):  
Chris Dijkshoorn
Victor De Boer
Lora Aroyo
Guus Schreiber

With the increase in cultural heritage data published online, the usefulness of data in this open context hinges on the quality and diversity of descriptions of collection objects. In many cases, existing descriptions are not sufficient for retrieval and research tasks, creating the need for more specific annotations. Eliciting such annotations is a challenge, however, since it often requires domain-specific knowledge. While crowdsourcing can be successfully used to execute simple annotation tasks, identifying people with the required expertise can prove troublesome for more complex, domain-specific tasks. Nichesourcing addresses this problem by tapping into the expert knowledge available in niche communities. This paper presents Accurator, a methodology for conducting nichesourcing campaigns for cultural heritage institutions by addressing communities, organizing events, and tailoring a web-based annotation tool to a domain of choice. The contribution of this paper is fourfold: 1) a nichesourcing methodology, 2) an annotation tool for experts, 3) validation of the methodology in three case studies, and 4) a dataset including the obtained annotations. The three domains of the case studies are birds on art, Bible prints, and fashion images. We compare the quality and quantity of the annotations obtained in the three case studies, showing that the nichesourcing methodology, in combination with the image annotation tool, can be used to collect high-quality annotations in a variety of domains. A user evaluation indicates that the tool is suitable and usable for domain-specific annotation tasks.

