CheckVML: A Tool for Model Checking Visual Modeling Languages

Author(s):  
Ákos Schmidt ◽  
Dániel Varró

Author(s):  
Giancarlo Guizzardi ◽  
Luis Ferreira Pires ◽  
Marten van Sinderen

Author(s):  
Yong-Jun Shin ◽  
Eunho Cho ◽  
Doo-Hwan Bae

Abstract
Proactive adaptation, in which adaptation for a system's reliable goal achievement is performed by predicting changes in the environment, is considered an effective alternative to reactive adaptation, in which adaptation is performed only after changes are observed. Predictions of environmental changes may be uncertain, so an adaptation's consequences must be verified and confirmed before execution. To resolve this uncertainty, probabilistic model checking (PMC) has been utilized to verify the effects of adaptation tactics on the goal of a self-adaptive system (SAS). However, PMC-based approaches suffer from the state-explosion problem when verifying complex SAS models and are limited to the modeling languages supported by the model checkers. In this paper, to overcome the limitations of the PMC-based approaches, we propose an efficient Proactive Adaptation approach based on STAtistical model checking (PASTA). Our approach allows SASs to mitigate the uncertainty of the future environment faster than the PMC-based approach, by producing statistically sufficient samples for the verification of adaptation tactics based on statistical model checking (SMC) algorithms. We provide algorithmic processes, a reference architecture, and an open-source implementation skeleton of PASTA for engineers to apply in SAS development. We evaluate PASTA on two SASs using actual data and show that PASTA is efficient compared to the PMC-based approach. We also provide a comparative analysis of the advantages and disadvantages of PMC- and SMC-based proactive adaptation to guide engineers' decision-making in SAS development.
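The core idea behind SMC-based verification, as described in the abstract, is to replace exhaustive state-space exploration with Monte Carlo sampling: simulate the tactic under the predicted environment enough times that the estimated goal-satisfaction probability meets a statistical guarantee. The following sketch is purely illustrative and is not the PASTA implementation; the `simulate_tactic` callable, the error bound `epsilon`, and the confidence parameter `delta` are assumptions, and the sample count uses the standard Chernoff-Hoeffding bound.

```python
import math
import random

def smc_estimate(simulate_tactic, epsilon=0.05, delta=0.05):
    """Estimate the probability that an adaptation tactic achieves the goal
    by Monte Carlo sampling. The Chernoff-Hoeffding bound
    n >= ln(2/delta) / (2 * epsilon^2) gives a sample count such that the
    estimate is within epsilon of the true probability with confidence
    1 - delta, regardless of the underlying model's state space."""
    n = math.ceil(math.log(2 / delta) / (2 * epsilon ** 2))
    successes = sum(1 for _ in range(n) if simulate_tactic())
    return successes / n

# Hypothetical environment: the tactic achieves the goal with probability 0.7.
random.seed(0)
estimate = smc_estimate(lambda: random.random() < 0.7)
```

Unlike PMC, the cost here depends only on `epsilon` and `delta` (about 738 samples for the defaults), not on the size of the SAS model, which is the efficiency argument the abstract makes.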


Author(s):  
Andreas Fellner ◽  
Mitra Tabaei Befrouei ◽  
Georg Weissenbacher

Abstract
We present a new method for model-based mutation-driven test case generation. Mutants are generated by making small syntactic modifications to the model or source code of the system under test. A test case kills a mutant if the behavior of the mutant deviates from the original system when running the test. In this work, we use hyperproperties—which can express relations between multiple executions—to formalize different notions of killing for both deterministic and non-deterministic models. The resulting hyperproperties are universal in the sense that they apply to arbitrary reactive models and mutants. Moreover, an off-the-shelf model checking tool for hyperproperties can be used to generate test cases. Furthermore, we propose solutions to overcome the limitations of current model checking tools via a model transformation and a bounded SMT encoding. We evaluate our approach on a number of models expressed in two different modeling languages by generating tests using a state-of-the-art mutation testing tool.
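The killing condition in the abstract relates two executions: a test kills a mutant exactly when the original and the mutated system produce different observable behavior on the same input. The toy sketch below is an assumption-laden illustration, not the paper's hyperproperty encoding; the functions `original`, `mutant`, and `kills` are hypothetical names chosen for the example.

```python
def original(x):
    """Original system under test: returns the absolute value."""
    return abs(x)

def mutant(x):
    """Mutant produced by a small syntactic modification:
    the call to abs() has been deleted."""
    return x

def kills(test_input, orig, mut):
    """A test input kills the mutant if the mutant's observable
    output deviates from the original system's output."""
    return orig(test_input) != mut(test_input)

# For non-negative inputs the two systems agree, so such tests do not
# kill the mutant; a negative input exposes the deviation and kills it.
```

Note that deciding killing requires comparing *pairs* of executions (original vs. mutant), which is why ordinary trace properties do not suffice and the paper resorts to hyperproperties.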

