Practical Approach in Verification of Security Systems Using Satisfiability Modulo Theories

2020 ◽  
Author(s):  
Agnieszka M Zbrzezny ◽  
Sabina Szymoniak ◽  
Miroslaw Kurkowski

Abstract: The paper presents a novel method for the verification of security protocols' (SPs) time properties. The new method uses a translation to the satisfiability modulo theories (SMT) problem. In our approach, we model protocol users' behaviours using networks of synchronized timed automata. Suitably specified correctness properties are defined as reachability properties of some chosen states in the automata network. Then, the network of timed automata and the property are translated to an SMT problem and checked using an SMT solver and a bounded model checking (BMC) algorithm. We consider the most important time properties of protocol executions using specially constructed time conditions. The new method was also implemented and experimentally evaluated on six well-known SPs. We also compared our new SMT-based technique with the corresponding SAT-based approach.
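The reachability question that the SMT encoding answers can be illustrated with a toy, discretized model in plain Python. The locations, synchronization labels, and time bound below are invented for illustration; the actual method delegates this bounded search to an SMT solver rather than enumerating states explicitly.

```python
# Toy network of two synchronized automata with an integer-valued clock,
# explored by bounded unrolling (the role the SMT formula plays after
# translation). All names are illustrative, not the paper's encoding.
TRANS = {
    ("idle", "idle"): [("send", ("wait", "recv"))],
    ("wait", "recv"): [("ack", ("done", "idle"))],
}
MAX_CLOCK = 5  # time bound from the protocol's correctness property

def bmc_reachable(target_locs, bound):
    """Unroll the transition relation up to `bound` steps and ask
    whether the target locations are reachable within the time bound."""
    frontier = {("idle", "idle", 0)}
    for _ in range(bound):
        nxt = set()
        for la, lb, clock in frontier:
            if (la, lb) == target_locs and clock <= MAX_CLOCK:
                return True
            for _label, (na, nb) in TRANS.get((la, lb), []):
                nxt.add((na, nb, clock + 1))  # each step takes one time unit
        frontier = nxt
    return any((la, lb) == target_locs and clock <= MAX_CLOCK
               for la, lb, clock in frontier)
```

With this toy model, the "attack" state `("done", "idle")` is reachable within two steps but not within one, mirroring how BMC finds counterexamples at increasing depths.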

10.29007/x7b4 ◽  
2018 ◽  
Author(s):  
Nikolaj Bjorner

Modern Satisfiability Modulo Theories (SMT) solvers are fundamental to many program analysis, verification, design and testing tools. They are a good fit for the domain of software and hardware engineering because they support many domains that are commonly used by the tools. The meaning of domains is captured by theories that can be axiomatized or supported by efficient theory solvers. Nevertheless, not all domains are handled by all solvers, and many domains and theories will never be native to any solver. We here explore different theories that extend Microsoft Research's SMT solver Z3's basic support. Some can be directly encoded or axiomatized; others make use of user theory plug-ins. Plug-ins are a powerful way for tools to supply their custom domains.


2008 ◽  
Vol 2008 ◽  
pp. 1-6 ◽  
Author(s):  
Tng C. H. John ◽  
Edmond C. Prakash ◽  
Narendra S. Chaudhari

This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. This method is inspired by genetic algorithms (Russell and Norvig, 2002) in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones. The path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This path plan generation method has the ability to generate varied high-quality paths, which is desirable in games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method to generate strategic team AI pathfinding plans.
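The generate-and-eliminate loop can be sketched as follows. The biased random walk and the fitness weights are illustrative assumptions, not the paper's actual planner; the point is the division of labour between probabilistic generation and fitness-based elimination.

```python
import random

random.seed(7)  # reproducible for the sketch

def random_path(start, goal, steps=20):
    """Probabilistic pathfinding: a random walk biased toward the goal."""
    path, pos = [start], start
    for _ in range(steps):
        dx = (goal[0] > pos[0]) - (goal[0] < pos[0])
        dy = (goal[1] > pos[1]) - (goal[1] < pos[1])
        move = random.choice([(dx, 0), (0, dy), (random.choice([-1, 1]), 0)])
        pos = (pos[0] + move[0], pos[1] + move[1])
        path.append(pos)
        if pos == goal:
            break
    return path

def fitness(path, goal):
    """Shorter paths that actually reach the goal score higher."""
    reached = path[-1] == goal
    return (1000 if reached else 0) - len(path)

def best_plans(start, goal, population=50, keep=5):
    """Generate many candidate plans, eliminate the low-quality ones."""
    plans = [random_path(start, goal) for _ in range(population)]
    plans.sort(key=lambda p: fitness(p, goal), reverse=True)
    return plans[:keep]
```

Because generation is stochastic, repeated calls yield different surviving plans of comparable quality, which is the source of the replay-value variation the paper highlights.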


2014 ◽  
Author(s):  
Dimitris Nikoloudis ◽  
Jim E. Pitts ◽  
José W. Saldanha

The accurate prediction of the conformation of Complementarity-Determining Regions (CDRs) is important in modelling antibodies for protein engineering applications. Specifically, the Canonical paradigm has proved successful in predicting the CDR conformation in antibody variable regions. It relies on canonical templates which detail allowed residues at key positions in the variable region framework, or in the CDR itself, for 5 of the 6 CDRs. No templates have yet been defined for the hypervariable CDR-H3; instead, reliable sequence rules have been devised for predicting the base of the CDR-H3 loop. Here a new method termed Disjoint Combinations Profiling (DCP) is presented, which contributes a considerable advance in the prediction of CDR conformations. This novel method is explained and compared with canonical templates and sequence rules in a 3-way blind prediction. DCP achieved 93% accuracy over 951 blind predictions and showed an improvement in cumulative accuracy compared to predictions with canonical templates or sequence rules. In addition to its overall improvement in prediction accuracy, it is suggested that DCP is open to better implementations in the future and that it can improve as more antibody structures are deposited in the databank. In contrast, it is argued that canonical templates and sequence rules may have reached their peak.


2003 ◽  
Vol 10 (49) ◽  
Author(s):  
Marius Mikucionis ◽  
Kim G. Larsen ◽  
Brian Nielsen

In this paper we present a framework, an algorithm and a new tool for online testing of real-time systems based on symbolic techniques used in the UPPAAL model checker. We extend the UPPAAL timed-automata network model to a test specification, which is used to generate test primitives and to check the correctness of system responses, including the timing aspects. We use timed trace inclusion as a conformance relation between system and specification to draw a test verdict. The test generation and execution algorithm is implemented as an extension to UPPAAL, and experiments are carried out to examine the correctness and performance of the tool. The experimental results are promising.
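The verdict-drawing side of timed conformance can be illustrated with a minimal sketch: a specification maps actions to allowed relative-time intervals, and an observed timed trace passes only if every event falls inside its interval. The actions and intervals are invented; UPPAAL's actual test specifications are timed automata, not lookup tables.

```python
# Hypothetical specification: action -> (earliest, latest) allowed delay.
SPEC = {
    "req":  (0.0, 1.0),
    "resp": (0.5, 2.0),
}

def verdict(observed):
    """PASS iff every observed (action, delay) pair is allowed by SPEC,
    i.e. the observed timed trace is included in the specification."""
    for action, delay in observed:
        bounds = SPEC.get(action)
        if bounds is None:
            return "FAIL"          # unexpected action
        lo, hi = bounds
        if not (lo <= delay <= hi):
            return "FAIL"          # right action, wrong time
    return "PASS"
```

An online tester applies such a check event by event while the system runs, which is what allows it to stop a test as soon as a timing violation is observed.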


Author(s):  
John Robinson P. ◽  
Henry Amirtharaj E. C.

Various attempts have been made by researchers to study the vagueness of data through Intuitionistic Fuzzy sets and Vague sets, and it has also been shown that Vague sets are Intuitionistic Fuzzy sets. However, there are algebraic and graphical differences between Vague sets and Intuitionistic Fuzzy sets. In this chapter, an attempt is made to define the correlation coefficient of Interval Vague sets lying in the interval [0,1], and a new method for computing the correlation coefficient of Interval Vague sets lying in the interval [-1,1], using α-cuts over the vague degrees through statistical confidence intervals, is also presented with an example. The proposed method produces a correlation coefficient in the form of an interval, derived from a trapezoidal fuzzy number constructed from the vague degrees. This chapter also aims to develop a new method based on the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) to solve MADM problems for Interval Vague Sets (IVSs). A TOPSIS algorithm is constructed on the basis of the relative-closeness coefficient computed from the correlation coefficient of IVSs. This novel method also identifies the positive and negative ideal solutions using the correlation coefficient of IVSs. A numerical illustration explains the proposed algorithms, and comparisons are made with some existing methods.
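For context, one classical point-valued correlation coefficient for intuitionistic fuzzy sets (due to Gerstenkorn and Mańko) can be computed as below; the chapter's interval-valued construction generalizes this idea to produce an interval rather than a single number. The sample sets are invented for illustration.

```python
import math

def ifs_correlation(A, B):
    """Correlation of two intuitionistic fuzzy sets over the same universe,
    each given as a list of (membership, non-membership) pairs."""
    C = sum(ma * mb + na * nb for (ma, na), (mb, nb) in zip(A, B))
    TA = sum(m * m + n * n for m, n in A)   # "informational energy" of A
    TB = sum(m * m + n * n for m, n in B)
    return C / math.sqrt(TA * TB)
```

A set correlated with itself yields 1.0, and the coefficient feeds naturally into a TOPSIS-style relative-closeness computation, where alternatives are ranked by closeness to the positive ideal and distance from the negative ideal.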


Fuzzy Systems ◽  
2017 ◽  
pp. 1110-1149
Author(s):  
John Robinson P. ◽  
Henry Amirtharaj E. C.



2005 ◽  
Vol 13 (5) ◽  
pp. 525-527
Author(s):  
Cheng Yiyun ◽  
Cui Ronghui ◽  
He Pingsheng

This study presents a new method of preparing Mg(OH)2/epoxy resin nanocomposites. An epoxy resin micro-emulsion is taken as a micro-reactor for the formation of Mg(OH)2 nano-crystals. After the reaction, the collected epoxy proved to be a composite with embedded nano-Mg(OH)2. Transmission electron microscopy (TEM) indicated that the Mg(OH)2 nano-crystals were dispersed uniformly in the cured epoxy resin matrix.


Energies ◽  
2020 ◽  
Vol 13 (3) ◽  
pp. 668
Author(s):  
Jie Jian ◽  
Lide Wang ◽  
Huang Chen ◽  
Xiaobo Nie

The time-triggered communication paradigm is a cost-efficient way to meet the real-time requirements of cyber-physical systems. Scheduling such communication is an NP-complete problem for multi-hop networks and non-strictly periodic traffic. A two-level scheduling approach is proposed to reduce the complexity of the optimization. In the first level, a fuzzy-controlled quantum-behaved particle swarm optimization (FQPSO) algorithm is proposed to optimize the scheduling performance by assigning time-triggered frame instances to the basic periods of each link. To prevent the population from aggregating too tightly, a random mutation mechanism disturbs particles at the aggregation point and enhances diversity in later stages. Fuzzy logic is introduced and carefully designed to realize dynamic adaptive adjustment of the contraction–expansion coefficient and the mutation rate in FQPSO. In the second level, we use an improved Satisfiability Modulo Theories (SMT) scheduling algorithm to solve the collision-free and temporal constraints. A schedulability ranking method is proposed to accelerate the computation of the SMT-based incremental scheduler. Our approach can co-optimize the jitter and load balance of communication for an off-line schedule. The experiments show that the proposed approach can improve the performance of the scheduling table, reduce the optimization time, and reserve space for incremental messages.
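The core QPSO position update that the contraction-expansion coefficient controls can be sketched as below. Here the fuzzy controller of the paper is replaced by a fixed coefficient `beta`, and all variable names are illustrative.

```python
import math
import random

random.seed(1)  # reproducible for the sketch

def qpso_step(x, pbest, gbest, mbest, beta=0.75):
    """One quantum-behaved PSO update of a scalar particle position.

    x     : current position        pbest : particle's best position
    gbest : swarm's best position   mbest : mean of all pbests
    beta  : contraction-expansion coefficient (fuzzy-tuned in the paper)
    """
    phi = random.random()
    p = phi * pbest + (1 - phi) * gbest          # stochastic local attractor
    u = random.random()
    delta = beta * abs(mbest - x) * math.log(1.0 / u)
    return p + delta if random.random() < 0.5 else p - delta
```

Larger `beta` widens the sampling region around the attractor (exploration); shrinking it over iterations contracts the swarm (exploitation), which is exactly the trade-off the paper's fuzzy logic adjusts dynamically, together with the mutation rate.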


2014 ◽  
Vol 644-650 ◽  
pp. 1429-1432
Author(s):  
Tian Yu Sun ◽  
Yan Bo Liu ◽  
Tian Su Wei ◽  
Fu Cheng Yin

There is no calibration specification or verification regulation for the milliampere-second meter [1][2], a very important instrument for preventive maintenance and calibration of X-ray machines that is used to measure the product of tube current and exposure time. To solve the problem that the traditional method cannot provide the same standard value at different times, this paper presents a method for calibrating the milliampere-second meter using a signal generator and a digital meter. Experiments show that the new method has good repeatability and stability.


2011 ◽  
Vol 19 (2) ◽  
Author(s):  
A. Roy ◽  
S. Mitra ◽  
R. Agrawal

Abstract: Image manipulation has been practiced for centuries. Manipulated images are intended to alter facts: facts of ethics, morality, politics, sex, celebrity or chaos. Image forensic science is used to detect these manipulations in a digital image. There are several standard ways to analyze an image for manipulation, each with its limitations. Moreover, methods have rarely tried to capitalize on the way the image was taken by the camera. We propose a new method based on light and its shade, as light and shade are the fundamental input resources that may carry all the information of the image. The proposed method measures the direction of the light source and uses this light-based technique to identify any intentional partial manipulation in the digital image. The method is tested on known manipulated images and correctly identifies the light sources. The light source of an image is measured in terms of angle. The experimental results show the robustness of the methodology.
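A drastically simplified version of the idea, estimating an apparent illumination angle from the dominant intensity-gradient direction of a patch, is sketched below. This is not the authors' algorithm; the patch data and the gradient heuristic are illustrative assumptions only, but they show how an angle can be recovered from shading and then compared across image regions to flag a splice.

```python
import math

def light_angle(patch):
    """Return the dominant intensity-gradient direction (degrees) of a
    2-D grid of brightness values; inconsistent angles between regions
    of one image hint at partial manipulation."""
    gx = gy = 0.0
    for r in range(len(patch) - 1):
        for c in range(len(patch[0]) - 1):
            gx += patch[r][c + 1] - patch[r][c]  # horizontal gradient
            gy += patch[r + 1][c] - patch[r][c]  # vertical gradient
    return math.degrees(math.atan2(gy, gx))
```

A patch that brightens uniformly from left to right yields an angle of 0 degrees; two regions of the same photograph lit by the same source should report similar angles.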

