DERIVING SYSTEM COMPLEXITY METRIC FROM EVENTS AND ITS VALIDATION

Author(s):  
SANGEETA SABHARWAL ◽  
SANDEEP K. SINGH ◽  
J. P. GUPTA

The event-based paradigm has gathered momentum, as witnessed by current efforts in areas ranging from event-driven architectures, complex event processing, and business process management and modeling to grid computing, web service notifications, event stream processing and message-oriented middleware. The increasing popularity of event-based systems has raised new and challenging issues, one of which is measuring the complexity of these systems. A well-developed system should be maintainable, pluggable, scalable and of low complexity. In this paper, an event-based approach is proposed to derive software metrics for measuring system complexity. Events taking place in a system are documented using the proposed event template, and an event-flow model is constructed from the event templates. The event-flow model of an event-based system is represented as an event-flow graph, from which the proposed event-flow complexity metric for the analysis model is derived. The metric has also been evaluated against Weyuker's properties; the results show that it satisfies 8 of the 9 properties. A prototype tool has been developed to automatically generate event interdependency matrices and compute the absolute and relative complexity of an entire system. The proposed technique can be especially effective for real-time systems, where many events take place.
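The abstract does not give the metric's formula, but the idea of deriving a graph-based complexity from an event interdependency matrix can be sketched. The following is an illustrative, McCabe-style measure on an event-flow graph, assuming an adjacency matrix in which entry [i][j] = 1 means event i can trigger event j; the function names and the normalization used for "relative complexity" are this sketch's assumptions, not the paper's definitions.

```python
# Hypothetical sketch: a cyclomatic-style complexity derived from an
# event interdependency matrix, where matrix[i][j] = 1 means event i
# can trigger event j (an edge in the event-flow graph).

def event_flow_complexity(matrix):
    n = len(matrix)                          # number of events (nodes)
    edges = sum(sum(row) for row in matrix)  # event interdependencies
    # McCabe-style measure on the event-flow graph: E - N + 2
    return edges - n + 2

def relative_complexity(matrix):
    """Absolute complexity normalized by the number of events."""
    return event_flow_complexity(matrix) / len(matrix)

# Three events: e0 triggers e1 and e2; e1 triggers e2.
m = [[0, 1, 1],
     [0, 0, 1],
     [0, 0, 0]]
print(event_flow_complexity(m))  # 3 edges - 3 nodes + 2 = 2
```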

2012 ◽  
Vol 32 (1) ◽  
pp. 21-32 ◽  
Author(s):  
Tiago Barbosa ◽  
Mário Costa ◽  
Jorge Morais ◽  
Marc Moreira ◽  
António Silva ◽  
...  

How Informative are the Vertical Buoyancy and the Prone Gliding Tests to Assess Young Swimmers' Hydrostatic and Hydrodynamic Profiles? The aim of this research was to develop a path-flow analysis model to highlight the relationships between buoyancy and prone gliding tests and selected anthropometrical and biomechanical variables. Thirty-eight young male swimmers (12.97 ± 1.05 years old) of several competitive levels were evaluated. Body mass, height, fat mass, body surface area, vertical buoyancy, prone gliding after wall push-off, stroke length, stroke frequency and velocity after a maximal 25 [m] swim were assessed. The confirmatory model included body mass, height, fat mass, the prone gliding test, stroke length, stroke frequency and velocity. All theoretical paths were verified except for the vertical buoyancy test, which presented no relationship with the anthropometrical and biomechanical variables or with the prone gliding test. The goodness-of-fit of the confirmatory path-flow model, assessed with the standardized root mean square residual (SRMR), is close to the cut-off value but still does not support the theory (SRMR = 0.11). In conclusion, the vertical buoyancy and prone gliding tests are not the best techniques to assess a swimmer's hydrostatic and hydrodynamic profile, respectively.


2020 ◽  
Vol 2020 ◽  
pp. 1-20
Author(s):  
Cheng Xu ◽  
Hengjie Luo ◽  
Hong Bao ◽  
Pengfei Wang

The Internet of Vehicles (IoV) is an important artificial intelligence research field for intelligent transportation applications. Complex event interactions are an important method for data flow processing in a Vehicle to Everything (V2X) environment. Unlike classic Internet of Things (IoT) systems, data streams in V2X include both temporal and spatial information. Thus, effectively expressing and processing spatiotemporal data interactions in the IoV is an urgent problem. To solve it, we propose a spatiotemporal event interaction model (STEIM). STEIM uses a time period and a raster map as its temporal model and spatial model, respectively. In this paper, we first provide a spatiotemporal operator and a complete STEIM grammar that effectively expresses the spatiotemporal information of the event flow in the V2X environment. Second, we describe the design of STEIM's operational semantics in terms of formal semantics. In addition, we provide a spatiotemporal event-stream processing algorithm based on the Petri net model. STEIM establishes a mechanism for temporal and spatial processing of V2X event streams. Finally, the effectiveness of the STEIM-based system is demonstrated experimentally.
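The core idea of pairing a time period with a raster-map cell can be sketched independently of the paper's grammar. The following is an illustrative sketch, not STEIM itself: the event fields, the adjacency rule, and the operator name `st_interacts` are all assumptions made for this example.

```python
# Illustrative sketch (not the paper's STEIM grammar): a spatiotemporal
# event carries a time period and a raster-map cell, and a simple
# operator tests whether two events interact in both time and space.

from dataclasses import dataclass

@dataclass
class STEvent:
    t_start: float        # start of the time period
    t_end: float          # end of the time period
    cell: tuple           # (row, col) on the raster map

def temporally_overlaps(a, b):
    return a.t_start <= b.t_end and b.t_start <= a.t_end

def spatially_adjacent(a, b):
    # same or neighbouring raster cell (Chebyshev distance <= 1)
    return max(abs(a.cell[0] - b.cell[0]),
               abs(a.cell[1] - b.cell[1])) <= 1

def st_interacts(a, b):
    """Hypothetical spatiotemporal operator: overlap in time AND space."""
    return temporally_overlaps(a, b) and spatially_adjacent(a, b)

brake = STEvent(0.0, 2.0, (5, 5))
warn = STEvent(1.5, 3.0, (5, 6))
print(st_interacts(brake, warn))  # True: periods overlap, cells adjacent
```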


Author(s):  
G. Geetha Priya ◽  
B. Narendra Kumar Rao

GUI testing is the process of testing a software application through its graphical user interface. A GUI is an interface between the user and the software that provides an easy way to interact with the system, and it plays an important role in software engineering. Testing a GUI-based application requires test cases consisting of sequences of user actions/events to be executed. Selenium is an open-source tool for testing GUI applications by executing test cases to check whether the application is working properly. In this paper, we present a test script repair technique. The technique uses a reverse engineering process to create the test script and consists of three stages: ripping, mapping and repairing. In the ripping stage, two relationships are used to represent the event interactions of the GUI application, and the location of each widget is recorded. In the mapping stage, the original GUI events are mapped to an event-flow graph (EFG). In the repairing stage, the EFG, repairing transformations and human input are used to modify the script, synthesizing a new "repaired" test script. During this process, the technique uses GUI objects to yield a final test script that can be executed with Selenium to validate the GUI application. An experiment using Selenium suggests that the technique is effective: unusable test scripts are repaired, and annotations significantly reduce the human cost of repairing test cases.
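The repairing stage can be illustrated with a deliberately simplified sketch. An EFG maps each GUI event to the events that may legally follow it; here a broken script is repaired by dropping events that no longer lie on a legal edge. This is only the skeleton of the idea, with an invented EFG and event names; the paper's actual technique also applies repairing transformations and human input.

```python
# Simplified sketch of EFG-based test repair: the event-flow graph maps
# each GUI event to the set of events that may legally follow it. A
# broken script is "repaired" here by dropping any event that does not
# follow a legal edge from the previously kept event.

efg = {
    "open_dialog": {"type_name", "cancel"},
    "type_name":   {"save", "cancel"},
    "save":        set(),
    "cancel":      set(),
}

def repair(script, efg):
    repaired = [script[0]]
    for event in script[1:]:
        if event in efg.get(repaired[-1], set()):
            repaired.append(event)  # legal transition: keep the event
        # else: the event is unreachable from the previous one -> drop it
    return repaired

# "old_button" no longer exists in the GUI, so it has no edge in the EFG.
broken = ["open_dialog", "old_button", "type_name", "save"]
print(repair(broken, efg))  # ['open_dialog', 'type_name', 'save']
```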


Author(s):  
Andrew D. Christian ◽  
Kamala J. Grasso ◽  
Warren P. Seering

Abstract This paper describes results obtained from an information-flow model of the design process. The model represents design projects as tasks to be accomplished and information to be exchanged. An event-based simulation uses computer agents to represent engineers working on the design project within a virtual office environment, exchanging information, and making decisions. The first of two validation studies presented in this paper uses corporate records and timesheet data to construct a model and compare simulation results with actual project results. The second study evaluates the ability of a manager unfamiliar with the software to set up a model of a design project and to gain useful insights by exercising the model.


Author(s):  
Owen Molloy ◽  
Claire Sheridan

Process performance improvement initiatives can be significantly enhanced, in terms of performance measurement and diagnosis, by real-time performance, quality and traceability information. Currently available Business Intelligence (BI) and Business Process Management (BPM) systems struggle to provide sufficiently lightweight or flexible solutions for the needs of process improvement projects. In addition, current process modelling languages such as XPDL and BPEL provide little or no support for the inclusion of detailed process performance metrics. This paper describes a generic framework using event-based process modelling to support the definition and inclusion of performance metrics and targets within process models, and the calculation of process performance metrics at user-defined intervals. The iWise implementation of this framework is an XML and Web services-based infrastructure that uses this event-based model to enhance process visibility through real-time process metrics. Users can adjust alert thresholds on key process metrics in real time, and iWise evaluates events for outlier or out-of-bounds values as they are processed. It uses an integrated rules engine, leveraging semantic technologies, to write rules that are evaluated as process-related events occur in real time.
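The threshold-checking behaviour described above can be sketched as a small filter over incoming process events. This is a hypothetical illustration, not iWise's API: the event fields, the cycle-time metric, and the function name are all invented for the example.

```python
# Hypothetical sketch of a real-time threshold check on a process
# metric: as process events arrive, a metric (here, cycle time) is
# computed and compared against a user-adjustable alert threshold,
# flagging out-of-bounds events as they are processed.

def check_events(events, threshold):
    """Return (id, cycle_time) for events exceeding the threshold."""
    alerts = []
    for event in events:
        cycle_time = event["end"] - event["start"]
        if cycle_time > threshold:
            alerts.append((event["id"], cycle_time))
    return alerts

events = [
    {"id": "case-1", "start": 0.0, "end": 4.0},
    {"id": "case-2", "start": 1.0, "end": 9.5},  # out of bounds
]
print(check_events(events, threshold=5.0))  # [('case-2', 8.5)]
```

Lowering the threshold at run time simply means calling the check with the new value on subsequent event batches.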


Author(s):  
Andrea Antenucci ◽  
Giovanni Sansavini

In this article, adequacy and security assessments of the coupled operations of electric and gas networks are performed. Extreme operating conditions and component faults are considered as events that can impact the interdependent systems. The electric and gas networks are represented by an event-based direct current power flow model and by a transient one-dimensional mass flow model, respectively. Furthermore, the automations and safety strategies enforced by transmission system operators are represented within an original modelling approach. A quantitative analysis is performed with reference to the simplified energy infrastructures of Great Britain. Results highlight the contingencies that can jeopardize security, identify the components that are prone to fail and induce large gas pressure instabilities and loss of supply, and locate the points in the gas grid that are susceptible to pressure violations. Moreover, a simulated 30% increase over the 2015 peak gas demand is found to be a limit for safe operation of the gas network, although the coupled systems are robust enough to avoid the spread of a cascading failure across networks. These results help prevent critical operating conditions induced by the interaction between networks and can guide safety-based decisions on system reinforcements and the development of mitigating actions.
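The electric-side model referred to above is a standard DC power flow. A minimal sketch of that technique on a toy 3-bus network follows; the line data and injections are invented for illustration and are not taken from the Great Britain case study. With bus 0 as the slack bus, the reduced linear system B'·θ = P is solved for the remaining bus angles, and the flow on line (i, j) is b_ij·(θ_i − θ_j).

```python
# Minimal DC power-flow sketch on an invented 3-bus network.
lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 10.0)]  # (from, to, susceptance)
P1, P2 = -1.0, 0.5  # net injections at buses 1 and 2 (bus 0 is slack)

# Reduced susceptance matrix for buses 1 and 2:
B11 = B22 = 20.0    # sum of susceptances incident to each bus
B12 = -10.0         # minus the susceptance of line 1-2

# Solve the 2x2 system B' * theta = P by Cramer's rule.
det = B11 * B22 - B12 * B12
t1 = (P1 * B22 - B12 * P2) / det
t2 = (B11 * P2 - P1 * B12) / det
theta = [0.0, t1, t2]   # slack-bus angle fixed at 0

# Flow on each line: b_ij * (theta_i - theta_j).
flows = {(i, j): b * (theta[i] - theta[j]) for i, j, b in lines}
print(flows)  # 0.5 p.u. flows from bus 0 toward the load at bus 1
```

Contingency screening then amounts to removing a line, re-solving, and checking the resulting flows against thermal limits.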

