Empirical Evaluation of Test Driven Modeling

Author(s):  
Stefan Zugal ◽  
Cornelia Haisjackl ◽  
Jakob Pinggera ◽  
Barbara Weber

Declarative approaches to process modeling are regarded as well suited for highly volatile environments, as they provide a high degree of flexibility. However, problems in understanding and maintaining declarative process models impede their usage. To compensate for these shortcomings, Test Driven Modeling (TDM) has been proposed. This paper reports on an empirical investigation in which TDM is viewed from two different angles. First, the impact of TDM on communication is explored in a case study. Results indicate that domain experts are inclined to use test cases for communicating with the model builder (system analyst) and prefer them over the process model. The second part of the investigation, a controlled experiment, investigates the impact of TDM on process model maintenance. Data gathered in this experiment indicate that the adoption of test cases significantly lowers cognitive load and increases the perceived quality of changes.
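
The abstract does not show what a TDM test case looks like in practice; the sketch below is only a hypothetical illustration of the idea, in which a test case pairs an execution trace with an expected validity verdict checked against a declarative constraint (the `response` constraint, the `TestCase` class and the claim-handling example are assumptions, not the authors' artifacts).

```python
# Minimal sketch of a TDM-style test case for a declarative process model.
# All names and constraints are hypothetical illustrations.

def response(trigger, target):
    """Constraint: if `trigger` occurs, `target` must occur afterwards."""
    def check(trace):
        for i, activity in enumerate(trace):
            if activity == trigger and target not in trace[i + 1:]:
                return False
        return True
    return check

class TestCase:
    def __init__(self, trace, should_be_valid):
        self.trace = trace                    # ordered list of executed activities
        self.should_be_valid = should_be_valid

    def run(self, constraints):
        valid = all(check(self.trace) for check in constraints)
        return valid == self.should_be_valid  # does the model meet the expectation?

# Declarative model: after "submit claim", "assess claim" must eventually follow.
model = [response("submit claim", "assess claim")]

tests = [
    TestCase(["submit claim", "assess claim", "pay out"], should_be_valid=True),
    TestCase(["submit claim", "pay out"], should_be_valid=False),
]
print([t.run(model) for t in tests])  # [True, True] -> both expectations hold
```

Under this reading, a domain expert can discuss concrete traces and expected outcomes with the system analyst instead of interpreting the constraint model directly.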

Author(s):  
Maysaa hasan muflih BaniHani

The purpose of this study was to investigate the effectiveness of female administration in administrative empowerment at Hail University branches and its impediments, from the point of view of the faculty members. The researcher used the descriptive analytical approach, with a questionnaire as the study tool; the study sample consisted of (53) female faculty members at Hail University branches during the first semester of 2018-2019. The results showed that the overall degree of administrative empowerment obtained a general average of (3.93), i.e. a high degree. At the dimensional level, the work team dimension ranked first with an average of (4.09), followed by delegation of authority with an average of (3.89). Communication came third with an average of (3.88) and, in the last rank, staff motivation with an average of (3.86), all rated as (high). As regards the obstacles to women's administration, administrative impediments obtained the first rank with an average of (3.87), followed by personal constraints with an average of (3.76), then political impediments with an average of (3.47), and finally social obstacles with an average of (2.61). Furthermore, the study showed no significant differences due to years of experience or scientific qualification. Based on these results, the study recommends increasing the effectiveness of faculty members and removing the obstacles facing them.


Author(s):  
Janina Fengel

Business process modeling has become an accepted means for designing and describing business operations. However, due to dissimilar use of modeling languages and, even more importantly, of natural language for labeling model elements, models can differ. As a result, comparing them is a non-trivial task that presently has to be performed manually. One of the major challenges is the alignment of the business semantics contained in the models, which is an indispensable prerequisite for structural comparisons. To ease this workload, the authors present a novel approach for aligning business process models semantically in an automated manner. Semantic matching is enabled through a combination of ontology matching and information linguistics processing techniques. This provides a heuristic to support domain experts in identifying similarities or discrepancies.
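
The paper's matching techniques are not detailed in the abstract; as a rough, hypothetical illustration of label-based alignment, the sketch below scores pairs of element labels with a simple token-overlap and string-similarity heuristic (the measure, weights and threshold are stand-ins for the ontology matching and information linguistics techniques the authors actually combine).

```python
# Illustrative sketch of label-based matching between two process models.
# Similarity measure and threshold are simplified stand-ins, not the paper's method.
from difflib import SequenceMatcher

def label_similarity(a: str, b: str) -> float:
    """Combine token overlap with character-level similarity."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    token_score = len(ta & tb) / len(ta | tb) if ta | tb else 0.0
    char_score = SequenceMatcher(None, a.lower(), b.lower()).ratio()
    return 0.5 * token_score + 0.5 * char_score

def align(model_a, model_b, threshold=0.45):
    """Propose candidate label correspondences above a similarity threshold."""
    pairs = []
    for a in model_a:
        for b in model_b:
            score = label_similarity(a, b)
            if score >= threshold:
                pairs.append((a, b, round(score, 2)))
    return pairs

labels_a = ["Check invoice", "Approve payment", "Archive document"]
labels_b = ["Verify invoice", "Authorize payment", "File document"]
print(align(labels_a, labels_b))  # candidate correspondences for expert review
```

The output is a shortlist of correspondences; as in the paper's approach, the final judgment on similarities or discrepancies stays with the domain expert.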


2010 ◽  
Vol 21 (4) ◽  
pp. 14-34 ◽  
Author(s):  
Rong Liu ◽  
Frederick Y. Wu ◽  
Santhosh Kumaran

Much of the prior work in business process modeling is activity-centric. Recently, an information-centric approach has emerged, where a business process is modeled as the interacting lifecycles of business entities. The benefits of this approach are documented in a number of case studies. In this paper, the authors formalize the information-centric approach and derive the relationships between the two approaches. The authors formally define the notion of a business entity, provide an algorithm to transform an activity-centric model into an information-centric process model, and demonstrate the equivalence between these two models. Further, they show the value of transforming from the activity-centric paradigm to the information-centric paradigm in business process componentization and Service-Oriented Architecture design and also provide an empirical evaluation.
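
The formal definitions and the transformation algorithm appear in the paper itself; the following is only a hypothetical sketch of the information-centric view, representing a business entity as a lifecycle of states whose transitions are triggered by activities taken from an activity-centric model (the entity, states and activities are invented for illustration).

```python
# Hypothetical sketch of the information-centric view: a business entity
# (e.g., an Order) modeled as a lifecycle of states, with activities from an
# activity-centric model attached to the transitions they cause.
from dataclasses import dataclass, field

@dataclass
class BusinessEntity:
    name: str
    states: set = field(default_factory=set)
    transitions: dict = field(default_factory=dict)  # (state, activity) -> next state

    def add_transition(self, source, activity, target):
        self.states.update({source, target})
        self.transitions[(source, activity)] = target

    def replay(self, start, activities):
        """Replay an activity sequence from the activity-centric model
        against the entity lifecycle."""
        state = start
        for act in activities:
            state = self.transitions.get((state, act), state)
        return state

order = BusinessEntity("Order")
order.add_transition("created", "check stock", "checked")
order.add_transition("checked", "ship order", "shipped")
order.add_transition("shipped", "send invoice", "invoiced")

print(order.replay("created", ["check stock", "ship order", "send invoice"]))  # invoiced
```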


2021 ◽  
Author(s):  
Anna Torrens-Burton ◽  
Silvia Goss ◽  
Eileen Sutton ◽  
Kali Barawi ◽  
Mirella Longo ◽  
...  

Abstract The COVID-19 pandemic has been a devastating, mass bereavement event characterised by sudden unexpected deaths and high levels of disruption to end-of-life, grieving and coping processes, as well as to social life more broadly. We analysed qualitative free-text data from two independent UK-wide online surveys to describe in depth the experiences of 881 people bereaved during the COVID-19 pandemic. We analysed the data in two phases, conducting an inductive thematic analysis and then applying Stroebe and Schut's Dual Process Model (DPM) (1999; 2010) as an analytic lens to further contextualise and interpret the data. The DPM identifies loss-oriented and restoration-oriented coping processes between which grieving people naturally oscillate. Loss-oriented coping involves coming to terms with the death and lost relationship, while restoration-oriented coping involves adapting to new ways of life. We identified six main themes: troubled deaths (guilt, anger and unanswered questions); mourning, memorialisation and death administration; mass bereavement, the media and the ongoing threat of the pandemic; grieving and coping (alone and with others); work and employment; and support from the health and social care system. Examples of loss-oriented stressors included being unable to visit or say goodbye, the sudden and traumatic nature of many deaths, and restricted funeral and memorialisation practices. Associated reactions were feelings of guilt and anger, and problems accepting the death and starting to grieve. Examples of restoration-oriented stressors and reactions were stressful death-related administration and severely curtailed social networks, support systems and social/recreational activities, which impacted people's ability to cope. Study results demonstrate the exceptionally difficult sets of experiences associated with pandemic bereavement, and the utility of the DPM for conceptualising these additional challenges and their impacts on grieving. Our analysis builds and expands on previous use of the DPM (Stroebe and Schut, 2021) in explicating the impact of the pandemic on bereavement. We make recommendations for statutory, private and third sector organisations for improving the experiences of people bereaved during and following this and future pandemics.


Author(s):  
Mourad Badri ◽  
Aymen Kout ◽  
Linda Badri

This paper aims at empirically investigating the effect of aspect-oriented (AO) refactoring on the unit testability of classes in object-oriented software. The unit testability of classes has been addressed from the perspective of the unit testing effort, and particularly from the perspective of constructing unit test cases (TCs). We investigated three research questions: (1) the impact of AO refactoring on source code attributes (size, complexity, coupling, cohesion and inheritance), attributes that are closely related to the unit testability of classes; (2) the impact of AO refactoring on unit test code attributes (size, assertions, invocations and data creation), attributes that are indicators of the effort involved in writing the code of unit TCs; and (3) the relationships between the variations observed after AO refactoring in source code and unit test code attributes. We used different techniques in the study: correlation analysis, statistical tests and linear regression. We performed an empirical evaluation using data collected from three well-known open source (Java) software systems (JHOTDRAW, HSQLDB and PETSTORE) that had been refactored using AO programming (AspectJ). Results suggest that: (1) overall, the effort involved in constructing unit TCs of refactored classes has been reduced; (2) the variations of source code attributes have the most impact on method invocations in unit TCs; and (3) the variations of unit test code attributes are more influenced by the variation in the complexity of refactored classes than by the other class attributes.
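
As a hypothetical illustration of the kind of correlation analysis described, the sketch below computes a Pearson coefficient between per-class variations of a source code attribute and a unit test code attribute after refactoring (the data are invented; the study's actual measurements come from the three refactored systems).

```python
# Sketch of the correlation analysis between metric variations, using
# hypothetical per-class deltas (negative = reduction after AO refactoring).
from statistics import mean

def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

delta_complexity  = [-4, -2, -6, -1, -3]   # source code attribute variation
delta_invocations = [-5, -1, -7, -2, -3]   # unit test code attribute variation

print(round(pearson(delta_complexity, delta_invocations), 2))
```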


2021 ◽  
Author(s):  
Tom Mooney ◽  
Kelda Bratley ◽  
Amin Amin ◽  
Timothy Jadot

Abstract The use of conventional process simulators is commonplace for system design and is growing in use for online monitoring and optimization applications. While these simulators are extremely useful, additional value can be extracted by combining simulator predictions with field inputs from measurement devices such as flowmeters and pressure and temperature sensors. The statistical nature of the inputs (e.g., measurement uncertainty) is typically not considered in the forward calculations performed by the simulators, which may lead to erroneous results if the actual raw measurement is in error or biased. A complementary modeling methodology is proposed to identify and correct measurement and process errors as an integral part of a robust simulation practice. The studied approach ensures best-quality data for direct use in process models and simulators for operations and process surveillance. From a design perspective, this approach also makes it possible to evaluate the impact of uncertainty of measured and unmeasured variables on CAPEX spend and to optimize instrument / meter design. In this work, an extended statistical approach to process simulation is examined using Data Validation and Reconciliation (DVR). The DVR methodology is compared to conventional, non-statistical, deterministic process simulators. A key difference is that DVR uses any measured variable (inlet, outlet, or intermediate measurements), including its uncertainty, in the modelled process as an input, whereas traditional simulators use only inlet measurement values to estimate the values of all other measured and unmeasured variables. A walk-through of the DVR calculations and applications is done using several comparative case studies of a typical surface process facility. Examples are the simulation of a commingled multistage oil and gas separation process, the validation of separator flowmeters and fluid samples, and the quantification of unmeasured variables along with their uncertainties. The studies demonstrate the added value of using redundancy from all available measurements in a process model based on the DVR method. Single-point and data-streaming field cases highlight the dependent and complementary roles of traditional simulators and the data validation provided by the DVR methodology; it is shown how robust measurement management strategies can be developed based on DVR's effective surveillance capabilities. Moreover, the cases demonstrate how DVR-based CAPEX and OPEX improvements are derived from effective hardware selection using cost versus measurement precision trade-offs, from soft-measurement substitutes, and from condition-based maintenance strategies.
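
The abstract does not reproduce the reconciliation mathematics; the sketch below illustrates, under simplifying assumptions, the weighted least-squares adjustment that underlies DVR for a single linear mass-balance constraint F1 + F2 - F3 = 0 (the stream values, uncertainties and the single-constraint closed form are illustrative, not the tool or case studies used in the paper).

```python
# Minimal sketch of data validation and reconciliation (DVR) for one
# mass-balance constraint F1 + F2 - F3 = 0, via weighted least squares.
# Measurement values and uncertainties are hypothetical.

def reconcile(measured, sigmas, coeffs):
    """Adjust measurements so that sum(coeffs[i] * x[i]) = 0, minimizing
    sum(((x_i - m_i) / sigma_i)^2). Closed form for one linear constraint."""
    variances = [s ** 2 for s in sigmas]
    residual = sum(c * m for c, m in zip(coeffs, measured))   # constraint violation
    s = sum(c ** 2 * v for c, v in zip(coeffs, variances))
    return [m - v * c * residual / s
            for m, v, c in zip(measured, variances, coeffs)]

# Separator streams (t/h): inlet 1, inlet 2, outlet; the outlet meter is least precise,
# so it absorbs most of the correction.
measured = [100.0, 52.0, 155.0]
sigmas   = [1.0, 1.0, 3.0]
coeffs   = [1.0, 1.0, -1.0]

reconciled = reconcile(measured, sigmas, coeffs)
print([round(x, 2) for x in reconciled])                             # adjusted values
print(round(sum(c * x for c, x in zip(coeffs, reconciled)), 6))      # residual ~ 0.0
```

The same weighting logic is what lets a DVR-style analysis flag a biased meter (a large adjustment relative to its stated uncertainty) and quantify unmeasured variables from measurement redundancy.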


1993 ◽  
Vol 18 (1) ◽  
pp. 5-21 ◽  
Author(s):  
Norris Krueger

Shapero (1975, 1982) proposed an intentionality-based process model of the entrepreneurial event. Entrepreneurial intentions should derive from perceptions of feasibility and desirability plus a propensity to act on opportunities. Prior entrepreneurship-related experiences should influence entrepreneurial intentions indirectly through these perceptions. Path analyses found that feasibility and desirability perceptions and the propensity to act each proved to be significant antecedents of entrepreneurial intentions. Perceived feasibility was significantly associated with the breadth of prior exposure; perceived desirability was significantly associated with the positiveness of that prior exposure. Strong support was found for Shapero's model, arguing for further application of intentions-based process models of entrepreneurial activity.


2011 ◽  
Vol 22 (3) ◽  
pp. 1-23 ◽  
Author(s):  
Pnina Soffer ◽  
Maya Kaner

This paper investigates the need to complement automated verification of business process models with a validity analysis performed by human analysts. As business processes become increasingly automated through process-aware information systems, the quality of process design becomes crucial. Although verification of process models has gained much attention, their validation, which relates to the reachability of the process goal, has hardly been addressed. The paper investigates the need for model validation both theoretically and empirically. The authors present a theoretical analysis, relating to different aspects of process model quality, which shows that process model verification and validation are complementary in nature, and an empirical evaluation of the effectiveness of validity criteria in validating a process model. The empirical findings corroborate the effectiveness of the validity criteria and indicate that a systematic, criteria-supported validity analysis improves the identification of validity problems in process models.
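
As a simplified, hypothetical analogue of the validity notion discussed (reachability of the process goal), the sketch below checks whether a goal state can be reached from the initial state of a small process state graph (the states, transitions and goal predicate are invented for illustration and do not reflect the paper's validity criteria).

```python
# Illustrative sketch: can the process goal be reached from the initial state?
# A toy analogue of goal reachability; real validation also weighs business
# semantics that automated checks cannot judge.
from collections import deque

def goal_reachable(transitions, start, goal_predicate):
    """Breadth-first search over the process state space."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if goal_predicate(state):
            return True
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

transitions = {
    "order received": ["stock checked"],
    "stock checked": ["order rejected", "order shipped"],
    "order shipped": ["invoice sent"],
}
print(goal_reachable(transitions, "order received", lambda s: s == "invoice sent"))  # True
```

Such a check can complement verification, but, as the paper argues, judging whether reaching that state actually constitutes the business goal remains a task for human analysts guided by validity criteria.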


10.2196/15374 ◽  
2020 ◽  
Vol 8 (1) ◽  
pp. e15374 ◽  
Author(s):  
Michael Winter ◽  
Rüdiger Pryss ◽  
Thomas Probst ◽  
Manfred Reichert

Background The management and comprehension of business process models are of utmost importance for almost any enterprise. To foster the comprehension of such models, this paper incorporates the idea of a serious game called Tales of Knightly Process. Objective This study aimed to investigate whether the serious game has a positive immediate and follow-up impact on process model comprehension. Methods Two studies, with 81 and 64 participants respectively, were conducted. Within the two studies, participants were assigned to a game group and a control group (ie, study 1), and a follow-up game group and a follow-up control group (ie, study 2). Four weeks separated study 1 and study 2. In both studies, participants had to answer ten comprehension questions on five different process models. Note that, in study 1, participants in the game group played the serious game before answering the comprehension questions, in order to evaluate the impact of the game on process model comprehension. Results In study 1, inferential statistics (analysis of variance) revealed that participants in the game group showed better immediate performance than control group participants (P<.001). A Hedges g of 0.77 also indicated a medium to large effect size. In study 2, follow-up game group participants showed better performance than participants from the follow-up control group (P=.01); here, a Hedges g of 0.82 implied a large effect size. Finally, in both studies, analyses indicated that complex process models are more difficult to comprehend (study 1: P<.001; study 2: P<.001). Conclusions Across both studies, participants who played the serious game showed better performance in the comprehension of process models.
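
The reported effect sizes follow the standard Hedges g formula (Cohen's d with a small-sample correction); the sketch below computes it from hypothetical group summary statistics, not the study's actual data.

```python
# Sketch of the Hedges g effect size, computed from hypothetical group
# summary statistics (means, standard deviations, group sizes).
from math import sqrt

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    pooled_sd = sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd            # Cohen's d
    correction = 1 - 3 / (4 * (n1 + n2) - 9)   # small-sample bias correction
    return d * correction

# Hypothetical comprehension scores: game group vs control group.
print(round(hedges_g(7.8, 1.5, 40, 6.6, 1.6, 41), 2))
```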


2019 ◽  
Author(s):  
William Finnigan ◽  
Rhys Cutlan ◽  
Radka Snajdrova ◽  
Joseph P. Adams ◽  
Jennifer A. Littlechild ◽  
...  

Abstract Multi-step enzyme reactions offer considerable cost and productivity benefits. Process models offer a route to understanding the complexity of these reactions, and allow for their optimization. Despite the increasing prevalence of multi-step biotransformations, there are few examples of process models for enzyme reactions. From a toolbox of characterized enzyme parts, we demonstrate the construction of a process model for a seven-enzyme, three-step biotransformation using isolated enzymes. Enzymes for cofactor regeneration were employed to make this in vitro reaction economical. Good modelling practice was critical in evaluating the impact of approximations and experimental error. We show that the use and validation of process models was instrumental in realizing and removing process bottlenecks, identifying divergent behavior, and optimizing the entire reaction using a genetic algorithm. We validated the optimized reaction to demonstrate that complex multi-step reactions with cofactor recycling involving at least seven enzymes can be reliably modelled and optimized. Significance statement This study examines the challenge of modeling and optimizing multi-enzyme cascades. We detail the development, testing and optimization of a deterministic model of a three-enzyme cascade with four cofactor regeneration enzymes. Significantly, the model could be easily used to predict the optimal concentrations of each enzyme in order to achieve maximum flux through the cascade. This prediction was strongly validated experimentally. The success of our model demonstrates that robust models of systems of at least seven enzymes are readily achievable. We highlight the importance of following good modeling practice to evaluate model quality and limitations. Examining deviations from expected behavior provided additional insight into the model and enzymes. This work provides a template for developing larger deterministic models of enzyme cascades.
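
The paper's seven-enzyme model is not reproduced in the abstract; as a minimal, hypothetical sketch of the deterministic modelling approach, the code below integrates a two-step Michaelis-Menten cascade (A -> B -> C) with forward Euler (the rate constants, concentrations and reduced two-step structure are assumptions for illustration, not the authors' model).

```python
# Minimal sketch of a deterministic model of a two-step enzyme cascade
# (A -> B -> C) with Michaelis-Menten kinetics, integrated by forward Euler.
# All parameter values are hypothetical.

def cascade(a0, e1, e2, kcat1=10.0, km1=0.5, kcat2=8.0, km2=1.0,
            dt=0.01, t_end=60.0):
    a, b, c = a0, 0.0, 0.0
    t = 0.0
    while t < t_end:
        v1 = kcat1 * e1 * a / (km1 + a)   # rate of step 1 (A -> B)
        v2 = kcat2 * e2 * b / (km2 + b)   # rate of step 2 (B -> C)
        a += -v1 * dt
        b += (v1 - v2) * dt
        c += v2 * dt
        t += dt
    return a, b, c

# Final concentrations after 60 time units for a given enzyme loading.
print([round(x, 3) for x in cascade(a0=5.0, e1=0.01, e2=0.01)])
```

An optimizer such as a genetic algorithm could then search over the enzyme loadings (here `e1`, `e2`) to maximize product formation, which mirrors the optimization strategy described in the abstract.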

