Business process analysis of programmer job role in software development using process mining

2022 ◽  
Vol 197 ◽  
pp. 701-708
Author(s):  
Rokhman Fauzi ◽  
Rachmadita Andreswari


Author(s):  
Julia Eggers ◽  
Andreas Hein ◽  
Markus Böhm ◽  
Helmut Krcmar

In recent years, process mining has emerged as the leading big data technology for business process analysis. By extracting knowledge from event logs in information systems, process mining provides unprecedented transparency of business processes while being independent of the source system. However, despite its practical relevance, there is still a limited understanding of how organizations act upon the pervasive transparency created by process mining and how they leverage it to benefit from increased process awareness. Addressing this gap, this study conducts a multiple case study to explore how four organizations achieved increased process awareness by using process mining. Drawing on data from 24 semi-structured interviews and archival sources, this study reveals seven sociotechnical mechanisms based on process mining that enable organizations to create either standardized or shared awareness of sub-processes, end-to-end processes, and the firm’s process landscape. Thereby, this study contributes to research on business process management by revealing how process mining facilitates mechanisms that serve as a new, data-driven way of creating process awareness. In addition, the findings indicate that these mechanisms are influenced by the governance approach chosen to conduct process mining, i.e., a top-down or bottom-up driven implementation approach. Lastly, this study also points to the importance of balancing the social complications of increased process transparency and awareness. These results serve as a valuable starting point for practitioners to reflect on measures to increase organizational process awareness through process mining.


2019 ◽  
Vol 9 (16) ◽  
pp. 3313 ◽  
Author(s):  
Wu ◽  
He ◽  
Wang ◽  
Wen ◽  
Yu

Analyzing business processes is key to improving the quality of complaint handling services in a manufacturing company. Process mining is a useful approach for diagnosing problems in complaint handling processes, such as bottlenecks and deviations. However, current business process analysis methodologies based on process mining focus mainly on operational process analysis and neglect analysis at other system levels. In this study, we introduce the Accimap method from the discipline of accident analysis to analyze the diagnostic results of process mining. By creating a complaint handling service process management Accimap model, the process mining results can be analyzed across different system levels. A case study in a large manufacturing company in China is used to verify our approach. In the case study, 42 complaint handling process management factors are identified and the complaint handling process management Accimap model is created. Evaluation against the seven predictions of Rasmussen's risk management framework shows that the Accimap method provides a systematic approach to analyzing process diagnostic results obtained through process mining.
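As an illustration of the kind of bottleneck diagnosis this abstract refers to, the sketch below computes mean transition times between consecutive activities in a toy complaint-handling event log. The log contents, activity names, and field layout are hypothetical, not taken from the study; it is a minimal sketch of the idea, not the authors' method.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical complaint-handling event log: (case_id, activity, timestamp).
event_log = [
    ("c1", "Register complaint", "2021-03-01 09:00"),
    ("c1", "Assign engineer",    "2021-03-01 09:10"),
    ("c1", "Resolve complaint",  "2021-03-03 16:00"),
    ("c2", "Register complaint", "2021-03-02 11:00"),
    ("c2", "Assign engineer",    "2021-03-02 15:30"),
    ("c2", "Resolve complaint",  "2021-03-05 10:00"),
]

def mean_transition_hours(log):
    """Average waiting time (hours) between consecutive activities per case."""
    by_case = defaultdict(list)
    for case, act, ts in log:
        by_case[case].append((act, datetime.strptime(ts, "%Y-%m-%d %H:%M")))

    totals = defaultdict(lambda: [0.0, 0])  # (from, to) -> [sum_hours, count]
    for events in by_case.values():
        events.sort(key=lambda e: e[1])
        for (a, t1), (b, t2) in zip(events, events[1:]):
            totals[(a, b)][0] += (t2 - t1).total_seconds() / 3600
            totals[(a, b)][1] += 1
    return {k: s / n for k, (s, n) in totals.items()}

# Transitions with the largest mean waiting time point to likely bottlenecks.
for (a, b), hours in sorted(mean_transition_hours(event_log).items(),
                            key=lambda kv: -kv[1]):
    print(f"{a} -> {b}: {hours:.1f} h on average")
```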


Author(s):  
Ishak H. A. Meddah ◽  
Khaled Belkadi

MapReduce is a solution for processing large volumes of data: it analyzes and processes data by distributing the computation across a large set of machines. Process mining provides an important bridge between data mining and business process analysis; its techniques allow information to be extracted from event logs. Firstly, the chapter mines small patterns from log traces. These patterns represent the execution traces of a business process. The authors use existing techniques: the patterns are represented by finite state automata, and the final model is the combination of only two types of patterns, represented by regular expressions. Secondly, the authors compute these patterns in parallel and then combine them using MapReduce. There are two phases: in the map step the authors mine patterns from execution traces, and in the reduce step they combine these small patterns. The results are promising; they show that the approach is scalable, general, and precise, and that the use of MapReduce minimizes the execution time.
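A minimal in-memory sketch of the map/reduce split described above, assuming traces are strings of activity symbols and that the two pattern families are the regular expressions (ab)* and (ab*c)* named in the companion abstracts; the trace data and helper names are illustrative, not the authors' implementation.

```python
import re
from collections import Counter
from functools import reduce

# The two small pattern families used in the companion abstracts.
PATTERNS = {"(ab)*": re.compile(r"^(ab)*$"),
            "(ab*c)*": re.compile(r"^(ab*c)*$")}

def map_trace(trace):
    """Map step: emit (pattern, 1) for each pattern the trace matches."""
    return [(name, 1) for name, rx in PATTERNS.items() if rx.fullmatch(trace)]

def reduce_counts(acc, pairs):
    """Reduce step: combine partial pattern counts into one model."""
    for name, n in pairs:
        acc[name] += n
    return acc

# Hypothetical execution traces encoded as activity symbols.
traces = ["abab", "ac", "abbbc", "ab", "abcac"]

mapped = [map_trace(t) for t in traces]           # would run in parallel
model = reduce(reduce_counts, mapped, Counter())  # combination of small patterns
print(model)   # e.g. Counter({'(ab*c)*': 3, '(ab)*': 2})
```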


2016 ◽  
Vol 3 (4) ◽  
pp. 21-31 ◽  
Author(s):  
Ishak Meddah ◽  
Belkadi Khaled

Process mining provides an important bridge between data mining and business process analysis; its techniques allow information to be extracted from event logs. In general, there are two steps in process mining: correlation definition (or discovery) and then process inference (or composition). Firstly, the authors' work consists of mining small patterns from the log traces of two applications, SKYPE and VIBER; these patterns represent the execution traces of a business process. In this step, the authors use existing techniques: the patterns are represented by finite state automata or their regular expressions, and the final model is the combination of only two types of small patterns, represented by the regular expressions (ab)* and (ab*c)*. Secondly, the authors compute these patterns in parallel and then combine the small patterns using composition rules. There are two parts: the first is the mining, in which patterns are discovered from execution traces, and the second is the combination of these small patterns. Both pattern mining and composition are illustrated with existing automaton techniques. The execution traces are the different actions performed by users in SKYPE and VIBER. The results are general and precise; the approach minimizes the execution time and the loss of information.
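To make the finite-state-automaton representation concrete, here is a small sketch of deterministic automata for the two pattern families (ab)* and (ab*c)*; the state names and trace strings are illustrative and are not taken from the SKYPE or VIBER logs.

```python
# Deterministic finite automata for the two small pattern families.
# Any transition not listed is a rejecting (dead) move.
DFA_AB = {            # accepts (ab)*
    ("q0", "a"): "q1",
    ("q1", "b"): "q0",
}
DFA_ABC = {           # accepts (ab*c)*
    ("q0", "a"): "q1",
    ("q1", "b"): "q1",
    ("q1", "c"): "q0",
}

def accepts(dfa, trace, start="q0", accepting=("q0",)):
    """Run the DFA over a trace of activity symbols."""
    state = start
    for symbol in trace:
        key = (state, symbol)
        if key not in dfa:
            return False
        state = dfa[key]
    return state in accepting

for trace in ["abab", "abbbc", "ac", "abc"]:
    print(trace, accepts(DFA_AB, trace), accepts(DFA_ABC, trace))
```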


Author(s):  
Ishak H.A. Meddah ◽  
Khaled Belkadi ◽  
Mohamed Amine Boudia

Hadoop MapReduce was introduced to solve the problem of processing big data in parallel: with this framework, the authors analyze and process large volumes of data by distributing the work across a cluster or large set of machines in two main steps, the map step and the reduce step. They apply the MapReduce framework to problems in the domain of process mining, which provides a bridge between data mining and business process analysis and consists of mining information from process traces. In process mining, there are two steps: correlation definition and process inference. The work first consists of mining, from execution traces, patterns that make up the workflow of the process; these patterns represent the work, or history, of each part of the process. The small patterns are represented in this work by finite state automata or their regular expressions, and only two patterns are used to keep the process simple; the general representation of the process is the combination of the small mined patterns. The patterns are represented by the regular expressions (ab)* and (ab*c)*. Secondly, the authors compute the patterns and combine them using the Hadoop MapReduce framework. This work has two general steps: first, the map step, in which small patterns or small models are mined from the business process, and second, the combination of models as the reduce step. The authors use the business processes of two web applications, SKYPE and VIBER. The general result shows that the parallel distributed processing using the Hadoop MapReduce framework is scalable and minimizes the execution time.
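A sketch of how such a pattern-counting job could be expressed with Hadoop Streaming, assuming one execution trace per input line on stdin; the file names, invocation, and classification logic are illustrative, not the authors' actual implementation.

```python
#!/usr/bin/env python3
# Hadoop Streaming-style sketch (mapper and reducer in one file for brevity).
# Illustrative invocation: hadoop jar hadoop-streaming.jar \
#   -mapper "patterns.py map" -reducer "patterns.py reduce" \
#   -input traces.txt -output pattern_counts
import re
import sys

PATTERNS = {"(ab)*": re.compile(r"^(ab)*$"),
            "(ab*c)*": re.compile(r"^(ab*c)*$")}

def mapper():
    # Map step: one execution trace per input line -> emit "pattern\t1".
    for line in sys.stdin:
        trace = line.strip()
        for name, rx in PATTERNS.items():
            if rx.fullmatch(trace):
                print(f"{name}\t1")

def reducer():
    # Reduce step: input arrives sorted by key; sum the counts per pattern.
    current, total = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = key, 0
        total += int(value)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```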


Author(s):  
Ishak H. A. Meddah ◽  
Khaled Belkadi

Process mining provides an important bridge between data mining and business process analysis. This technique allows for the extraction of information from event logs. In general, there are two steps in process mining: correlation definition (or discovery) and then process inference (or composition). Firstly, the authors mine small patterns from the log traces of two applications; these patterns represent the execution traces of a business process. In this step, the authors use existing techniques. The patterns are represented by finite state automata or their regular expressions. The final model is the combination of only two types of small patterns, represented by the regular expressions (ab)* and (ab*c)*. Secondly, the authors compute these patterns in parallel and then combine the small patterns using composition rules. There are two parts: the first is the mining, where the authors discover patterns from execution traces, and the second is the combination of these small patterns. Both pattern mining and composition are illustrated with existing automaton techniques.
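The composition of small patterns into a final model can be pictured as combining the two pattern regexes into one expression. The sketch below does so by alternation, which is only one plausible reading of the composition rules, not the authors' exact definition; the traces are illustrative.

```python
import re

# The two small pattern families from the abstract.
SMALL_PATTERNS = [r"(ab)*", r"(ab*c)*"]

def compose_by_alternation(patterns):
    """Combine small patterns into one model: a trace is covered by the final
    model if any small pattern covers it (an illustrative composition rule)."""
    return re.compile("^(" + "|".join(patterns) + ")$")

model = compose_by_alternation(SMALL_PATTERNS)
for trace in ["abab", "abbbc", "aabb"]:
    print(trace, bool(model.fullmatch(trace)))
```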


Author(s):  
Ishak H. A. Meddah ◽  
Khaled Belkadi

Processing large data is proving increasingly difficult along several axes, and the MapReduce framework is a solution to this problem: with it, vast amounts of data can be analyzed and processed by distributing the computational work across a cluster of virtual servers running in a cloud, or across any large set of machines. Process mining, in turn, provides an important bridge between data mining and business process analysis; its techniques allow information to be extracted from event logs. In general, there are two steps in process mining: correlation definition (or discovery) and process inference (or composition). Firstly, the authors' work consists of mining small patterns from log traces. These patterns represent the execution traces recorded in the log file of a business process. In this step, they use existing techniques. The patterns are represented by finite state automata or their regular expressions. The final model is the combination of only two types of small patterns, represented by the regular expressions (ab)* and (ab*c)*. Secondly, the authors compute these patterns in parallel and then combine the small patterns using the MapReduce framework. There are two phases: the first is the map step, in which patterns are mined from execution traces; the second is the combination of these small patterns as the reduce step. The authors' results are promising in that they show the approach is scalable, general, and precise, and that the use of the MapReduce framework minimizes the execution time.
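To illustrate how the map step parallelizes naturally, here is a sketch using Python's multiprocessing pool as a single-machine stand-in for a MapReduce cluster; the traces are hypothetical and a real deployment would run on Hadoop rather than a local pool.

```python
import re
from collections import Counter
from multiprocessing import Pool

PATTERNS = {"(ab)*": re.compile(r"^(ab)*$"),
            "(ab*c)*": re.compile(r"^(ab*c)*$")}

def classify(trace):
    """Map step for one trace: which small patterns cover it."""
    return [name for name, rx in PATTERNS.items() if rx.fullmatch(trace)]

if __name__ == "__main__":
    # Hypothetical execution traces; a real log would be far larger.
    traces = ["abab", "ac", "abbbc", "ab", "abcac"] * 1000

    with Pool() as pool:                      # local stand-in for the cluster
        mapped = pool.map(classify, traces)   # map step, run in parallel

    model = Counter(name for names in mapped for name in names)  # reduce step
    print(model)
```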

