Conformance Checking Process Mining of the SAP SD (Sales and Distribution) Module Using the Heuristic Miner Method

Sisfo, 2020, Vol 09 (02)
Author(s): Alexander Hestu Kusuma, Gunawan Gunawan, Joan Santoso

2017, Vol 01 (01), pp. 1630004
Author(s): Asef Pourmasoumi, Ebrahim Bagheri

One of the most valuable assets of an organization is its organizational data. The analysis and mining of this potential hidden treasure can yield significant added value for the organization. Process mining is an emerging area that can help organizations understand the status quo, check for compliance, and plan for improving their processes. The aim of process mining is to extract knowledge from the event logs of today’s organizational information systems. Process mining includes three main types: discovering process models from event logs, conformance checking, and organizational mining. In this paper, we briefly introduce process mining and review some of its most important techniques. We also investigate some applications of process mining in industry and present some of the most important challenges faced in this area.
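To make the discovery and conformance-checking steps concrete, the following minimal sketch uses the open-source pm4py Python library; the file name orders.xes is a hypothetical placeholder for an exported event log, and the calls assume pm4py's simplified 2.x interface.

```python
# Minimal sketch of discovery followed by conformance checking with pm4py.
# "orders.xes" is a hypothetical event log file; adjust to your own export.
import pm4py

# Load the event log exported from the organizational information system.
log = pm4py.read_xes("orders.xes")

# Discover a Petri net process model from the log (Inductive Miner).
net, initial_marking, final_marking = pm4py.discover_petri_net_inductive(log)

# Conformance checking: token-based replay of the log on the discovered model.
fitness = pm4py.fitness_token_based_replay(log, net, initial_marking, final_marking)
print(fitness)  # includes an overall log fitness value between 0 and 1
```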


2021, Vol 11 (22), pp. 10686
Author(s): Syeda Amna Sohail, Faiza Allah Bukhsh, Maurice van Keulen

Healthcare providers are legally bound to ensure the privacy preservation of healthcare metadata. Privacy research usually focuses on providing technical and inter-/intra-organizational solutions in a fragmented manner. As a result, an overarching evaluation of the fundamental (technical, organizational, and third-party) privacy-preserving measures in healthcare metadata handling is missing. This research work therefore provides a multilevel privacy assurance evaluation of the privacy-preserving measures of the Dutch healthcare metadata landscape. The normative and empirical evaluation comprises content analysis as well as process mining discovery and conformance checking techniques applied to real-world healthcare datasets. For clarity, we illustrate our evaluation findings using conceptual modeling frameworks, namely e3-value modeling and the REA ontology. These frameworks highlight the financial aspect of metadata sharing with a clear description of the vital stakeholders, their mutual interactions, and the respective exchange of information resources. The frameworks are further verified using experts’ opinions. Based on our empirical and normative evaluations, we provide a multilevel privacy assurance evaluation indicating where the level of privacy increases or decreases. Furthermore, we verify that the privacy-utility trade-off is crucial in shaping these increases and decreases, because data utility in healthcare is vital for efficient, effective healthcare services and the financial facilitation of healthcare enterprises.


2021, Vol 4
Author(s): Rashid Zaman, Marwan Hassani, Boudewijn F. Van Dongen

In the context of process mining, event logs consist of process instances called cases. Conformance checking is a process mining task that inspects whether a log file is conformant with an existing process model and additionally quantifies the conformance in an explainable manner. Online conformance checking processes streaming event logs, maintaining precise insight into running cases and mitigating non-conformance, if any, in a timely manner. State-of-the-art online conformance checking approaches bound memory by either delimiting the storage of events per case or limiting the number of cases to a specific window width. The former technique still requires unbounded memory, as the number of cases to store is unlimited, while the latter technique forgets running, not yet concluded, cases to conform to the limited window width. Consequently, the processing system may later encounter events that represent some intermediate activity as per the process model and for which the relevant case has been forgotten; we refer to these as orphan events. The naïve approach to cope with an orphan event is to either neglect its relevant case for conformance checking or treat it as an altogether new case. However, this might result in misleading process insights, for instance, overestimated non-conformance. In order to bound memory yet effectively incorporate orphan events into processing, we propose a missing-prefix imputation approach for such orphan events. Our approach utilizes the existing process model for imputing the missing prefix. Furthermore, we leverage the case storage management to increase the accuracy of the prefix prediction. We propose a systematic forgetting mechanism that distinguishes and forgets the cases that can be reliably regenerated as a prefix upon receipt of a future orphan event. We evaluate the efficacy of our proposed approach through multiple experiments with synthetic and three real event logs while simulating a streaming setting. Our approach achieves considerably more realistic conformance statistics than the state of the art while requiring the same storage.
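As a rough conceptual sketch only (not the authors' implementation), the snippet below illustrates the idea of imputing a missing prefix for an orphan event from a process model; the model is simplified to a directed graph of activities, and all names and the BFS-based shortest prefix are illustrative assumptions.

```python
# Conceptual sketch of missing-prefix imputation for orphan events.
from collections import deque

# Hypothetical process model: activity -> possible next activities.
MODEL = {
    "start": ["register"],
    "register": ["check stock", "cancel"],
    "check stock": ["ship", "cancel"],
    "ship": ["invoice"],
    "invoice": ["end"],
    "cancel": ["end"],
}

def impute_prefix(orphan_activity, model=MODEL, source="start"):
    """Return the shortest activity sequence leading up to the orphan activity."""
    queue = deque([[source]])
    while queue:
        path = queue.popleft()
        if path[-1] == orphan_activity:
            return path[:-1]              # the prefix preceding the orphan event
        for nxt in model.get(path[-1], []):
            if nxt not in path:           # avoid cycles
                queue.append(path + [nxt])
    return []                             # activity not reachable in the model

# An orphan event for activity "ship" arrives for an already forgotten case:
case_store = {}
case_store["case-42"] = impute_prefix("ship") + ["ship"]
print(case_store["case-42"])  # ['start', 'register', 'check stock', 'ship']
```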


2018, Vol 27 (02), pp. 1850002
Author(s): Sung-Hyun Sim, Hyerim Bae, Yulim Choi, Ling Liu

In Big Data and IoT environments, process execution generates huge volumes of data, some of which is obtained from sensors. The main issue in such areas has been the necessity of analyzing the data in order to suggest process enhancements. In this regard, evaluation of a process model’s conformance to the execution log is of great importance. For this purpose, previous process mining approaches have advocated conformance checking by a fitness measure, which uses token replay and node-arc relations based on Petri nets. However, the fitness measure has so far not considered statistical significance; it offers only a numeric ratio. We herein propose a statistical verification method based on the Kolmogorov–Smirnov (K–S) test to judge whether two different log datasets follow the same process model. Our method can easily be extended to determine whether process execution actually follows a process model, by playing out the model and generating event log data from it. Additionally, in order to address the trade-off between model abstraction and process conformance, we also propose the new concepts of Confidence Interval of Abstraction Value (CIAV) and Maximum Confidence Abstraction Value (MCAV). We showed that our method can be applied to any process mining algorithm (e.g. heuristic mining, fuzzy mining) that has parameters related to model abstraction. We expect that our method will be widely utilized in many applications dealing with business process enhancement involving process-model and execution-log analyses.
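As a minimal illustration of the statistical idea, the sketch below applies a two-sample Kolmogorov–Smirnov test to a per-trace numeric feature of two event logs; using trace length as that feature is an assumption made for illustration and need not match the statistic used in the paper.

```python
# Illustrative two-sample K-S test comparing two event logs via trace length.
from scipy.stats import ks_2samp

def trace_lengths(log):
    """log: list of traces, each trace a list of activity names."""
    return [len(trace) for trace in log]

log_a = [["register", "check stock", "ship", "invoice"],
         ["register", "cancel"],
         ["register", "check stock", "cancel"]]
log_b = [["register", "check stock", "ship", "invoice"],
         ["register", "check stock", "ship", "invoice"],
         ["register", "cancel"]]

statistic, p_value = ks_2samp(trace_lengths(log_a), trace_lengths(log_b))
# A high p-value means we cannot reject that both logs follow the same distribution.
print(f"K-S statistic = {statistic:.3f}, p-value = {p_value:.3f}")
```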


2021, pp. 73-82
Author(s): Dorina Bano, Tom Lichtenstein, Finn Klessascheck, Mathias Weske

Process mining is widely adopted in organizations to gain deep insights into running business processes. This can be achieved by applying different process mining techniques like discovery, conformance checking, and performance analysis. These techniques are applied to event logs, which need to be extracted from the organization’s databases beforehand. This not only implies access to the databases, but also detailed knowledge about the database schema, which is often not available. In many real-world scenarios, however, process execution data is available as redo logs. Such logs are used to bring a database into a consistent state in case of a system failure. This paper proposes a semi-automatic approach to extract an event log from redo logs alone. It does not require access to the database or knowledge of the database schema. The feasibility of the proposed approach is evaluated on two synthetic redo logs.
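Purely as a hypothetical illustration (not the authors' method), the sketch below maps textual redo-log entries to events carrying a case identifier, an activity label, and a timestamp; the line format, the regular expression, and the use of order_id as the case notion are all assumptions, and real redo logs are typically binary and far richer.

```python
# Hypothetical sketch: turning textual redo-log entries into an event log.
import re
from datetime import datetime

REDO_LINE = re.compile(
    r"(?P<ts>\S+ \S+)\s+(?P<op>INSERT|UPDATE|DELETE)\s+(?:INTO\s+)?"
    r"(?P<table>\w+).*order_id=(?P<case>\d+)"
)

def redo_to_events(lines):
    events = []
    for line in lines:
        m = REDO_LINE.search(line)
        if not m:
            continue
        events.append({
            "case_id": m.group("case"),
            "activity": f'{m.group("op")} {m.group("table")}',  # e.g. "INSERT orders"
            "timestamp": datetime.fromisoformat(m.group("ts").replace(" ", "T")),
        })
    # An event log groups events by case and orders them by time.
    return sorted(events, key=lambda e: (e["case_id"], e["timestamp"]))

sample = [
    "2021-03-01 10:00:00 INSERT INTO orders SET order_id=7, status='new'",
    "2021-03-01 10:05:00 UPDATE orders SET status='shipped' WHERE order_id=7",
]
for event in redo_to_events(sample):
    print(event)
```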


2021
Author(s): Ashok Kumar Saini, Ruchi Kamra, Utpal Shrivastava

Conformance Checking (CC) techniques enable us to identify deviations between modelled behavior and actual execution behavior. The majority of organizations have Process-Aware Information Systems for recording the insights of the system, and they have a process model that shows how the process should be executed. The key intention of Process Mining is to extract facts from the event log and use them for the analysis, ratification, improvement, and redesign of a process. Researchers have proposed various CC techniques for specific applications and process models. This paper presents a detailed study of the key concepts and contributions of Process Mining and of how it helps in achieving business goals. The current challenges and opportunities in Process Mining are also discussed. The survey covers CC techniques proposed by researchers, considering key objectives such as quality parameters, perspective, algorithm type, tools, and achievements.


2018, Vol 466, pp. 55-91
Author(s): Wai Lam Jonathan Lee, H.M.W. Verbeek, Jorge Munoz-Gama, Wil M.P. van der Aalst, Marcos Sepúlveda

Energies, 2020, Vol 13 (24), pp. 6630
Author(s): Marcin Szpyrka, Edyta Brzychczy, Aneta Napieraj, Jacek Korski, Grzegorz J. Nalepa

Conformance checking is a process mining technique that compares a process model with an event log of the same process to check whether the current execution stored in the log conforms to the model and vice versa. This paper deals with the conformance checking of a longwall shearer process. The approach uses place-transition Petri nets with inhibitor arcs for modeling purposes. We use event log files collected from a few coal mines located in Poland by Famur S.A., one of the global suppliers of coal mining machines. One of the main advantages of the approach is the possibility for both offline and online analysis of the log data. The paper presents a detailed description of the longwall process, an original formal model we developed, selected elements of the approach’s implementation and the results of experiments.
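As a toy sketch only (not the paper's formal model of the longwall process), the snippet below shows the enabling and firing rule of a place-transition net extended with inhibitor arcs, which require their connected place to be empty; all place and transition names are illustrative assumptions.

```python
# Toy sketch of the firing rule for a place-transition Petri net with inhibitor arcs.

# Marking: number of tokens per place.
marking = {"p_idle": 1, "p_cutting": 0, "p_fault": 0}

# Transitions: input arcs consume tokens, output arcs produce tokens,
# inhibitor arcs require the connected place to be EMPTY for enabling.
transitions = {
    "start_cutting": {"in": ["p_idle"], "out": ["p_cutting"], "inhibit": ["p_fault"]},
    "stop_cutting":  {"in": ["p_cutting"], "out": ["p_idle"], "inhibit": []},
}

def enabled(t, m):
    spec = transitions[t]
    has_tokens = all(m[p] >= 1 for p in spec["in"])
    not_inhibited = all(m[p] == 0 for p in spec["inhibit"])
    return has_tokens and not_inhibited

def fire(t, m):
    assert enabled(t, m), f"transition {t} is not enabled"
    spec = transitions[t]
    for p in spec["in"]:
        m[p] -= 1
    for p in spec["out"]:
        m[p] += 1
    return m

print(enabled("start_cutting", marking))   # True: p_idle is marked, p_fault is empty
fire("start_cutting", marking)
print(marking)                             # {'p_idle': 0, 'p_cutting': 1, 'p_fault': 0}
```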

