Rethinking business process simulation: A utility-based evaluation framework


Özdemir, Konrad ; Kirchdorfer, Lukas ; Amiri Elyasi, Keyvan ; van der Aa, Han ; Stuckenschmidt, Heiner



DOI: https://doi.org/10.1007/978-3-032-02929-4_8
URL: https://www.springerprofessional.de/en/rethinking-...
Document type: Conference publication
Year of publication: 2026
Book title: Business Process Management Forum : BPM 2025 Forum, Seville, Spain, August 31 - September 5, 2025, Proceedings
Journal or series title: Lecture Notes in Business Information Processing : LNBIP
Volume: 564
Event title: Business Process Management Forum (BPM 2025 Forum)
Event location: Seville, Spain
Event date: 31.08.-05.09.2025
Editors: Senderovich, Arik ; Cabanillas, Cristina ; Vanderfeesten, Irene ; Reijers, Hajo A.
Place of publication: Berlin [et al.]
Publisher: Springer
ISBN: 978-3-032-02928-7 , 978-3-032-02929-4
ISSN: 1865-1348 , 1865-1356
Language of publication: English
Institution: School of Business Informatics and Mathematics > Practical Computer Science II: Artificial Intelligence (Stuckenschmidt 2009-)
Subject area: 004 Computer science
Keywords (English): process simulation , process mining , deep learning
Abstract: Business process simulation (BPS) is a key tool for analyzing and optimizing organizational workflows, supporting decision-making by estimating the impact of process changes. The reliability of such estimates depends on the ability of a BPS model to accurately mimic the process under analysis, making rigorous accuracy evaluation essential. However, the state-of-the-art approach to evaluating BPS models has two key limitations. First, it treats simulation as a forecasting problem, testing whether models can predict unseen future events. This fails to assess how well a model captures the as-is process, particularly when process behavior changes from train to test period. Thus, it becomes difficult to determine whether poor results stem from an inaccurate model or the inherent complexity of the data, such as unpredictable drift. Second, the evaluation approach strongly relies on Earth Mover’s Distance-based metrics, which can obscure temporal patterns and thus yield misleading conclusions about simulation quality. To address these issues, we propose a novel framework that evaluates simulation quality based on its ability to generate representative process behavior. Instead of comparing simulated logs to future real-world executions, we evaluate whether predictive process monitoring models trained on simulated data perform comparably to those trained on real data for downstream analysis tasks. Empirical results show that our framework not only helps identify sources of discrepancies but also distinguishes between model accuracy and data complexity, offering a more meaningful way to assess BPS quality.




This entry is part of the university bibliography.



