Time Series Foundation Models for Process Model Forecasting

Reading time: 5 minutes

📝 Original Info

  • Title: Time Series Foundation Models for Process Model Forecasting
  • ArXiv ID: 2512.07624
  • Date: 2025-12-08
  • Authors: Yongbo Yu, Jari Peeperkorn, Johannes De Smedt, Jochen De Weerdt

📝 Abstract

Process Model Forecasting (PMF) aims to predict how the control-flow structure of a process evolves over time by modeling the temporal dynamics of directly-follows (DF) relations, complementing predictive process monitoring that focuses on single-case prefixes. Prior benchmarks show that machine learning and deep learning models provide only modest gains over statistical baselines, mainly due to the sparsity and heterogeneity of the DF time series. We investigate Time Series Foundation Models (TSFMs), large pre-trained models for generic time series, as an alternative for PMF. Using DF time series derived from real-life event logs, we compare zero-shot use of TSFMs, without additional training, with fine-tuned variants adapted on PMF-specific data. TSFMs generally achieve lower forecasting errors (MAE and RMSE) than traditional and specialized models trained from scratch on the same logs, indicating effective transfer of temporal structure from non-process domains. While fine-tuning can further improve accuracy, the gains are often small and may disappear on smaller or more complex datasets, so zero-shot use remains a strong default. Our study highlights the generalization capability and data efficiency of TSFMs for process-related time series and, to the best of our knowledge, provides the first systematic evaluation of temporal foundation models for PMF.
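The abstract compares models by MAE and RMSE on DF time series. As a minimal illustration of what those two error metrics measure, here is a plain-Python sketch; the function names and sample frequencies are illustrative, not taken from the paper's pipeline:

```python
import math

def mae(actual, forecast):
    # Mean absolute error over one forecast horizon.
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    # Root mean squared error; penalizes large misses more than MAE.
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

# Hypothetical weekly frequencies of one directly-follows relation.
actual = [12, 15, 9, 14]
zero_shot = [11, 14, 10, 12]  # e.g. a TSFM's zero-shot forecast

print(mae(actual, zero_shot))   # → 1.25
print(rmse(actual, zero_shot))  # → ~1.32
```

Because RMSE squares the residuals, a model that occasionally misses a demand spike badly will score worse on RMSE than on MAE, which is why the paper reports both.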


📄 Full Content

Yongbo Yu, Jari Peeperkorn, Johannes De Smedt, and Jochen De Weerdt
Research Center for Information Systems Engineering (LIRIS), KU Leuven, Belgium
{FirstName}.{LastName}@kuleuven.be

Keywords: Process Model Forecasting · Process Mining · Time Series Forecasting · Foundation Models

1 Introduction

Business Process Management (BPM) involves designing, executing, monitoring, and improving operational processes. With the increasing availability of event logs and the development of data-driven techniques, Process Mining (PM) has become an essential discipline for monitoring, analyzing, and enhancing real process behavior from execution data. Within PM, Predictive Process Monitoring (PPM) leverages machine learning to predict future process behaviors [12,53], such as the next activity, remaining time, or outcome of an ongoing case. Despite notable progress, PPM mainly focuses on instance-level predictions and therefore offers limited insights into how the overall process structure evolves over time.

Process Model Forecasting (PMF) has been proposed to address this limitation by predicting system-level dynamics [19], i.e. how the process model itself changes over time. Existing approaches represent process dynamics as time-indexed directly-follows graphs (DFGs) derived from event logs, where each DFG summarizes the control-flow relations observed in a specific time window. Each directly-follows (DF) relation can then be seen as a variable whose frequency evolves over time, so that DF frequencies together form a multivariate time series. Historical DF time series are used to forecast future DF frequencies, which are reassembled into a forecasted DFG that represents the anticipated process model at future time points. Recent work has explored multivariate machine learning and deep learning approaches for PMF [61,69], and introduced a unified benchmark pipeline [62] for comparing forecasting methods. These benchmarks show that univariate approaches overall outperform multivariate ones and highlight the particularities of DF time series, including sparsity, heterogeneous seasonal and cyclical effects within the same event log, and patterns that are difficult to capture with a single model configuration.

In parallel, foundation models have transformed learning paradigms across domains. Trained on large and diverse datasets with self-supervised objectives, they provide general-purpose representations that can be adapted to many downstream tasks with limited task-specific training [8]. In PM, Large Language Models (LLMs) have been applied to, among others, interpret business processes [37] and generate suffix predictions [48,49], showing their potential for semantic understanding. However, temporal foundation models remain largely unexplored within this context. Time Series Foundation Models (TSFMs) such as Chronos [5], MOIRAI [59], and TimesFM [18] extend the foundation model paradigm to temporal data. Trained on vast collections of heterogeneous time series across domains, TSFMs learn generic temporal representations that enable strong zero-shot forecasting, i.e. accurate predictions for unseen datasets without additional training. Recent work has further adapted TSFMs to specialized domains, for example healthcare signals [27] and energy dispatch [6], often through parameter-efficient fine-tuning (PEFT) techniques that provide performance gains on out-of-domain data. Since process model evolution can be represented as structured ti
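The introduction's step from event log to DF time series (count each directly-follows pair per time window, one sparse series per relation) can be sketched as follows; the log format, the `window_of` helper, and the toy data are assumptions for illustration, not the paper's actual preprocessing:

```python
from collections import Counter, defaultdict

def df_time_series(event_log, window_of):
    """Turn an event log into per-window directly-follows (DF) counts.

    event_log: {case_id: [(activity, timestamp), ...]}, with events
    already sorted by timestamp within each case.
    window_of: maps a timestamp to a discrete time-window index.
    Returns {(a, b): Counter({window: frequency})}: one sparse time
    series per DF relation, ready to be forecast and reassembled
    into a forecasted DFG.
    """
    series = defaultdict(Counter)
    for events in event_log.values():
        for (a, _t1), (b, t2) in zip(events, events[1:]):
            # Attribute each DF occurrence to the window of the second event.
            series[(a, b)][window_of(t2)] += 1
    return series

# Toy log: two cases, integer timestamps, weekly windows of size 7.
log = {
    "c1": [("A", 1), ("B", 3), ("C", 9)],
    "c2": [("A", 2), ("B", 8)],
}
ts = df_time_series(log, window_of=lambda t: t // 7)
print(dict(ts[("A", "B")]))  # → {0: 1, 1: 1}
```

Each `Counter` here is exactly one of the sparse univariate series the benchmarks describe: many windows have zero occurrences, which is the sparsity that makes these series hard for models trained from scratch.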


Reference

This content is AI-processed based on open access ArXiv data.
