
Nov 05, 2025

Using Graph Transformers to Predict Business Process Completion Times

This work presents PGTNet, a new deep learning approach to Predictive Process Monitoring (PPM) that targets the problem of predicting the remaining time until completion of running business process instances. Current state-of-the-art techniques, such as LSTMs, Transformers, and Graph Neural Networks (GNNs), struggle to address three major issues at once: learning intricate control-flow relationships (such as loops and parallelism), incorporating multiple process perspectives (such as case attributes), and capturing long-range dependencies in event sequences. To overcome these limitations, PGTNet first converts event logs into a graph-oriented representation and then trains a Process Graph Transformer Network on the resulting graphs.
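To make the idea of a graph-oriented representation concrete, here is a minimal sketch of converting an event-log prefix of a running case into a graph whose nodes are activities and whose directed edges are directly-follows relations with simple numeric features. This is only an illustration under generic assumptions; PGTNet's actual encoding follows the paper, and names such as `prefix_to_graph` are hypothetical.

```python
# Illustrative only: build a directly-follows graph from an event-log prefix.
# Nodes = distinct activities; edges = directly-follows pairs with
# [frequency, mean elapsed seconds] as edge features.
from collections import defaultdict

def prefix_to_graph(prefix):
    """prefix: list of (activity, timestamp-in-seconds) events observed so far."""
    activities = sorted({act for act, _ in prefix})
    node_index = {act: i for i, act in enumerate(activities)}

    # Aggregate directly-follows edges: occurrence count and total inter-event time.
    edge_stats = defaultdict(lambda: [0, 0.0])  # (src, dst) -> [count, total_delta]
    for (a1, t1), (a2, t2) in zip(prefix, prefix[1:]):
        key = (node_index[a1], node_index[a2])
        edge_stats[key][0] += 1
        edge_stats[key][1] += t2 - t1

    edge_index = list(edge_stats.keys())
    edge_features = [[cnt, total / cnt] for cnt, total in edge_stats.values()]

    return {
        "nodes": activities,             # one node per distinct activity
        "edge_index": edge_index,        # directed directly-follows pairs
        "edge_features": edge_features,  # [frequency, mean elapsed seconds]
    }

if __name__ == "__main__":
    # Toy prefix of a running case: (activity, timestamp in seconds).
    prefix = [("Register", 0), ("Check", 3600), ("Check", 7200), ("Approve", 10800)]
    graph = prefix_to_graph(prefix)
    print(graph["nodes"])
    print(graph["edge_index"], graph["edge_features"])
```

A graph built this way can then be fed to a graph transformer, which attends over all node pairs and thus captures loops, parallel branches, and long-range dependencies more directly than a purely sequential model.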

Source: HackerNoon →
