Nov 05, 2025

Using Graph Transformers to Predict Business Process Completion Times

This work presents PGTNet, a new deep learning method for predictive process monitoring (PPM) that targets a crucial task: predicting how long running business process instances will take to complete. Current state-of-the-art techniques, such as LSTMs, Transformers, and Graph Neural Networks (GNNs), struggle to address three major issues at once: learning intricate control-flow relationships (such as loops and parallelism), incorporating multiple process perspectives (such as case attributes), and capturing long-range dependencies in event sequences. To overcome these limitations, PGTNet first converts event logs into a graph-oriented representation and then trains a Process Graph Transformer Network on the resulting graphs.
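To make the first step concrete, here is a minimal sketch of turning an event log into per-case graphs. This is an illustrative directly-follows construction under assumed field names (`case_id`, `activity`, timestamp), not PGTNet's actual encoding, which uses richer node and edge features:

```python
from collections import defaultdict
from datetime import datetime

def event_log_to_graphs(events):
    """Group events by case and build a directly-follows graph per case.

    events: iterable of (case_id, activity, timestamp) tuples.
    Returns {case_id: (nodes, edges)} where nodes maps activity -> count
    and edges maps (src, dst) -> list of inter-event times in seconds.
    Illustrative only; PGTNet's graph representation is more elaborate.
    """
    cases = defaultdict(list)
    for case_id, activity, ts in events:
        cases[case_id].append((activity, ts))

    graphs = {}
    for case_id, trace in cases.items():
        trace.sort(key=lambda e: e[1])  # order each trace by timestamp
        nodes, edges = defaultdict(int), defaultdict(list)
        for (a, t_a), (b, t_b) in zip(trace, trace[1:]):
            nodes[a] += 1
            # edge feature: elapsed time between consecutive events
            edges[(a, b)].append((t_b - t_a).total_seconds())
        nodes[trace[-1][0]] += 1  # count the final activity as well
        graphs[case_id] = (dict(nodes), dict(edges))
    return graphs

# Hypothetical three-event trace for a single case
log = [
    ("c1", "register", datetime(2024, 1, 1, 9, 0)),
    ("c1", "review",   datetime(2024, 1, 1, 10, 0)),
    ("c1", "approve",  datetime(2024, 1, 1, 12, 0)),
]
nodes, edges = event_log_to_graphs(log)["c1"]
```

A graph built this way exposes loops (repeated edges) and timing directly in its structure, which is the kind of control-flow and temporal information a sequence model would otherwise have to infer from raw event order.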

Source: HackerNoon
