
Nov 05, 2025

Using Graph Transformers to Predict Business Process Completion Times

This work presents PGTNet, a new deep learning approach to predictive process monitoring (PPM) that focuses on predicting the completion times of running business process instances. Current state-of-the-art techniques, such as LSTMs, Transformers, and Graph Neural Networks (GNNs), struggle to address three major challenges at once: learning intricate control-flow relations (such as loops and parallelism), incorporating multiple process perspectives (such as case attributes), and capturing long-range dependencies in event sequences. To overcome these limitations, PGTNet first converts event logs into a graph-oriented representation and then trains a Process Graph Transformer Network on it.
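
To make the pipeline concrete, here is a minimal sketch (not the authors' code) of the two steps the summary describes: turning an event-log trace prefix into a graph-oriented representation, and passing node features through a toy self-attention layer with a regression head. The `trace_prefix` data, feature choices, and layer sizes are illustrative assumptions; the real PGTNet additionally uses edge and positional encodings and multiple process perspectives.

```python
from collections import defaultdict
import torch
import torch.nn as nn

# Hypothetical trace prefix: (activity, hours since case start)
trace_prefix = [("Register", 0.0), ("Check", 2.5), ("Check", 5.0), ("Approve", 6.0)]

def prefix_to_graph(prefix):
    """Build a directly-follows graph: nodes are activities, edges carry
    frequency and mean elapsed time between consecutive events."""
    nodes = sorted({act for act, _ in prefix})
    idx = {a: i for i, a in enumerate(nodes)}
    edge_stats = defaultdict(lambda: [0, 0.0])  # (src, dst) -> [count, total_dt]
    for (a1, t1), (a2, t2) in zip(prefix, prefix[1:]):
        stats = edge_stats[(idx[a1], idx[a2])]
        stats[0] += 1
        stats[1] += t2 - t1
    edges = [(s, d, c, tot / c) for (s, d), (c, tot) in edge_stats.items()]
    return nodes, edges

nodes, edges = prefix_to_graph(trace_prefix)

# Toy transformer block over the graph's nodes: one-hot node features plus
# self-attention, pooled into a single remaining-time estimate. This stands in
# for the graph transformer layers described above, not their actual design.
x = torch.eye(len(nodes))                                     # one-hot activity features
attn = nn.MultiheadAttention(embed_dim=len(nodes), num_heads=1, batch_first=True)
out, _ = attn(x.unsqueeze(0), x.unsqueeze(0), x.unsqueeze(0))  # attention across nodes
remaining_time = nn.Linear(len(nodes), 1)(out.mean(dim=1))     # pooled regression head
print(nodes, edges, remaining_time.item())
```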

Source: HackerNoon

