
8 hours ago

Using Graph Transformers to Predict Business Process Completion Times

This work presents PGTNet, a new deep learning approach to predictive process monitoring (PPM) that targets a crucial task: predicting the remaining runtime of business process instances. Current state-of-the-art techniques, such as LSTMs, Transformers, and Graph Neural Networks (GNNs), struggle to address three major issues at once: learning intricate control-flow relations (such as loops and parallelism), incorporating multiple process perspectives (such as case attributes), and capturing long-range dependencies in event sequences. To overcome these limitations, PGTNet first converts event logs into a graph-oriented representation and then trains a Process Graph Transformer Network on those graphs.
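The article does not spell out PGTNet's exact graph encoding, but the general idea of turning an event log into a graph can be illustrated with one common representation, a directly-follows graph, where nodes are activities and weighted edges count how often one activity immediately follows another. The function and log below are hypothetical illustrations, not the paper's implementation:

```python
from collections import defaultdict

def event_log_to_graph(event_log):
    """Convert an event log (a list of traces, each a list of activity
    names) into a directly-follows graph: nodes are activities, and each
    edge (src, dst) carries the number of times dst directly follows src
    across all traces. Loops show up as self-edges."""
    nodes = set()
    edges = defaultdict(int)  # (src, dst) -> frequency
    for trace in event_log:
        nodes.update(trace)
        for src, dst in zip(trace, trace[1:]):
            edges[(src, dst)] += 1
    return nodes, dict(edges)

# Toy event log: each inner list is one process instance (case).
log = [
    ["Register", "Check", "Approve"],
    ["Register", "Check", "Check", "Reject"],  # repeated "Check" = a loop
]
nodes, edges = event_log_to_graph(log)
```

A graph transformer would then attend over these nodes and edges (plus extra attributes per event) instead of over a flat token sequence, which is what lets it capture loops and parallel branches directly.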

Source: HackerNoon →

