Using Graph Transformers to Predict Business Process Completion Times
This work presents PGTNet, a new deep learning approach to predictive process monitoring (PPM) that focuses on a crucial task: predicting the remaining runtime of business process instances. Current state-of-the-art techniques, such as LSTMs, Transformers, and Graph Neural Networks (GNNs), struggle to address three major issues at once: learning intricate control-flow relationships (such as loops and parallelism), incorporating multiple process perspectives (such as case attributes), and capturing long-range dependencies in event sequences. To overcome these limitations, PGTNet first converts event logs into a graph-oriented representation and then trains a Process Graph Transformer Network on the resulting graphs.
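To give a feel for that conversion step, here is a minimal sketch (not the authors' exact encoding) of how one case from an event log can be turned into a directly-follows graph: unique activities become nodes, consecutive events become directed edges, and the elapsed time between events is attached as an edge feature. The `Event` fields and the choice of edge feature are illustrative assumptions; PGTNet's actual representation also encodes additional process perspectives.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Event:
    activity: str        # e.g. "Register", "Approve"
    timestamp: datetime  # completion time of the event

def trace_to_graph(trace: list[Event]):
    """Convert one case (an ordered list of events) into a directly-follows graph.

    Returns (nodes, edges), where nodes is the list of unique activities and
    edges is a list of (source_index, target_index, elapsed_seconds) tuples.
    This is only an illustrative encoding, not PGTNet's full graph construction.
    """
    nodes: list[str] = []
    index: dict[str, int] = {}
    for ev in trace:
        if ev.activity not in index:
            index[ev.activity] = len(nodes)
            nodes.append(ev.activity)

    edges: list[tuple[int, int, float]] = []
    for prev, curr in zip(trace, trace[1:]):
        elapsed = (curr.timestamp - prev.timestamp).total_seconds()
        edges.append((index[prev.activity], index[curr.activity], elapsed))

    return nodes, edges

# Example: a short case with a repeated activity (a loop in the control flow)
trace = [
    Event("Register", datetime(2024, 1, 1, 9, 0)),
    Event("Check",    datetime(2024, 1, 1, 9, 30)),
    Event("Rework",   datetime(2024, 1, 1, 11, 0)),
    Event("Check",    datetime(2024, 1, 1, 12, 0)),
    Event("Approve",  datetime(2024, 1, 2, 8, 0)),
]
nodes, edges = trace_to_graph(trace)
print(nodes)  # ['Register', 'Check', 'Rework', 'Approve']
print(edges)  # [(0, 1, 1800.0), (1, 2, 5400.0), (2, 1, 3600.0), (1, 3, 72000.0)]
```

Note how the loop (Check → Rework → Check) shows up as a cycle in the graph rather than as a long repeated subsequence, which is part of why a graph-based input is easier for the downstream model to exploit than a flat event sequence.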
Source: HackerNoon →