Local Post-Hoc Explanations for Predictive Process Monitoring in Manufacturing

22 Sep 2020 · Nijat Mehdiyev, Peter Fettke

This study proposes an explainable predictive quality analytics solution that facilitates data-driven decision-making for process planning in manufacturing by combining process mining, machine learning, and explainable artificial intelligence (XAI) methods. After integrating the top-floor and shop-floor data obtained from various enterprise information systems, a deep learning model is applied to predict the process outcomes. Since this study aims to operationalize the delivered predictive insights by embedding them into decision-making processes, it is essential to generate explanations that are relevant to domain experts. To this end, two complementary local post-hoc explanation approaches, Shapley values and Individual Conditional Expectation (ICE) plots, are adopted; examining explanations from these two perspectives is expected to enhance the experts' decision-making capabilities. After assessing the predictive strength of the applied deep neural network with relevant binary classification evaluation measures, a discussion of the generated explanations is provided.
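The abstract does not include code; the following is a minimal sketch of the described pipeline using scikit-learn and the shap library (tooling choices assumed here, not necessarily the authors'), with synthetic data standing in for the integrated top-floor and shop-floor features and a small feed-forward network standing in for the paper's deep learning model. It illustrates the two local post-hoc explanations named in the abstract: Shapley values for a single prediction and an ICE plot for one feature.

```python
# Minimal sketch (not the authors' implementation) of the described workflow:
# train a binary outcome classifier, evaluate it, then generate two local
# post-hoc explanations -- Shapley values and an ICE plot.
import matplotlib.pyplot as plt
import shap
from sklearn.datasets import make_classification
from sklearn.inspection import PartialDependenceDisplay
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic placeholder for the integrated top-floor/shop-floor process features.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Small feed-forward network as a stand-in for the paper's deep learning model.
model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
model.fit(X_train, y_train)

# Predictive strength on a binary classification measure (here: ROC AUC).
print("ROC AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Local explanation 1: Shapley values for one instance, via the model-agnostic
# KernelExplainer with a small background sample for tractability.
background = X_train[:100]
explainer = shap.KernelExplainer(lambda d: model.predict_proba(d)[:, 1], background)
shap_values = explainer.shap_values(X_test[:1])
print("Shapley values for one instance:", shap_values)

# Local explanation 2: ICE plot for one feature, showing how the predicted
# probability of each individual instance changes as that feature varies.
PartialDependenceDisplay.from_estimator(model, X_test, features=[0], kind="individual")
plt.show()
```

In this sketch the Shapley values attribute the single prediction to each input feature, while the ICE curves reveal instance-level heterogeneity that an averaged partial dependence plot would hide, mirroring the complementary perspectives the paper aims to give domain experts.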
