Academic Paper

Challenges and Opportunities of Shapley values in a Clinical Context
Document Type
Working Paper
Source
ICML 2022 Workshop on Interpretable Machine Learning in Healthcare
Subject
Statistics - Methodology
Language
English
Abstract
With the adoption of machine learning-based solutions in routine clinical practice, the need for reliable interpretability tools has become pressing. Shapley values provide local explanations of model predictions and have gained popularity in recent years. Here, we reveal current misconceptions about the "true to the data" versus "true to the model" trade-off and demonstrate its importance in a clinical context. We show that the interpretation of Shapley values, which depends strongly on the choice of a reference distribution for modeling feature removal, is often misunderstood. We further argue that, for applications in medicine, the reference distribution should be tailored to the underlying clinical question. Finally, we advise on appropriate reference distributions for specific medical use cases.
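To make the reference-distribution trade-off concrete, the minimal sketch below contrasts the two choices using the open-source shap Python library. The synthetic patient data, the RandomForestClassifier surrogate model, and the particular feature_perturbation settings are illustrative assumptions for this sketch, not the setup used by the paper's authors.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

# Hypothetical clinical cohort: 200 patients, 5 numeric features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# "True to the model": interventional perturbation uses the background
# dataset X as the reference distribution, breaking feature correlations
# and crediting only features the model actually relies on.
explainer_interventional = shap.TreeExplainer(
    model, data=X, feature_perturbation="interventional"
)

# "True to the data": the tree-path-dependent approach approximates
# conditional expectations, respecting observed feature dependencies and
# potentially spreading credit across correlated features.
explainer_observational = shap.TreeExplainer(
    model, feature_perturbation="tree_path_dependent"
)

patient = X[:1]  # explain a single (synthetic) patient
print(explainer_interventional.shap_values(patient))
print(explainer_observational.shap_values(patient))
```

Because the interventional explainer takes the supplied background data as its reference distribution while the tree-path-dependent explainer conditions on observed feature dependencies, the two attributions can disagree for correlated clinical features, which is precisely why the choice should follow the underlying clinical question.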