De-Biasing the Bias: Methods for Improving Disparity Assessments with Noisy Group Measurements
Document Type
Working Paper
Subject
Statistics - Methodology
Abstract
Health care decisions are increasingly informed by clinical decision support algorithms, but these algorithms may perpetuate or increase racial and ethnic disparities in access to and quality of health care. Further complicating the problem, clinical data often have missing or poor-quality race and ethnicity information, which can lead to misleading assessments of algorithmic bias. We present novel statistical methods that allow probabilities of racial/ethnic group membership to be used in assessments of algorithm performance, and we quantify the statistical bias that results from error in these imputed group probabilities. We propose a sensitivity analysis approach to estimating this statistical bias, which allows practitioners to assess disparities in algorithm performance under a range of assumed levels of group probability error. We also prove theoretical bounds on the statistical bias for a set of commonly used fairness metrics and describe real-world scenarios where our theoretical results are likely to apply. We present a case study using race and ethnicity imputed by the Bayesian Improved Surname Geocoding (BISG) algorithm to estimate disparities in a clinical decision support algorithm used to inform osteoporosis treatment. Our methods allow policymakers to understand the range of potential disparities under a given algorithm even when race and ethnicity information is missing, and to make informed decisions about the implementation of machine learning for clinical decision support.
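
To make the core idea concrete, below is a minimal sketch (not the paper's estimator) of how imputed group-membership probabilities, such as BISG posteriors, might be plugged into a groupwise fairness metric: each individual's contribution to a per-group false negative rate is weighted by their probability of belonging to that group. All function and variable names here are illustrative assumptions; as the abstract notes, when the group probabilities contain error, such plug-in estimates are statistically biased, which is the problem the paper's methods address.

```python
import numpy as np

def weighted_group_fnr(y_true, y_pred, group_probs):
    """Estimate per-group false negative rates using soft group
    membership probabilities (e.g., BISG posteriors) as weights.

    y_true, y_pred : binary arrays of true outcomes and predictions
    group_probs    : (n, k) array; row i gives P(group j | covariates of i)

    Returns a length-k array of probability-weighted FNR estimates.
    Illustrative sketch only: with error in group_probs these
    plug-in estimates are statistically biased.
    """
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    w = np.asarray(group_probs)
    pos = y_true == 1                 # condition on true positives
    fn = pos & (y_pred == 0)          # false negatives
    fn_mass = w[fn].sum(axis=0)       # weighted false negatives per group
    pos_mass = w[pos].sum(axis=0)     # weighted positives per group
    return fn_mass / pos_mass

# Example: five individuals, two groups, soft membership probabilities
y_true = np.array([1, 1, 1, 0, 1])
y_pred = np.array([0, 1, 1, 0, 0])
probs = np.array([[0.9, 0.1],
                  [0.2, 0.8],
                  [0.7, 0.3],
                  [0.5, 0.5],
                  [0.1, 0.9]])
print(weighted_group_fnr(y_true, y_pred, probs))  # per-group FNR estimates
```

A sensitivity analysis in the spirit of the abstract would then recompute such metrics while perturbing group_probs under a range of assumed error levels, reporting the resulting range of disparity estimates rather than a single point value.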