Academic paper

Investigating the Legality of Bias Mitigation Methods in the United Kingdom
Document Type
Periodical
Source
IEEE Technology and Society Magazine, 42(4):87-94, Dec. 2023
Subject
General Topics for Engineers
Engineering Profession
Employment
Decision making
Algorithm design and analysis
Face recognition
Artificial intelligence
Law
Informatics
Machine learning
Ethics
Human factors
Classification algorithms
Physiology
Behavioral sciences
Psychology
Customer profiles
Language
ISSN
0278-0097
1937-416X
Abstract
Fairness issues in Algorithmic Decision-Making Systems (ADMS) have been well documented over the past decade [1], including facial recognition systems that struggle to identify people of color [2]. In 2021, Uber drivers filed a claim with the U.K.'s employment tribunal for unfair dismissal resulting from Microsoft's automated facial recognition technology [3]. Bias mitigation methods have been developed to reduce discrimination from ADMS; these typically operationalize fairness notions as fairness metrics in order to minimize discrimination [4]. We refer to ADMS to which bias mitigation methods have been applied as "mitigated ADMS" or, in the singular, a "mitigated system."