TY - JOUR
T1 - FAT Forensics
T2 - A Python toolbox for algorithmic fairness, accountability and transparency
AU - Sokol, Kacper
AU - Santos-Rodriguez, Raul
AU - Flach, Peter
PY - 2022/9/2
Y1 - 2022/9/2
N2 - Today, artificial intelligence systems driven by machine learning algorithms can be in a position to take important, and sometimes legally binding, decisions about our everyday lives. In many cases, however, these systems and their actions are neither regulated nor certified. To help counter the potential harm that such algorithms can cause, we developed an open source toolbox that can analyse selected fairness, accountability and transparency aspects of the machine learning process: data (and their features), models and predictions. It allows these aspects to be reported to relevant stakeholders automatically and objectively. In this paper we describe the design, scope, usage and impact of this Python package, which is published under the 3-Clause BSD open source licence.
AB - Today, artificial intelligence systems driven by machine learning algorithms can be in a position to take important, and sometimes legally binding, decisions about our everyday lives. In many cases, however, these systems and their actions are neither regulated nor certified. To help counter the potential harm that such algorithms can cause, we developed an open source toolbox that can analyse selected fairness, accountability and transparency aspects of the machine learning process: data (and their features), models and predictions. It allows these aspects to be reported to relevant stakeholders automatically and objectively. In this paper we describe the design, scope, usage and impact of this Python package, which is published under the 3-Clause BSD open source licence.
U2 - 10.1016/j.simpa.2022.100406
DO - 10.1016/j.simpa.2022.100406
M3 - Article (Academic Journal)
SN - 2665-9638
VL - 14
JO - Software Impacts
JF - Software Impacts
M1 - 100406
ER -