Less Discriminatory Algorithms

  • Emily Black,
  • John Logan Koepke,
  • Pauline Kim,
  • Mingwei Hsu

FAccT 2024


Entities that use algorithmic systems in traditional civil rights domains like housing, employment, and credit should have a duty to search for and implement less discriminatory algorithms (LDAs). Why? Work in computer science has established that, contrary to conventional wisdom, for a given prediction problem there are almost always multiple possible models with equivalent performance—a phenomenon termed model multiplicity. Critically for our purposes, different models of equivalent performance can produce different predictions for the same individual and, in aggregate, exhibit different levels of impact across demographic groups. As a result, when an algorithmic system displays a disparate impact, model multiplicity suggests that developers may be able to discover an alternative model that performs equally well but has less discriminatory impact. Indeed, the promise of model multiplicity is that an equally accurate but less discriminatory alternative algorithm almost always exists. But without dedicated exploration, it is unlikely developers will discover potential LDAs.
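The core claim—that two models can match on accuracy yet differ in their demographic impact—can be made concrete with a minimal sketch. The data below is hypothetical toy data invented for illustration (it is not from the paper), and "selection-rate gap" is used here as one common disparate-impact measure; the paper's own analysis is more general.

```python
# Toy illustration of model multiplicity: two candidate models with identical
# accuracy can nonetheless produce different selection rates across groups.
# All values here are fabricated for illustration.
group = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]   # demographic-group membership
y     = [1, 1, 0, 0, 0, 1, 1, 0, 0, 0]   # true outcomes

pred_a = [1, 1, 1, 1, 0, 1, 1, 0, 0, 0]  # predictions from candidate model A
pred_b = [0, 1, 0, 0, 0, 0, 1, 0, 0, 0]  # predictions from candidate model B

def accuracy(pred):
    # fraction of predictions that match the true outcome
    return sum(p == t for p, t in zip(pred, y)) / len(y)

def selection_rate_gap(pred):
    # absolute difference in positive-prediction rates between the two groups
    def rate(g):
        members = [p for p, gi in zip(pred, group) if gi == g]
        return sum(members) / len(members)
    return abs(rate(0) - rate(1))

for name, pred in [("A", pred_a), ("B", pred_b)]:
    print(f"model {name}: accuracy={accuracy(pred):.1f}, "
          f"gap={selection_rate_gap(pred):.1f}")
# model A: accuracy=0.8, gap=0.4
# model B: accuracy=0.8, gap=0.0
```

Both models score 0.8 accuracy, but model A selects members of one group at twice the rate of the other while model B's selection rates are equal—so B is the less discriminatory alternative. The point of the LDA duty is that a developer who trains only model A and stops will never learn that B exists.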