Why Organization Matters in “Algorithmic Discrimination”
Research into “algorithmic discrimination” has largely neglected the fact that algorithms are often developed and used by organizations. In this article, we show that organizational sociology can contribute to a more nuanced perspective on “algorithmic decision-making.” Drawing on the concept of decision premises, we differentiate between various formal structures, particularly between different decision programs (conditional and purposive). This allows us to challenge two key assumptions: that human decision-makers rely heavily on algorithmically generated recommendations, and that discrimination against protected groups needs to be addressed mainly at the level of code. We demonstrate the usefulness of distinguishing between conditional and purposive decision programs through a case study from the legal context: the risk assessment software “Correctional Offender Management Profiling for Alternative Sanctions” (COMPAS), which is employed in the US criminal justice system to inform judicial personnel about the recidivism risk of defendants. By analyzing the organizational structures through which the COMPAS score is formally and informally embedded in courts, we show that the score represents an ambiguous and redundant information source for judges. The backstage practice of minimizing the score's relevance and decoupling it from legal reasoning reflects, in particular, the professional decision-making autonomy of judges, which is inherent in the legal system. The core finding of our approach is that strategies to reduce discrimination should not only scrutinize data quality or the statistical model but also consider the specific forms, functions, and consequences of the organizational structures that condition the ways in which discriminatory differences may or may not be (re)produced.