Algorithm-based sentencing and discrimination

Research output: Contribution to book/anthology/report/proceeding › Book chapter › Research › peer-review

Abstract

US courts are increasingly using algorithm-based recidivism risk prediction instruments in estimating offenders’ dangerousness and, thus, the warranted severity of the punishment. Some argue that this practice mitigates well-known biases in non-algorithm-based recidivism risk assessments. Whether this is so or not, in the present US context, algorithm-based sentencing is quite likely to be unfairly discriminatory. This claim might have radical implications regarding the US penal system in general (as well as that of all other countries); to wit, that it too is unfairly discriminatory against groups with high base crime rates. If so, we face a dilemma forcing us to revise some of our beliefs about penal justice.

Original language: English
Title of host publication: Sentencing and Artificial Intelligence
Number of pages: 23
Place of publication: New York
Publisher: Oxford University Press
Publication date: Feb 2022
Pages: 74-96
Chapter: 5
ISBN (Print): 9780197539538
ISBN (Electronic): 9780197539538
DOIs
Publication status: Published - Feb 2022

Keywords

  • Algorithm-based sentencing
  • Bias
  • Discrimination
  • Indirect discrimination
  • Recidivism
  • Statistical discrimination

