Case Study 4 (Algorithms Part 2)


In discussing the social, political, and economic ramifications of algorithms, we’ve seen the vast number of ways in which they affect our lives and the lives of those around us. However, what happens when an algorithm is implicated in serious, widespread inequality?

This case study will pit your table against COMPAS, an algorithm widely used to predict recidivism among defendants. Judges consult its scores when setting bail and even when deciding how severe a sentence to give. Your job will be to view 25 profiles of sex offenders from Mecklenburg County and to predict whether each will reoffend. Everyone who has a laptop should bring it to class.

CAUTION/TRIGGER WARNING: This case study involves viewing mugshots of convicted sex offenders as well as lists of the crimes they have been convicted of. Please take whatever steps you feel are necessary to protect your emotional well-being.

  1. Introduce the case study. (5 minutes)
  2. Each table will work together, so there is no need to move around. Together, the members of a table will view a presentation consisting of the profiles of 25 sex offenders, spending a maximum of 45 seconds per offender. For each profile, you must predict whether the person will commit another sex crime in the future. Once the 45 seconds are up, the prediction cannot be changed. Spreadsheets will be provided for recording predictions. (20 minutes)
  3. Calculate your table’s accuracy using the key. Then scan the introduction and results sections of this study (but not before class!). Were you able to beat COMPAS? How did your table compare to the random participants in the study? Discuss with your table how COMPAS could be improved from the perspective of your table’s assigned role. (15 minutes)
  4. Share your thoughts with the class. Let’s have a discussion. (Remaining class time)
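If your table would rather script the accuracy calculation in step 3 than use the spreadsheet, it is just the fraction of predictions that match the key. A minimal sketch (the prediction and key values below are hypothetical placeholders, not real data from the study):

```python
# Accuracy = correct predictions / total predictions.
# Hypothetical data: True means "will reoffend" (predictions)
# or "did reoffend" (key). In class you would have 25 entries.
predictions = [True, False, True, True, False]
key = [True, True, True, False, False]

# Count positions where the prediction matches the key.
correct = sum(p == k for p, k in zip(predictions, key))
accuracy = correct / len(key)
print(f"Accuracy: {accuracy:.0%}")  # 3 of 5 correct -> 60%
```

The same comparison also lets you break results down further (e.g., how often the table predicted reoffense when the key said otherwise), which may be useful in the discussion.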