Risk Assessments: Fair or Racist?
Date: 05-25-2016

One software program scores African American defendants as higher risks of committing future crimes than white defendants
From ProPublica

Courtrooms across the U.S. use software to make decisions about defendants’ freedom. These programs are meant to remove human bias from decisions on bail, parole and even sentencing. But do they? We studied one of the more widely used programs and found that it is biased against black defendants. This is the first story in Machine Bias, a series on the hidden impact algorithms have on our lives.

Read the full article at ProPublica.