In 2016, ProPublica caused a stir when it evaluated the performance of software used in criminal justice proceedings. The software, which estimates a defendant's chance of committing further crimes, turned out to produce different results when evaluating Black people and Caucasians.
The significance of that discrepancy is still the subject of some debate, but two Dartmouth College researchers have asked a more fundamental question: is the software any good? The answer they came up with is "not especially," since its performance could be matched either by recruiting people on Mechanical Turk or by performing a simple analysis that took only two factors into account.
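To give a sense of how small a "two-factor" model is, here is a minimal sketch of a two-feature logistic regression trained by gradient descent. The specific features (defendant age and number of prior offenses), the synthetic data, and the feature scaling are all assumptions for illustration, not the researchers' actual dataset or methodology.

```python
import math
import random

def sigmoid(z):
    """Logistic function mapping a real score to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(X, y, lr=1.0, epochs=2000):
    """Batch gradient descent for a two-feature logistic regression."""
    w, b, n = [0.0, 0.0], 0.0, len(X)
    for _ in range(epochs):
        gw, gb = [0.0, 0.0], 0.0
        for (x1, x2), label in zip(X, y):
            err = sigmoid(w[0] * x1 + w[1] * x2 + b) - label
            gw[0] += err * x1
            gw[1] += err * x2
            gb += err
        w[0] -= lr * gw[0] / n
        w[1] -= lr * gw[1] / n
        b -= lr * gb / n
    return w, b

def predict(w, b, x):
    """Classify as 'likely to reoffend' if predicted probability >= 0.5."""
    return sigmoid(w[0] * x[0] + w[1] * x[1] + b) >= 0.5

# Synthetic data (hypothetical): risk rises with prior offenses and
# falls with age, plus Gaussian noise. Features are scaled to [0, 1].
random.seed(0)
X, y = [], []
for _ in range(400):
    age = random.uniform(18, 70)
    priors = random.randint(0, 10)
    risk = priors / 10.0 - (age - 18) / 52.0
    y.append(1 if risk + random.gauss(0, 0.15) > 0 else 0)
    X.append(((age - 18) / 52.0, priors / 10.0))

w, b = train_logreg(X, y)
acc = sum(predict(w, b, x) == t for x, t in zip(X, y)) / len(y)
```

On data generated this way, the fitted model recovers the expected pattern (a positive weight on priors, a negative weight on age), which is the point of the sketch: a linear model this simple has very little room to encode anything subtle.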
Software and bias
The software in question is called COMPAS, for Correctional Offender Management Profiling for Alternative Sanctions. It takes into account a wide variety of factors about defendants and uses them to evaluate whether those individuals are likely to commit additional crimes, as well as to identify intervention options. COMPAS is heavily integrated into the judicial process (see this document from the California Department of Corrections for a sense of its importance). Perhaps most significantly, however, it is sometimes influential in determining sentencing, which can be based on …
Source: Ars Technica