Automated Decision-Making Systems in Pretrial Release Decisions

Published: Friday, October 1, 2021
**Sign in PRO on SSB 5116 (the regulation of automated decision-making systems) before Tuesday, January 25, at 3:00 PM.**

This post is part of a series covering how government agencies in Washington are using automated decision-making systems that affect people’s lives in potentially harmful ways.  
One of the most harmful ways that an automated decision-making system can affect someone’s life is when it is used to keep them in jail instead of setting them free. When a defendant is detained in jail while awaiting trial, they risk losing their job, their home, custody of their children, and, if they are detained during a pandemic, even their health or their life. These harsh consequences often lead defendants to accept unjust or unjustifiable plea agreements simply to be released, and the resulting criminal record can seriously diminish their future employment, housing, and other opportunities. The cascading harmful effects of pretrial detention can derail people’s lives forever, even if they are never convicted of a crime. 
 
As the Supreme Court has stated, “In our society, liberty is the norm, and detention prior to trial or without trial is the carefully limited exception.” However, according to the Bureau of Justice Statistics, between 1990 and 2004, 38% of state court felony defendants in the 75 largest counties in the U.S. were detained throughout the course of their case. Studies have found that defendants who were detained before trial were more than three times more likely to be sentenced to prison than defendants who were released at some point prior to trial.   
 
Although detention is supposed to be a “carefully limited exception,” some Washington courts are turning to automated decision-making systems to help them make pretrial detention decisions. According to Washington’s Pretrial Reform Task Force, courts in Spokane, Yakima, Whatcom, Thurston, Clark, and Pierce counties have considered using these “pretrial risk assessment tools,” as have municipal courts in Seattle, Federal Way, and Blaine.  
 
Most of these courts use or have used third-party pretrial risk assessments, such as the Arnold Foundation’s Public Safety Assessment. Like most risk assessment tools, this tool relies on factors such as prior convictions and prior incarceration to assess how “likely” someone is to fail to appear for court or be charged with a new crime before their next court appearance. However, because our criminal legal system is deeply racist, these factors reflect existing racial bias in policing, arrests, prosecutions, convictions, and incarceration. The result is an endless cycle of racial discrimination, in which Black and Brown communities are disproportionately policed, prosecuted, and incarcerated, and in which courts base further decisions on the criminal legal system’s inherent discrimination. 
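To make concrete how criminal-history factors drive these scores, here is a minimal, hypothetical sketch of a points-based risk assessment. It is not the actual Public Safety Assessment algorithm; the factors, weights, and threshold below are invented purely to illustrate how counts of past policing and incarceration translate directly into a “risk” number and a release recommendation.

```python
# Hypothetical illustration of a points-based pretrial risk score.
# NOT the actual Public Safety Assessment: the factors, weights, and
# threshold are invented to show how criminal-history inputs drive the score.

from dataclasses import dataclass


@dataclass
class Defendant:
    prior_convictions: int        # count of prior convictions
    prior_incarcerations: int     # count of prior jail/prison sentences
    prior_failures_to_appear: int # count of prior missed court dates

# Hypothetical weights: each factor adds points toward a "risk" total.
WEIGHTS = {
    "prior_convictions": 2,
    "prior_incarcerations": 3,
    "prior_failures_to_appear": 4,
}


def risk_score(d: Defendant) -> int:
    """Sum weighted counts of prior contacts with the criminal legal system."""
    return (
        WEIGHTS["prior_convictions"] * d.prior_convictions
        + WEIGHTS["prior_incarcerations"] * d.prior_incarcerations
        + WEIGHTS["prior_failures_to_appear"] * d.prior_failures_to_appear
    )


def release_recommendation(d: Defendant, threshold: int = 6) -> str:
    """Translate the score into the kind of label a judge might see."""
    return "detain / high risk" if risk_score(d) >= threshold else "release / low risk"


# Because the inputs are counts of past policing, prosecution, and
# incarceration, people from more heavily policed communities accumulate
# higher counts -- and therefore higher scores -- for the same conduct.
print(release_recommendation(Defendant(1, 0, 0)))  # "release / low risk"
print(release_recommendation(Defendant(3, 1, 0)))  # "detain / high risk"
```

Because the arithmetic itself is trivial, whatever appearance of objectivity a tool like this has comes entirely from its inputs, and those inputs are counts produced by the very system described above as racially biased.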
 
Washington courts are supposed to carefully consider many personal factors before deciding to keep a person in jail. In addition to considering the defendant’s criminal history, they are supposed to consider the defendant’s volunteer work in the community, participation in cultural activities, family ties, relationships, and reputation. They are also supposed to consider the willingness of other community members to vouch for the defendant’s reliability and help the defendant comply with release conditions. However, as Washington’s Pretrial Reform Task Force put it, “In reality, judges often have access to very little information to inform their pretrial release or detention decisions.”  
 
Given that courts often lack the individualized information they need to make subjective detention decisions, it is easy to understand why seemingly “objective” automated decision-making tools may be appealing. However, like so many other facets of the criminal legal system, these tools provide only an appearance of scientific objectivity and, in reality, further reinforce and exacerbate racial bias. Despite the fact that automated decision systems are often just as biased as humans, researchers have found that people tend to trust a computer’s answers over human reasoning. This phenomenon, called “automation bias,” means that judges may defer to a computer’s suggestion, turning an automated recommendation into a trusted final decision. For these reasons, the ACLU and over 100 other organizations oppose the use of all pretrial risk assessment tools. 
 
In Washington, potentially harmful automated decision-making systems are being used not just for pretrial detention decisions but also for policing, distribution of medical benefits, housing decisions, and determining whether Medicaid claims are fraudulent. In future blog posts, we will explore how these systems are being used in Washington and how the ACLU of Washington suggests bringing more transparency and accountability to their use.