Published: Tuesday, September 20, 2022
By Jennifer Lee, Technology and Liberty Manager, ACLU-WA, and Amy Hirotaka, Executive Director, Washington Association of Criminal Defense Lawyers
This blog post is part of a broader series that explores the use of Automated Decision Systems by government agencies in Washington state.
Juries are central to the American judicial system, and a representative jury is necessary for a robust democracy and to ensure everyone has access to a fair trial. Racial disparities in jury pools exacerbate longstanding racial disparities within our criminal legal system. A Duke University study of Florida juries and trial outcomes found that juries selected from all-white jury pools convicted Black defendants 16% more often than white defendants, and that this disparity in conviction rates was entirely eliminated when the jury pool included just one Black member. Research also suggests that diverse juries are better decision makers. But the reality is that much of the jury selection process is rife with racial discrimination. Peremptory challenges have been used to exclude potential jurors of color, frequently producing all-white juries, and low juror pay disproportionately prevents potential jurors of color, who on average earn lower wages, from being able to serve.
The use of automated decision systems (ADS) in the jury selection process can amplify existing inequities and racial disparities while simultaneously hiding them. According to Bill Howe, Associate Professor at the University of Washington Information School, the use of automated decision systems can “amplify, operationalize and legitimize discrimination and opacity because people are less likely to challenge or question an ADS than they are a human discretionary process—i.e. ‘If the computer said so, it must be right.’” This tendency to trust computers over human decision-makers, even when those computers operate inaccurately, is called automation bias, and it can further legitimize existing problems.
In Washington state, several courts use automated decision systems to form jury pools, including applications that corporations control and keep secret. The City of Federal Way uses in-house software called Jury Master. Snohomish County bought a jury selection program called Jury+ from a private vendor, Jury Systems Inc.; the program uses a state-approved algorithm to produce random juror selections. Washington law does not require transparency or accountability when government entities use automated decision systems. Unfortunately, experience in other states shows that using automated decision systems to create jury pools can deepen racial disparities, both through random errors and through intentional programming decisions.
In Connecticut, for example, errors in the state’s automated jury system resulted in a three-year period during which not a single individual from the two largest cities, Hartford and New Britain, was summoned for jury duty. The computer’s error regarding Hartford was egregious: the system treated the final “d” of “Hartford” in a person’s address as a code indicating the person was dead and therefore ineligible to serve. Those two missing cities alone accounted for 63% of the eligible Black population and 68% of the eligible Latinx population in the jury district. In Tulsa, Oklahoma, a Black man was tried by an all-white jury, pulled from a pool of 200 all-white jurors, after the state-contracted jury selection company accidentally excluded the zip codes where 90% of the district’s Black residents lived.
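Public reporting on the Connecticut failure does not spell out the state system’s exact file format, but bugs of this kind typically arise from fixed-width record layouts. Below is a minimal sketch of how a city field just one character too narrow could spill the final “d” of “Hartford” into an adjacent status column; the field widths, status code, and record layout are illustrative assumptions, not the actual state format.

```python
# Hypothetical reconstruction of the Connecticut bug: a fixed-width
# record layout where the CITY field is one character too narrow, so
# the trailing "d" of "Hartford" spills into the STATUS column that a
# downstream filter reads as "d" = deceased. Widths and codes are
# assumptions for illustration only.

CITY_WIDTH = 7           # too narrow: "Hartford" is 8 characters
STATUS_COL = CITY_WIDTH  # status flag read from the next column

def parse_record(line: str) -> dict:
    """Split a fixed-width juror record into city and status fields."""
    return {
        "city": line[:CITY_WIDTH].strip(),
        "status": line[STATUS_COL:STATUS_COL + 1],
    }

def is_eligible(record: dict) -> bool:
    """Exclude records whose status flag marks the person deceased."""
    return record["status"].lower() != "d"

# "Hartford" overflows the city field; its final "d" is read as the
# deceased flag, so every Hartford resident is silently excluded.
for line in ["Hartford", "Bristol"]:
    rec = parse_record(line)
    print(rec, "eligible:", is_eligible(rec))
# {'city': 'Hartfor', 'status': 'd'} eligible: False
# {'city': 'Bristol', 'status': ''} eligible: True
```

The failure is silent: no record is malformed, no exception is raised, and the output looks like an ordinary eligible-juror list, which is exactly why it persisted for three years.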
In Allen County, Indiana, the state-contracted jury selection company used a jury pool system that pulled names of eligible jurors alphabetically by township and stopped when it reached 10,000 names. However, as discovered in Azania v. State of Indiana, 75% of the county’s Black residents lived in Wayne Township, which falls at the end of the alphabet, so their chances of being included in a jury pool were reduced by 50%. As a result, the system excluded 87% of Wayne Township voters from jury service. The Indiana Supreme Court held that this error violated Indiana’s requirement of an “impartial and random selection” process. Unlike in Connecticut and Oklahoma, this was not a random error but an intentional programming decision that deprived people of color of their right to serve on a jury. A sketch contrasting this design with a genuinely random draw follows below.
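To see why the Allen County design was structurally biased rather than merely unlucky, compare an alphabetical cutoff against a uniform random draw. The roster sizes and township list in this sketch are made-up assumptions for illustration, not the county’s actual data.

```python
# Minimal sketch of the Allen County flaw, with invented roster sizes:
# pulling names alphabetically by township and stopping at a fixed cap
# cuts off townships late in the alphabet (like Wayne), while a uniform
# random draw gives every name the same chance of selection.
import random

CAP = 10_000

# Hypothetical roster: township -> list of registered names.
townships = {
    "Aboite": [f"A{i}" for i in range(6_000)],
    "Perry":  [f"P{i}" for i in range(5_000)],
    "Wayne":  [f"W{i}" for i in range(4_000)],  # last alphabetically
}

def alphabetical_cutoff(rosters: dict) -> list:
    """Flawed method: walk townships A to Z and stop at the cap."""
    pool = []
    for township in sorted(rosters):
        for name in rosters[township]:
            if len(pool) == CAP:
                return pool
            pool.append(name)
    return pool

def uniform_draw(rosters: dict) -> list:
    """Proper method: sample the cap uniformly from all names."""
    everyone = [n for names in rosters.values() for n in names]
    return random.sample(everyone, CAP)

for method in (alphabetical_cutoff, uniform_draw):
    pool = method(townships)
    wayne = sum(1 for n in pool if n.startswith("W"))
    print(f"{method.__name__}: Wayne share = {wayne / CAP:.0%}")
# alphabetical_cutoff: Wayne share = 0%   (the cap is hit before Wayne)
# uniform_draw: Wayne share ~ 27%         (Wayne's true share of names)
```

Under the cutoff, Wayne residents’ inclusion rate does not depend on chance at all; it depends on where their township name sorts, which is precisely the non-random defect the Indiana Supreme Court identified.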
It is difficult to know how the automated jury selection tools used by Washington jurisdictions work, or what their impact is, because no law currently requires the transparency or disclosure that would daylight any discriminatory or disparate impact these tools may have on our communities of color.
Any risk that a jury selection application may add another layer of discriminatory exclusion of potential jurors of color should compel public disclosure of how the application works. Disclosure is necessary to ensure that the jury pool selection process is not unknowingly excluding potential jurors of color, and that, if there is evidence of discriminatory impact or application, the jury selection tools can be abandoned, altered, or fixed to eliminate the harm.
In future blog posts, we will continue to explore issues of transparency and accountability with ADS as they relate to other facets of our lives.