We Need a Face Surveillance Moratorium, Not Weak Regulations: Concerns about SB 6280

Published: Tuesday, March 31, 2020

For the past year and a half, the ACLU and over a hundred organizations across the U.S. and around the world have been calling for moratoria and bans on face surveillance. Many cities in the U.S. have already banned this technology, and a moratorium is being considered at the federal level.  

Last year, we supported a face surveillance moratorium bill (HB 1654) sponsored by Representative Ryu. This year, we supported both HB 1654 and another moratorium bill (HB 2856) sponsored by Representative Entenman that would have halted use of this racially biased technology while impacted community members decided if, and not just how, face surveillance technology should be used. Alternative regulations supported by big tech companies and opposed by impacted communities do not provide adequate protections—in fact, they threaten to legitimize the infrastructural expansion of powerful face surveillance technology.

This is why we strongly opposed SB 6280, which purports to put safeguards around the use of facial recognition technology but does just the opposite. The first iteration of this bill, strongly supported by Microsoft, wasn’t just weak—it actually undermined existing state constitutional privacy protections.

Other iterations of the bill incorporated improvements advocated by lawmakers in the House, but many key issues remained unaddressed. For example, all versions of SB 6280 have included language allowing agencies to use face surveillance technology to deny people essential services and basic necessities such as housing, health care, food, and water. 

While the bill that passed out of the legislature created a task force with seats for community representatives, even this minimal provision was vetoed at bill signing, removing any semblance of community oversight.

During the interim, lawmakers must turn immediately to the work of addressing the bill’s many problems:

(1) SB 6280 does not contain a moratorium.

We continue to call for a temporary ban on face surveillance so that impacted communities—not tech vendors and law enforcement—can decide if and how this technology should be used. Not only do face surveillance algorithms systematically misidentify women, gender minorities, Black and Indigenous individuals, and other people of color, but academics have also called even perfectly accurate face surveillance “the perfect tool for oppression” and “the most uniquely dangerous surveillance mechanism ever invented.” Surveillance tools have always had disproportionate impacts on historically marginalized communities, and face surveillance gives the government unprecedented power to police the movements and behaviors of entire populations.

(2) There are no meaningful accountability or enforcement measures in the bill.

SB 6280 requires agencies currently using or intending to use face surveillance to complete “accountability reports,” but these reports do not provide any actual accountability. Face surveillance can be used while the report is being written, and agencies are not required to have their reports approved by any regulatory body. Furthermore, there is no enforcement mechanism: if agencies choose not to follow the provisions in the bill, nothing will happen to them.

(3) SB 6280 allows agencies to use facial recognition to deny people essential services such as access to housing, health care, food, and water.

This bill states that agencies can use face surveillance to decide whether to approve or deny people access to financial and lending services, housing, insurance, education, criminal justice, employment, health care, and basic necessities as long as those decisions undergo loosely defined “meaningful human review.” This is not a sufficient safeguard. The bill vaguely defines meaningful human review as “review or oversight by one or more individuals…who have the authority to alter the decision under review.” Unfortunately, humans are not free from bias and are therefore flawed checks on face surveillance systems. Additionally, people routinely succumb to automation bias, deferring to output from computer decision-support systems rather than using their own judgment. Even when well defined, “meaningful human review” is a deeply flawed concept and should not be used to justify the use of face surveillance technology to make critical decisions.

(4) SB 6280 regulates only three uses of face surveillance while many other uses such as enrollment, recognition, non-real-time identification, and verification are freely allowed.

The bill only requires a warrant or court order for “ongoing surveillance,” “persistent tracking,” and “real-time or near real-time identification.” This means that agencies may use face surveillance without any restrictions to surveil entire crowds at football stadiums, at places of worship, or on public street corners, chilling people’s constitutionally protected rights and civil liberties.

We continue to call for a face surveillance moratorium—not weak regulations. During the interim and beyond, the ACLU of Washington will continue to work with directly impacted stakeholders, advocates, and lawmakers to push for a moratorium and community-centric tech policies.