The Lefty Lawyer: AB 642 Would Increase Police Use of Facial Recognition Technology
The Lefty Lawyer returns to talk more about facial recognition technology—and the dangers of Assembly Bill 642, which purports to regulate its use by police, but actually greenlights it. AB 642 would also allow cops to police their own use of this invasive tech—a practice that has often led to the abuse of surveillance equipment.
When I was a fellow teaching at UCI Law in their Immigrant Rights Clinic, my students and I helped draft the complaint against Clearview AI, a facial recognition company employed by many law enforcement agencies throughout the country (including LBPD, which took advantage of several free trials of Clearview’s technology).
A complaint, the legal document that initiates a lawsuit, is an opportunity to introduce an issue to the public as well as to the court. More than most other stages of a lawsuit, the complaint allows you to tell a story. You introduce your cast of characters by naming the parties; you provide the story’s setting, in the form of important background context; you lay out the central conflict, or how the other side has violated the law; and you tell the judge how you believe the story should be resolved—you ask for a remedy.
Drafting the complaint gave us ample opportunity to dig in and develop our arguments about what was wrong with facial recognition technology (FRT) in general and Clearview AI’s practices around it in particular.
My students, like many advocates, repeatedly homed in on FRT’s unreliability. As well they might, since FRT is famously inaccurate and in particular misidentifies Black people, other people of color, and women more frequently than it does white men. Which is to say, FRT is pretty good at recognizing people who look like its creators and hit-or-miss with everyone else. This perhaps unsurprising bias has led to multiple false arrests, including one that made the news earlier this year.
However, I was not comfortable with relying too heavily on these arguments, useful though they were for illustrating some of the racism inherent in the technology. I asked my students: “Would we be ok with FRT if it were accurate 100% of the time?”
Steeped in knowledge of the frightening power FRT gives its users, they shuddered. No, clearly not. If anything, that would be worse, giving extremely fallible officers of a structurally unjust agency an infallible tool to identify and track people. We needed to make clear why FRT threatens our privacy, our rights, and our safety even when it works.
The importance of not leaning too heavily on accuracy arguments is now hitting home: improvements to FRT’s ability to correctly identify people reportedly influenced Assemblymember Phil Ting (D-San Francisco) to introduce AB 642, which grants police officers in California broad authority to use it. Indeed, the bill only allows police to use programs that the National Institute of Standards and Technology’s Face Recognition Vendor Test program has determined are at least 98% accurate. But while it is couched in the language of limiting the circumstances in which officers can use FRT, the bill actually (as one of my professors used to say about the Constitution) has more holes than Swiss cheese.
The very first circumstance in which an officer may use FRT under the law is when they “have reasonable suspicion to believe the person has committed a felony.” In a future column, I’ll discuss the bar-so-low-only-an-inchworm-could-limbo-beneath-it standard that is “reasonable suspicion,” but suffice it to say, most officers can conjure up reasonable suspicion most of the time. Just take a look at the way the LBPD has in the past used baggy clothes, cycling in a high-crime area, or even just the simple fact that it was dark outside as reasons to stop and frisk bicycle riders.
So reasonable suspicion is what cops require to stop someone and ask them questions. An officer needs more reason—what’s known as probable cause—to arrest someone. If AB 642 passes, a cop who has only a “reasonable suspicion” of a person can also subject that person to invasive, privacy-destroying facial recognition technology.
But what about the felony part of the requirement—is that a guardrail? No. Most offenses in California have misdemeanor and felony versions and the only variable is the degree of severity; if an officer has reasonable suspicion of the misdemeanor level, they could usually claim reasonable suspicion of the felony level. For example, grand theft in California can be charged as either a misdemeanor or a felony, based largely on the prosecutor’s discretion. Unfun fact: it is grand theft to steal $250 or more worth of avocados—or approximately one batch of guacamole if you shop at Erewhon. The slope from suspecting someone of shoplifting avocados to reasonable suspicion of a felony is slippery indeed.
So the very first circumstance in which officers may use FRT is already broad enough arguably to encompass… whatever they want. But the bill provides more opportunities, including to identify “any person who has been lawfully arrested, during the process of booking or during that person’s custodial detention.” This practice already takes place in LA County, where police, including the LBPD, photograph every person they arrest and then use FRT to search their faces against a countywide database. But AB 642 enshrines it in state law.
[Artwork: Tomisin Oluwole, Face the Music, 2022. Acrylic on canvas, 24 x 36 inches.]
Does AB 642 place any useful limits on FRT usage? Yes. The sections requiring the deletion of certain images in particular would be an improvement on the status quo. But regulation of this kind is a sleight of hand. It claims to restrict police use of FRT, when it actually sets up a framework for them to use it legally and, in this iteration in particular, nearly unfettered. Regulating something is a form of permitting its use.
As Carmen-Nicole Cox, director of government affairs for ACLU California Action, which has vehemently opposed the bill from the beginning, put it, “AB 642 is not better than nothing, just as drinking bleach is not better than no COVID vaccine.” Last month, the Editorial Board of the OC Register, hardly a leftist publication, wrote an op-ed opposing AB 642 because it recognized that the bill would “expand the use” of facial recognition.
That raises the other unspoken assumption of regulation: you have to trust that the agency will follow the rules in order to believe that the rules will be effective. Beyond the bill’s own provisions, AB 642 also requires police agencies to establish their own regulations for using FRT. That’s a remarkable degree of trust, especially given the power of the technology. Please, cops, constrain your own authority and also please enforce the limits on your power that you yourselves wrote.
Here in Long Beach, where the City Council has so far failed to bring forth a surveillance technology ordinance like those adopted in many other major cities across the state, the police have been left to their own devices to regulate their use of FRT and other spy gear. The consequences: cops have shared data from automatic license plate readers with federal immigration authorities, likely in violation of state law, and did not properly document their wide use of FRT during the 2020 George Floyd uprising.
Which brings us to enforcement. Who will enforce AB 642 when law enforcement breaks the rules? Law enforcement themselves, of course! But only if the investigating agency “finds that the circumstances surrounding the violation raise serious questions about whether or not the officer acted intentionally with respect to the violation” shall the agency then “initiate a proceeding to determine whether disciplinary action against the officer is warranted.” Translation: If you break the law, we will consider whether to maybe possibly discipline you. Strong stuff!
AB 642 does allow people to bring civil actions (that is, sue a law enforcement agency for money or for a court order stopping them from a particular action, or both), but only those who were themselves subject to identification or attempted identification. That might be a difficult thing to know about oneself. How, after all, would you find out that the police had illegally applied FRT to your photograph?
According to a report on FRT from Georgetown Law’s Center for Privacy and Technology: “Law enforcement agencies have not considered themselves under any obligation to disclose details about a face recognition search—or the fact that one was conducted at all—to defendants because they have labeled face recognition internally as an investigative lead only.”
But in reality, the report says police often use FRT as the sole basis for arrests. This is the real problem with FRT: cops shouldn’t have FRT because we cannot trust them. They’ve proven that time and time again, here in Long Beach and across the country. We cannot trust them with the power to instantly identify and track us. We cannot trust them with the power to permanently destroy our privacy. We cannot trust them to follow their own rules and we cannot trust them to follow AB 642.
The legislators who considered AB 642 in the Public Safety Committee, and Assemblymember Ting himself, acknowledged problems with the bill but say they are concerned with a “Wild West” in the absence of regulation. That is, they too are scared of the technology and of police agencies’ unrestricted ability to use it. But AB 642 only gives the illusion of limiting that power. Their fear is on the money, but the answer isn’t to tell cops how to use FRT legally; it’s to take it away entirely.
AB 642 attracted such opposition that it is currently held under submission in committee, usually a sign that legislators want to work more on the bill before sending it to the floor. But the paradigm it espouses, in which police departments largely regulate themselves, is already the norm here in Long Beach and in many other cities in California. People are right to oppose AB 642 and that opposition should translate to policies that actually limit police power: facial recognition bans.
Two city commissions have already called for either a ban or moratorium on facial recognition, but so far city management and the City Council have not taken up these recommendations.
Who is the Lefty Lawyer? Caitlin Bellis is an attorney who works primarily at the intersection of criminal and immigration law and policy for a national nonprofit. Caitlin clerked for the U.S. Court of Appeals for the Ninth Circuit and is a graduate of Yale Law School and Reed College. She lives in Long Beach with her partner and their dog.