
Congress should pass ban on police using facial recognition technology


Earlier this year, for the first time (that we know of), a false match by a facial recognition algorithm led to the arrest of an innocent man.

Now, members of Congress are finally taking action. On Thursday, Sens. Ed Markey and Jeff Merkley, and Reps. Pramila Jayapal and Ayanna Pressley, all Democrats, introduced the Facial Recognition and Biometric Technology Moratorium Act of 2020. It’s the most aggressive move yet by Congress to limit police use of facial recognition: the bill bans federal law enforcement from using the technology and cuts off federal grants to state and local police who fail to do the same. 

That it was an innocent Black man who was falsely accused and arrested is not a surprise. A federal study published last year found that facial recognition technology misidentified Black and Asian faces 10 to 100 times more often than white faces. 

The only “evidence” against Robert Julian-Borchak Williams, according to the New York Times, was an algorithm used by the Michigan State Police that matched his driver’s license photo with blurry surveillance footage. Police were sent to arrest a confused Williams on his front lawn, in front of his two young daughters and wife. 

After spending $1,000 on bail and 30 hours in jail, Williams was released by the Detroit Police Department. When the cops realized their mistake, the Times reports, a police officer said: “I guess the computer got it wrong.” 

Yeah, the computer got it wrong. 

George Floyd. Michael Brown. Eric Garner. The list of Black people killed by police is so long. Contrast that with the many incentives for people — tech CEOs hungry for lucrative contracts, politicians screaming “law and order,” cops who want an easy fix — to push facial recognition technology. It’s a terrifying and deadly combination. 

Putting pressure on corporations alone won’t fix the problem. Amazon’s “moratorium” on selling its facial recognition tech to police departments is vague and only lasts a year. IBM said it won’t sell facial recognition tech to police, while Microsoft said it would institute a similar ban until federal laws regulating it were in place. 

And there are plenty of other players in the industry. DataWorks Plus, which built the software that led to Williams’ arrest and uses an algorithm cited in the federal bias study, says it “provides solutions” to “more than 1,000 agencies, both large and small.” It doesn’t have a public-facing consumer business to worry about. Neither does Clearview AI — yes, that Clearview AI, the creepy company that scraped billions of photos from social media networks without asking permission. Public outrage doesn’t matter to them. They never promised to not be evil.  

That’s why lawmakers need to take action. The Facial Recognition and Biometric Technology Moratorium Act of 2020 bans federal law enforcement from using facial recognition technology. It also prevents state and local law enforcement agencies from accepting federal grants if they use the technology. 

The Electronic Privacy Information Center (EPIC), a privacy and human rights non-profit, didn’t think past bills, including the recently proposed Justice in Policing Act, went far enough to prohibit use of facial recognition technology. Jeramie D. Scott, senior counsel at EPIC, said they were “too limited in their reach” or had “wide-ranging exceptions.” But it endorses the Facial Recognition and Biometric Technology Moratorium Act.

So does Fight for the Future. The digital rights non-profit said in a statement that the bill “effectively bans law enforcement use of facial recognition in the United States,” and that Congress should pass it “as soon as possible.” And the ACLU says the bill “should immediately pass.” 

Robert Julian-Borchak Williams survived his encounter with police. The next person falsely “matched” by facial recognition might not. 
