IBM pulls its facial recognition software from law enforcement use

Artificial intelligence powerhouse IBM has had it with law enforcement’s misuse of facial recognition technology.

In a letter to Congress this week, IBM CEO Arvind Krishna outlined the different ways the tech company intends to address racial injustice and police abuse. “IBM would like to work with Congress in pursuit of justice and racial equity, focused initially in three key policy areas: police reform, responsible use of technology, and broadening skills and educational opportunities,” Krishna wrote.

As part of its effort to responsibly use technology, Krishna explained how facial recognition tools are misused for mass surveillance, racial profiling, and human rights violations, and said the company would no longer offer “general purpose IBM facial recognition or analysis software.”

The letter also acknowledged the bias built into these tools and called for more testing and reporting on how they are used, and often abused. IBM's own practices have previously come under scrutiny for how its systems were trained to recognize race and gender. In March 2019, the company was caught scraping millions of Creative Commons-licensed Flickr photos, without the permission of the people photographed, as part of a diversity initiative meant to combat AI bias.

It only took mass nationwide protests after the police killing of George Floyd for IBM to finally learn its own lessons about creating responsible artificial intelligence tools and software.
