See what AI really thinks of you with this deeply humbling website

You are nothing more than a collection of deeply embarrassing and problematic machine learning-determined classifiers.

That humbling truth is brought home by ImageNet Roulette, an online tool that gives anyone bold or foolish enough to upload a photo the opportunity to learn exactly how artificial intelligence sees them. The project, described as “a provocation” by its creators, aims to shed light on how artificial intelligence systems view and classify humans.

And, surprise(!), AI has some pretty racist and misogynistic ideas about people. Or, rather, the dataset ImageNet Roulette draws from, ImageNet, is filled with problematic categories that reflect the bias often inherent in the large datasets that make machine learning possible.  

Calling attention to that fact is the project’s entire point. 

“[We] want to shed light on what happens when technical systems are trained on problematic training data,” explains the ImageNet Roulette website. “AI classifications of people are rarely made visible to the people being classified. ImageNet Roulette provides a glimpse into that process – and to show the ways things can go wrong.”

The project, which is part of Trevor Paglen and Kate Crawford’s Training Humans exhibition at Milan’s Fondazione Prada museum, identifies what it thinks are faces in photos and then labels them as it sees fit.

Often, the labels make no sense to the casual observer. In the photo below, former President Barack Obama and Prince Harry are labeled “card player” and “sphinx,” respectively.

Hmm.

Image: Composite: Samir Hussein / Getty / ImageNet Roulette

“[Training Humans] is the first major photography exhibition devoted to training images: the collections of photos used by scientists to train artificial intelligence (AI) systems in how to ‘see’ and categorize the world,” explains the exhibit page. 

Uploading a personal photo into ImageNet Roulette is both an exercise in humility — it categorized a photo of this reporter as “flake, oddball, geek” — and a reminder that the systems making judgments about people based solely on photographs are, frankly, not that good. 

It’s the latter point that should cause concern. Automated systems that replicate, and by extension exacerbate, the biases present in society have the power to codify those very problems. ImageNet Roulette is a stark reminder that the AI powering image-recognition tools isn’t some digital arbiter of truth.

Remember that the next time you hear someone waxing poetic about the powers of machine learning. 
