Artificial intelligence (AI) carries a lot of bias. It’s built on top of human biases and, rather than filtering them out, amplifies them. Facial recognition had bias problems even before AI got involved, and today’s systems still struggle with darker skin tones and with women. In the U.S., this especially hurts Black and Latino people. Law enforcement has used the technology to arrest the wrong people, and as a result, some cities, like Portland, have moved to ban facial recognition altogether.
A Twitter user reported that Zoom kept erasing his coworker from the screen whenever the coworker used a virtual background. Many assumed the cause was racist facial recognition. Curiously, though, the images the tweet’s author attached seemed to show only two photos of a white man.
Turns out that’s because Twitter on mobile was cropping out his Black coworker. After trying a few different images, Twitter users realized something: Twitter was using racist facial recognition in its auto-cropping software.
In an effort to report racist AI, one man accidentally discovered just how widespread it really is.
Zoom Backgrounds
Let’s start with what kicked this all off: Zoom backgrounds. If you haven’t used Zoom, it’s a teleconferencing app. One of its features lets you hide your background and replace it with a nice photo: perfect for people who have a mess behind them, can’t get privacy for a call, or just don’t want to distract anyone. The system uses AI to separate a person’s face and body from whatever is behind them. It can mess up by cutting out items you’re holding, but it mostly works.
As long as you’re white.
Anyone who isn’t white, however, may face problems.
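Under the hood, this is a person-segmentation problem. Zoom hasn’t published how its version works, so the snippet below is only a minimal sketch of the general technique, using MediaPipe’s off-the-shelf Selfie Segmentation model as a stand-in (an assumption, not Zoom’s actual stack): the model scores every pixel as “person” or “background,” and the virtual background fills in wherever the score is low.

```python
import cv2
import mediapipe as mp
import numpy as np

def replace_background(frame_bgr, background_bgr):
    """Sketch of virtual-background compositing with an off-the-shelf
    segmentation model. NOT Zoom's implementation."""
    # The model expects RGB input; OpenCV frames are BGR.
    with mp.solutions.selfie_segmentation.SelfieSegmentation(model_selection=1) as segmenter:
        results = segmenter.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    # segmentation_mask is a per-pixel "person" likelihood in [0, 1].
    person_mask = results.segmentation_mask > 0.5
    background = cv2.resize(background_bgr, (frame_bgr.shape[1], frame_bgr.shape[0]))
    # Keep the original pixels where the mask fires, the virtual background elsewhere.
    return np.where(person_mask[..., None], frame_bgr, background)
```

If a model like this is trained mostly on lighter-skinned faces, the “person” score drops for darker-skinned ones, and the virtual background swallows them.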
any guesses? pic.twitter.com/9aIZY4rSCX
— Colin Madland (@colinmadland) September 19, 2020
This raised a few eyebrows because, on Twitter mobile, you can only see one of the images in the Tweet. Can you guess which one?
The Crop
Geez…any guesses why @Twitter defaulted to show only the right side of the picture on mobile? pic.twitter.com/UYL7N3XG9k
— Colin Madland (@colinmadland) September 19, 2020
Turns out Twitter uses an algorithm to decide how to crop an image. It favors people, so photos on the service get cropped to zoom in on a person rather than the background. With the two photos stacked in the same image, Twitter’s algorithm apparently decided Mr. Madland’s coworker was “background.”
He tried flipping the images, providing his own image on the left and his coworker’s on the right. Nothing brought his coworker into center frame.
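Twitter has described its auto-crop as focusing on the most “salient” part of an image, but the model itself is a black box (more on that below). Purely as an illustration, and not Twitter’s code, here is what saliency-based cropping generally looks like: given a saliency map from some model, slide a crop window across the image and keep the window with the highest total saliency. Whatever bias is baked into the saliency scores flows straight into the crop.

```python
import numpy as np

def saliency_crop(saliency, crop_h, crop_w):
    """Return the (y, x) top-left corner of the crop window with the highest
    total saliency. `saliency` is a 2D map (one score per pixel) produced by
    some saliency model; the model itself is outside this sketch."""
    H, W = saliency.shape
    # Integral image: lets us score every candidate window in O(1) each.
    integral = np.pad(saliency, ((1, 0), (1, 0))).cumsum(0).cumsum(1)
    best_score, best_yx = -np.inf, (0, 0)
    for y in range(H - crop_h + 1):
        for x in range(W - crop_w + 1):
            score = (integral[y + crop_h, x + crop_w] - integral[y, x + crop_w]
                     - integral[y + crop_h, x] + integral[y, x])
            if score > best_score:
                best_score, best_yx = score, (y, x)
    return best_yx  # crop the image as image[y:y+crop_h, x:x+crop_w]
```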
Then people tried it for themselves. They realized that very tall images force Twitter’s preview crop to pick one region, so they uploaded extra-tall images containing a white person’s face and a Black person’s face separated by empty space.
Guess which ones Twitter prefers.
Trying a horrible experiment…
Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama? pic.twitter.com/bR1GRyCkia
— Tony "Abolish ICE" Arcieri (@bascule) September 19, 2020
Oh, come on, one was a president!
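Reproducing the test takes nothing more than stacking two portraits with a big blank gap so the preview crop is forced to pick one of them. Here is a rough sketch of just the image-composition step (the file names are placeholders; the actual upload-and-look part was done by hand in the browser, since there was no public way to preview the crop programmatically):

```python
from PIL import Image

def make_tall_test_image(top_face_path, bottom_face_path,
                         gap=2000, out_path="crop_test.png"):
    """Stack two portraits with a huge white gap between them so Twitter's
    preview crop has to choose one face or the other."""
    top = Image.open(top_face_path)
    bottom = Image.open(bottom_face_path)
    width = max(top.width, bottom.width)
    canvas = Image.new("RGB", (width, top.height + gap + bottom.height), "white")
    canvas.paste(top, ((width - top.width) // 2, 0))
    canvas.paste(bottom, ((width - bottom.width) // 2, top.height + gap))
    canvas.save(out_path)

# Swap the order on a second upload to rule out a simple "always crop to the top" rule.
make_tall_test_image("mcconnell.jpg", "obama.jpg")
make_tall_test_image("obama.jpg", "mcconnell.jpg", out_path="crop_test_flipped.png")
```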
Not Just People?
So people decided to see just how far Twitter’s algorithm goes. At first it was a joke, a meme, a way of mocking Twitter’s racism; they threw weird things at it. But soon it became clear the problem was, once again, far worse than anyone expected.
https://twitter.com/_jsimonovski/status/1307542747197239296?s=20
It seems it might even do it for labs.
No, not laboratories, the dogs.
I tried it with dogs. Let's see. pic.twitter.com/xktmrNPtid
— @MarkEMarkAU, September 20, 2020
Other Variables In Play
Twitter’s AI is a black box: we have no idea what’s inside or what it’s doing. For example, with a small amount of photo manipulation, this man was able to get Twitter to crop to the Black man rather than the white man in the same photo.
Here's another example of what I've experimented with. It's not a scientific test as it's an isolated example, but it points to some variables that we need to look into. Both men now have the same suits and I covered their hands. We're still investigating the NN. pic.twitter.com/06BhFgDkyA
— Dantley Davis (@dantley) September 20, 2020
However, that tweet was a response to someone else who had tried to push the crop the other way using the exact same photos with various backgrounds and edits.
So the behavior isn’t 100% consistent, but most of the time, if there are people of different races in a photo, Twitter will focus on the white people and crop out the rest.
Responses
The closest we’ve gotten to a response from Twitter is one tweet from Twitter communications employee Liz Kelley.
thanks to everyone who raised this. we tested for bias before shipping the model and didn't find evidence of racial or gender bias in our testing, but it's clear that we've got more analysis to do. we'll open source our work so others can review and replicate. https://t.co/E6sZV3xboH
— liz kelley (@lizkelley) September 20, 2020
Twitter says it will look into the bias, though it claims it already tested for it before launch and couldn’t find any. Still, Twitter users seem to have no trouble surfacing it, even without knowing how the software works.
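What users are doing is essentially a black-box audit: run a batch of paired trials, record which face the crop keeps, and check whether the split is anywhere near 50/50. A minimal sketch of the counting step, assuming the outcome labels come from manual trials like the ones above:

```python
from statistics import NormalDist

def crop_bias_test(outcomes, group="white"):
    """Given the face the crop kept in each paired trial (e.g. 'white' or
    'black'), return a two-sided p-value against a fair 50/50 null, using a
    normal approximation to the binomial."""
    n = len(outcomes)
    picks = sum(1 for o in outcomes if o == group)
    p_hat = picks / n
    z = (p_hat - 0.5) / (0.25 / n) ** 0.5  # standard error under the 50/50 null
    return 2 * (1 - NormalDist().cdf(abs(z)))

# e.g. 45 out of 50 trials favoring the white face:
# crop_bias_test(["white"] * 45 + ["black"] * 5)  -> roughly 1.5e-08
```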
As for Zoom?
They have not responded on Twitter or on their company blog as of this writing. Perhaps they’re just relieved that Twitter ended up upstaging them and stealing the attention.