
Roller Rink Kicks Out Girl Over Mistaken Facial Recognition


We know people are diverse, so why don’t we push for diversity in important roles?

We’ve heard story after story about facial recognition failing to remove human bias and instead amplifying it. AI, and facial recognition in particular, is creating a racially segregated world, one where Black and other non-white people, especially women, have to worry about a computer system misidentifying them. Broken and racially biased facial recognition has already led to mistaken arrests and deportations. Now a 14-year-old Black girl has been kicked out of a roller rink, of all places, over a false facial recognition match. The roller rink says the software reported a 97% match for the girl, who had never been to the rink before.

It’s clear: AI is creating a new form of segregation, and we’re not even done suffering the effects of the old ones.

A Small Rink With a Big Problem

Riverside Arena is a small roller rink in Detroit. So what’s a simple roller rink doing with a facial recognition system? Unfortunately, it’s the same reason any place installs facial recognition or ratchets security past what’s necessary: someone was scared, didn’t realize facial recognition carries racial bias (or worse, knew and didn’t care), and installed it thinking it would make the business safer. Instead, it did what facial recognition, at least in the United States and many other Western countries, always ends up doing: enforcing digital segregation.

Riverside Arena reports that its facial recognition software showed a 97% match between Lamya Robinson and a person who had been involved in a fight at the rink. The problem? Robinson had never been to the roller rink before. The rink admits staff considered only the percentage match, trusting the system rather than verifying it, and it didn’t offer an unconditional apology.

“This is what we looked at, not the thumbnail photos Ms. Robinson took a picture of, if there was a mistake, we apologize for that.”

– Riverside Arena in Detroit, MI
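A “97% match” sounds authoritative, but under the hood a number like that is usually just a similarity score compared against a threshold. The sketch below is a hypothetical illustration of that kind of check; the function names, embedding size, and 0.97 threshold are assumptions made for the example, not details of Riverside Arena’s actual software.

```python
# Hypothetical sketch of a face-matching check. The threshold, embedding size,
# and random vectors are illustrative assumptions, not any real vendor's system.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flags_as_match(probe: np.ndarray, watchlist_face: np.ndarray,
                   threshold: float = 0.97) -> bool:
    # A score above the threshold only means the two embeddings are close
    # by the model's own measure. It is not verification of identity, and it
    # says nothing about the model's false-positive rate for this person.
    return cosine_similarity(probe, watchlist_face) >= threshold

rng = np.random.default_rng(seed=42)
probe = rng.normal(size=128)           # embedding of the visitor at the door
watchlist_face = rng.normal(size=128)  # embedding from the incident photo

score = cosine_similarity(probe, watchlist_face)
print(f"similarity score: {score:.2f}")  # a number, not an identification
print(f"flagged: {flags_as_match(probe, watchlist_face)}")
```

The point: a score clearing the threshold means two face embeddings are close by the model’s own measure, nothing more. It isn’t verification of identity, and when the model’s error rate is higher for some demographic groups, the same threshold produces false matches at unequal rates.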

Untold Bias

Perhaps the employees at this roller rink didn’t know about the rather extreme bias built into facial recognition. In 2018, the Gender Shades study by researchers at MIT and Microsoft found that commercial gender-classifying facial recognition misclassified darker-skinned women at error rates of up to 34.7%, compared with under 1% for lighter-skinned men. The employees may also not have known about the high-profile wrongful arrests of Nijeer Parks and Robert Williams. At this point, however, not knowing about these biases before deploying this technology is extremely irresponsible. Even casual research into the technology reveals its dangerous bias.
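To make those error rates concrete, here’s some back-of-the-envelope arithmetic. Two hedges: the study measured gender classification, not face identification, and the nightly visitor count below is an assumption purely for illustration.

```python
# Back-of-the-envelope arithmetic using the Gender Shades error rates.
# Caveats: the study measured gender classification, not identification,
# and the visitor count is an assumed number used only for illustration.
visitors_per_night = 500

worst_error_darker_skinned_women = 0.347  # up to 34.7% in the study
worst_error_lighter_skinned_men = 0.008   # under 1% in the study

print("Expected misclassifications per night (worst-case classifier):")
print(f"  darker-skinned women: {visitors_per_night * worst_error_darker_skinned_women:.0f}")
print(f"  lighter-skinned men:  {visitors_per_night * worst_error_lighter_skinned_men:.0f}")
```

Even if the real error rates for identification differ, the asymmetry is what matters: the same system, pointed at everyone, fails some groups dozens of times more often than others.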

Developers of facial recognition may not set out to build a racist system, but the demographics of the field lead to one. Software engineers, especially those at large companies or in management positions, are overwhelmingly white and male. They may never think of the racial biases they’re building into a system, simply because they lack the life experience that would prompt them to anticipate those problems or test those cases. Until software engineering becomes more diverse and demands more diverse voices in the development of AI, this is going to keep happening. And as long as people remain unaware of the bias in this software, they’re going to keep relying on it. Their ignorance is hurting people, even children, and it won’t stop until people stop using facial recognition and force these companies to make drastic changes.

