Racially Biased Facial Recognition Put a Man Behind Bars… Again


Facial recognition hides many biases. Image: Microsoft

New Jersey no longer allows its police to use facial recognition. That’s likely thanks to a case that highlighted the racial bias of the facial recognition software they had been using. Put racist AI together with overzealous police who may hold the same biases, and you have a recipe for disaster. This is the third case—that we know about—of a Black man facing charges for a crime he did not commit because of bad facial recognition technology and lazy or biased police work.

An innocent man spent 10 days behind bars because police, despite years of evidence that facial recognition is racially biased, decided to keep using it and to make arrests with no other evidence. It’s not hard to draw a conclusion from that.

The man is now suing the police department for using the racist software. His lawsuit could help end police reliance on technologies known to be racially biased. His story is especially terrifying because he almost went to prison for 20 years for a crime he didn’t commit. All because, when it comes to arresting Black men, police seem comfortable with no evidence beyond the output of some racist AI.

Again, draw your own conclusions.

An Innocent Man Arrested

Police departments continue to use facial recognition technologies with proven racial bias because, they claim, the AI is only a lead. They claim it’s never their only piece of evidence for an arrest. Yet, in at least three cases, police arrested a Black man with no evidence beyond what a racially biased AI told them.

Nearly two years ago, police responded to a call at a hotel reporting that a man was shoplifting candy from a gift shop. They described him as a Black man, just under 6 feet tall. He provided a Tennessee driver’s license, which turned out to be fake, and apologized, saying he would pay for the candy. Upon realizing he’d be arrested, the man ran, jumped into his rental car, and drove off, striking a parked car and nearly hitting an officer in his escape. The next month, using only the photo on the fake ID (which may not even have been the suspect’s actual photo), police arrested Nijeer Parks, a man who lives 30 miles away and works in a grocery store.

Mr. Parks was in their system due to prior arrests for drug sales. He did six years in prison and says it was his wake-up call. After he got out in 2016, he got a job and began saving money, with plans to marry his fiancée.

Treated As Guilty Right Away


Even in the poor photos police released, it’s clear these are two different men who merely share similar hairstyles. Now why did police release such poor photos?


When Parks found out about his arrest warrant, he went to the police station to clear things up. He had never been to Woodbridge, NJ before, so he didn’t understand how police there could have an arrest warrant in his name. He doesn’t drive, so his cousin drove him to the station. Within moments, he was in handcuffs. Police refused to tell him why, reportedly simply stating, “You know what you did.”

It would be 10 days before Nijeer Parks was free. A side-by-side of his mugshot and the ID photo makes it plain that these were two different men. But that wasn’t enough for police, who pushed the case toward trial.

A judge offered him a plea bargain: if he pleaded guilty, he might escape a 20-year sentence, but he would go back to prison for at least five years, followed by three years of parole. Parks, knowing he was innocent, used all of his savings to hire an attorney to prove it.

Fortunately, a Superior Court judge recognized how flimsy the case was and asked the prosecution for more substantial evidence. After a few months, the prosecution dropped the charges. That doesn’t give back the year of stress and intimidation Parks endured, however, nor the 10 days he spent behind bars or his life savings.

Now he’s suing to make up for it.

Bad AI: All The Evidence They Needed

The NJ.com report initially claimed police used Clearview AI, though it now says it’s uncertain which technology police used. A previous NJ.com report stated that police were asked to stop using Clearview AI shortly after this incident. Clearview AI is a company that makes facial recognition software. It collects photos from all over the internet, associating each face with whatever information surrounds it. It even collects from Facebook profiles and other photos that should be “safe” behind logins. You’re likely in one of its databases. If not Clearview’s, then certainly a competitor’s. All facial recognition software currently available exhibits racial and gender-based bias. The software isn’t trained properly; it’s built by teams of predominantly white men who often don’t even consider taking measures to combat bias.

“Hidden” Bias

Many cameras do not appropriately adjust exposure, contrast, or color accuracy for Black people’s skin tones. As a result, photos capture less detail. When those lower-detail photos are fed directly into an algorithm, accuracy drops for Black people.
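To make that concrete, here’s a minimal toy sketch in Python (purely illustrative, not any real recognition pipeline) of how squeezing a face crop into a narrow, dark band of pixel values destroys the local gradient detail a matcher depends on:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a well-exposed 64x64 face crop: pixel values
# spread across most of the 0-255 range.
well_exposed = rng.integers(30, 220, size=(64, 64)).astype(float)

# Simulate an under-exposing camera: the same scene gets squeezed
# into a narrow, dark band, then re-quantized to integer values.
under_exposed = np.round(well_exposed * 0.25)

def detail(img):
    """Rough proxy for recoverable detail: mean local gradient magnitude."""
    gy, gx = np.gradient(img)
    return float(np.hypot(gx, gy).mean())

print(f"well-exposed detail:  {detail(well_exposed):.1f}")
print(f"under-exposed detail: {detail(under_exposed):.1f}")  # roughly 4x lower
```

Once that detail is quantized away, no downstream model can recover it; the bias is baked in before the AI ever sees the image.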

These teams are unaware of the bias hidden in their training data sets. They don’t realize how different makeup, contouring, or a new hairstyle can make a woman look to their software, because they don’t take part in those activities themselves. The same applies to voice recognition AI. Women tend to have higher-pitched voices and use pitch changes to signal emphasis, while male voices are more monotone, using volume to punctuate emphasis instead. A model trained to account for volume but not pitch can miss the meaning of a request entirely, as sketched below.
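Here’s a small hypothetical illustration of that last point (synthetic audio, invented numbers): two “speakers” stress the second half of the same one-second utterance, one by raising pitch, one by raising volume. A feature pipeline that measures only loudness sees the emphasis in one case and is completely blind to it in the other:

```python
import numpy as np

sr = 16_000                                   # sample rate (Hz)
t = np.linspace(0, 1.0, sr, endpoint=False)   # one second of audio
stressed = t > 0.5                            # second half is emphasized

# Speaker 1 signals emphasis by raising pitch (220 Hz -> 280 Hz).
pitch_emphasis = np.sin(2 * np.pi * (220 + 60 * stressed) * t)

# Speaker 2 signals emphasis by raising volume (amplitude 1.0 -> 1.8).
volume_emphasis = (1 + 0.8 * stressed) * np.sin(2 * np.pi * 120 * t)

def frame_loudness(x, frame=1_600):
    """RMS loudness per 0.1 s frame -- a volume-only feature."""
    frames = x[: len(x) // frame * frame].reshape(-1, frame)
    return np.sqrt((frames ** 2).mean(axis=1))

# The volume-only feature swings for speaker 2 but stays flat for
# speaker 1, so pitch-based emphasis is simply invisible to it.
print("volume-emphasis loudness swing:",
      round(float(np.ptp(frame_loudness(volume_emphasis))), 3))
print("pitch-emphasis loudness swing: ",
      round(float(np.ptp(frame_loudness(pitch_emphasis))), 3))
```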

Developers of this AI will claim they didn’t build a biased system. They’ll claim they treated all data equally and used an equal number of training samples from people of all races and genders (though they’ll likely mean only men and women, not non-binary or transgender people). The truth is, not all of their inputs were equal. They missed information that could have been pulled from those inputs. Being unaware of systemic bias, and of bias in adjacent fields such as camera hardware, leads to bias in the AI. Fortunately, studies have revealed this bias to us.
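A minimal audit sketch (all numbers invented for illustration) shows why equal sample counts don’t guarantee equal outcomes. If one group’s images arrive through a noisier capture pipeline, like the under-exposed photos above, the false-match rate, the very error that puts the wrong person in handcuffs, diverges between groups even at the same decision threshold:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 10_000                         # same number of test pairs per group
true_match = rng.random(n) < 0.5   # half genuine pairs, half impostors

def match_scores(capture_noise):
    """Toy matcher: genuine pairs score ~0.8, impostors ~0.2, plus noise."""
    base = np.where(true_match, 0.8, 0.2)
    return base + rng.normal(0, capture_noise, n)

threshold = 0.5
for group, noise in [("A (clean capture)", 0.10), ("B (lossy capture)", 0.35)]:
    predicted_match = match_scores(noise) > threshold
    # False-match rate: impostor pairs the system wrongly calls a match.
    fmr = predicted_match[~true_match].mean()
    print(f"group {group}: false-match rate {fmr:.3f}")
```

Equal counts in, wildly unequal error rates out: the bias lives in the quality of the inputs, not just the head counts.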

We know about the bias in AI systems now. We’ve known for years. Continuing to use that AI for law enforcement, knowing it’s racist and sexist, is in itself criminal. At this point, using facial recognition in law enforcement is an intentional act of racism.

Trying to Right Wrongs

Parks is suing the police department and the city of Woodbridge, NJ. While New Jersey now bans the use of facial recognition, many states still allow it. His case, if successful, could help convince other police departments to abandon their racist software. If the fact that it carries racist bias isn’t enough, surely the threat of losing money will work.

Mr. Parks and his family were lucky. They were able to come up with the money to hire a lawyer and pursue his case. Another lawyer might simply have told him to take the plea, sticking him behind bars and under the constant watch of the state for nearly another decade. Not everyone who has faced inaccurate and often racist facial recognition has had the same fortune. Not everyone has been this “lucky.” How many people are behind bars for looking similar to a suspect, according to some AI? Given how narrowly these men dodged prison, it’s hard to believe that number is zero.

Our racist AI might not be out arresting people, but police comfortable with that bias are.

“I was locked up for no reason. I’ve seen it happen to other people. I’ve seen it on the news. I just never thought it would happen to me. It was a very scary ordeal.”

– Nijeer Parks


Sources: