
YouTube Knew It Was Suggesting Toxic Videos to Users and Children. They Continued Anyway

Reading Time: 5 minutes.

YouTube has a way of trapping people on their site. What they do with them is decided by algorithms YouTube has refused to rein in.

 

There’s a problem on YouTube: it has the power to radicalize a person. Someone might start out with one video, perhaps a news report on a white supremacist attack, or a story about a shooter. Soon after, they’re getting suggested videos from misogynists like Jordan Peterson. Then they go deeper, finding white supremacists, misogynists, homophobes, transphobes, Neo-Nazis, and other xenophobes. YouTube just keeps suggesting videos, introducing them to more and more radical ideas.

This doesn’t just work on the far-right. YouTube only recently stopped recommending anti-vax videos, because this same process was leading parents to believe anti-science nonsense. YouTube directly assisted in creating an environment where measles could make a comeback, and other diseases will follow.

Down the Rabbit Hole

The same goes for any topic. Watch a flat earth video and suddenly YouTube is trying to convince you that the earth is flat. You won’t get opposing viewpoints. If you’ve watched one misogynistic video, for example, you won’t be shown perspectives from women. You won’t hear both sides.

Apparently, this is exactly how James Damore, the Google engineer who published a sexist screed, was radicalized. He found one source, then another, then another, and soon Peterson and other internet sexists had him convinced that women didn’t belong in the workplace. YouTube, Facebook, Google, and Reddit had crafted a misogynistic echo chamber for him in the name of engagement.

He never did what a good scientist would do and look for opposing research. If he had, he would have realized his ideas had long since been debunked. But YouTube makes it easy to dive down these rabbit holes. They make you believe you’ve learned everything when you’ve only gotten one side. They take advantage of vulnerable people, offering them a sense of empowerment through cults, racism, misogyny, and hate.

As it turns out, a report from within YouTube states that executives at the company knew they were radicalizing users and putting children at risk. They continued to allow it anyway. Why? It’s profitable.

Hate is Profitable


YouTube is full of hate videos. If you’ve got a new YouTube account, or simply haven’t encountered them before, you may have to search for them, but once you do, the suggestions follow. Searching for a particular Jordan Peterson interview will likely lead to his other lectures. Soon, you’ll find some of Ben Shapiro’s hateful rants. Eventually you’re onto white supremacist videos or those from other hate groups. Worst of all, YouTube often doesn’t flag misogynistic or anti-LGBTQ videos as hateful, considering videos that attack transgender people or women to be “political views,” while obvious hate against other groups is flagged (as it should be, of course).

The problem is, the rabbit hole is extremely profitable for YouTube. By flagging as few videos as possible, they can continue to show ads on those videos. By suggesting more videos with similar topics, they can keep you stuck on YouTube for hours, all while overlaying ads on hateful content.

Search Bias Leads to Suggestions Leads to Addiction

For some time, anti-LGBTQ videos had more advertisers than videos of LGBTQ people simply talking about their lives or journeys. This is because YouTube still moderates LGBTQ content more heavily than other acceptable content. Perhaps that’s why hate videos targeting these people aren’t taken as seriously within YouTube.

You’ll get locked into one point of view, going from biased search terms to false statements and videos made to deceive or inspire hate. That search bias, looking up terms that confirm what you already believe, quickly spirals out of control.

YouTube says they’re now working to stop suggesting controversial content. However, a few quick searches for some well-known spreaders of hate speech turned up many videos that YouTube did not flag as hateful. How long has YouTube known this was a problem, and why are they still not doing enough to fight it?

YouTube’s Knowledge and Reaction

Yonatan Zunger left Google (which owns YouTube) in 2016. He had raised an issue at the company: YouTube was suggesting videos full of hateful and toxic content to its users, and many of those users were minors. He had a simple suggestion: flag inappropriate content and stop recommending it to users.

Zunger wasn’t even advocating for the removal of these videos (as many people would). Instead, he simply wanted YouTube to stop suggesting hate speech, violent videos, and other troubling content to its often young users.

He saw that many bad actors found where the line was and got as close to it as they could without technically breaking YouTube’s rules, while still spreading toxicity and hate. So he proposed a new category: problematic videos that don’t break the rules but still shouldn’t be suggested to users. Videos like Breitbart’s “Black Crime” series, or fake news from InfoWars. Content that drives hate and radicalizes viewers by presenting only one extreme side of any debate. Zunger and others at YouTube wanted to put a stop to the practice of suggesting these videos to users.

YouTube said no.

That was three years ago. Now, after backlash from YouTube’s community, politicians, and other social network users, YouTube’s taking action. While that action isn’t dramatic, it’s a small step in the right direction.

YouTube: Behind Its Time

YouTube’s taking an idea that was the bare minimum in 2016 and applying it to our hate-filled world of 2019. After giving a platform to hate and fake news, fueling anti-vaccine mentalities, flat earth conspiracies, conspiracies that lead to shootings, false statements that sway elections, and hateful content that leads to violence and eroded civil rights, YouTube’s doing something… incredibly insignificant.

They’re lessening the impact of the rabbit hole by removing flagged videos from suggestions. But they’re still not flagging much of the hateful content on the platform, and they’re not doing anything to stop search bias. They’re still helping people find hateful and toxic content that can radicalize their views. YouTube is still helping people spread hate, encourage acts of violence, and fuel epidemics.

A Billion or Bust

YouTube, a week apart. On the left, ads and no fact checking. On the right, YouTube as it is now.

You can go to YouTube and find videos calling for the removal of basic human rights from groups of people. You can find videos of hateful people spewing falsehoods to stir up hate. As long as you can find those videos, watch them, and give YouTube the views, YouTube isn’t going to try to stop the flow of toxicity they’re pumping into your home.

Views are profit. If YouTube can keep you diving down the rabbit hole, they’re going to do it. In some cases, that’s harmless. Maybe you’re getting legitimate news, following tech reviews (hey!), or trying to relax with some calming ASMR videos. Often, that rabbit hole is a good thing.

But it’s the hateful videos that will, ultimately, damage YouTube’s brand. It took a long time for YouTube to become profitable. If they’re not willing to coast for a while, build up a reliable userbase, and create a safe environment for browsing, they’re going to drive users to Vimeo and other video sites.

Slow Down

To YouTube’s credit, after pulling up a video pushing extremist alt-right ideology, YouTube did recommend a video about escaping that hateful thought process. YouTube’s improving.

Basically, YouTube was like a runner who could see the finish line, so they put everything they had into covering those last few meters. They even accepted views from hateful sources. Unfortunately, that finish line is further away than they realized, and they’re going to collapse before crossing it. In their rush for profits, YouTube is damaging their brand and their ability to keep making money.

YouTube knew their platform was becoming increasingly toxic and radicalizing viewers. But it was profitable, so they did nothing to change it. Had they proceeded with caution and moderation (in more ways than one), they wouldn’t be in this mess.

In fact, maybe we wouldn’t be in these messes either. Anti-vax, flat earth, crisis actors, Trump? YouTube could have spared us much of this. They didn’t. YouTube is finally catching up to its own mistakes. But is it too late for them? Is it too late for us?

