
Facebook’s Intentional Lawlessness Leads to Extremism… and Profits


Facebook has removed barriers for conservative commentators in the past. They even maintain a list of “VIPs” who don’t have to fully play by Facebook’s rules. Their goal is to drive controversy and engagement. Facebook long ago found that by fostering social bubbles and then forcing conflict between them, they can drive engagement through the roof. Engagement means more ads seen and greater profits.

According to a whistleblower, Facebook has taken it too far. She claims that Facebook knows about the polarization they’re causing and allows it because that’s more profitable than making the algorithms safer.

Leading up to Trump’s coup attempt, insurrectionists gathered on Facebook, Parler, and other websites to plan their attack. If Facebook knows how far they’re pushing their users into extremism and does nothing, that’s criminal negligence. It’s no small matter.

For Facebook, radicalization is profitable; keeping people safe isn’t.

Blowing the Whistle on Facebook’s Hate Incubators

“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook.”

– Frances Haugen, Facebook whistleblower

The whistleblower behind the incredible reporting from The Wall Street Journal has come forward to speak with 60 Minutes. Frances Haugen’s leaks led to reporting that Facebook maintains VIP lists of users who can break their rules, that Facebook knows Instagram has a negative effect on teen girls, that Facebook knows tweaks to their algorithm have led to more anger on the platform, that they were reluctant to act quickly against human trafficking and drug cartels, that the platform itself has worked against vaccination, and so much more. Her releases are the best look we’ve had behind Facebook’s impenetrable walls. Perhaps that’s because, as a product manager in Facebook’s Civic Integrity group, she saw more of the dangers of Facebook than most employees ever will.

“Paying for its Profits With Our Safety”

Haugen stated during the 60 Minutes report that Facebook is “paying for its profits with our safety.” She says she doesn’t “trust that they’re willing to invest what actually needs to be invested to keep Facebook from being dangerous.” Facebook’s algorithms drive interaction with the platform, even if that means showing things that incite users to hate and violence. Misinformation flourishes in such an environment. The sad truth is, the dangers Facebook imposes on the public are quite profitable. If Facebook isn’t sure they can make a profit without those dangers, they won’t make any effort to fix them.

To Facebook, it’s only a problem if it gets in the way of the bottom line.

Hate Untethered

After the election, Facebook broke up their Civic Integrity group. This group within Facebook was responsible for nudging the company toward some accountability and responsibility during the elections. Of course, it failed miserably. Politicians were allowed to lie on the network, while others around the world were allowed to share false stories and propaganda. Some of that, like misinformation about COVID-19, led to deaths.

Facebook broke up the group before the January 6th Trump-fueled terrorist attack at the U.S. Capitol. That left groups on Facebook that existed specifically to plan the attack. Facebook only banned some of those groups months after the attack, once investigative reports from journalists, like those at Snopes, became public.

As it turns out, Facebook turns a blind eye to hate speech and incitement of violence. This is, in part, intentional. These posts rile up users, and angry users interact with Facebook more than level heads do. It’s not just the hateful: people who are the targets of hate speech, or allies of the victims, are more likely to respond to a hateful post, and keep returning to it, than if the post were simply deleted.

According to Haugen, Facebook knows they block only 3-5% of hate speech on the network. That hate speech has been shown to directly correlate with real-world violence. A study of German Facebook access and hate crimes found that hate crime attacks drop off when Facebook has an outage. A U.N. report cited Facebook as the leading carrier of the hateful messages that led to the genocide of the Rohingya in Myanmar. When it comes to calls for violence, Facebook blocks less than 1% of posts. This is how hate spreads from Facebook into real-world attacks, like the January 6th insurrection and the genocide in Myanmar.

Anger Fuels Facebook

Facebook’s algorithm is tailored to boost posts that make users angry. Haugen explains that you may scroll through 100 pieces of content in your feed, but Facebook had a choice of 1,000 different pieces it could have shown you. They choose the content you’re most likely to interact with, whatever is most likely to keep you on the page. In most cases, that’s content that makes users angry or hateful and keeps them on the site to either combat the hate or spread it. Facebook knows this and could choose less incendiary content that still drives usage, but users would spend slightly less time on Facebook, improving their moods and reducing Facebook’s profits. Facebook knows they have a problem; they just don’t want to sacrifice infinitely growing profits to stop it.
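To make that tradeoff concrete, here’s a minimal, hypothetical sketch of engagement-based ranking in Python. Nothing here is Facebook’s actual code: the Post fields, the weights, and both scoring functions are illustrative assumptions. It only shows the shape of the choice Haugen describes: rank purely on predicted engagement, or apply a small penalty to anger-inducing content and accept slightly less time on site.

# Hypothetical sketch of engagement-based feed ranking.
# None of this is Facebook's real code; all names and weights are assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    id: int
    predicted_clicks: float         # how likely the user is to click or comment
    predicted_anger: float          # 0.0-1.0, how angry the post makes users
    predicted_time_on_site: float   # extra seconds the post keeps a user engaged

def engagement_score(post: Post) -> float:
    """Rank purely by predicted engagement. Anger-inducing content often
    predicts more comments and time on site, so it floats to the top."""
    return post.predicted_clicks + post.predicted_time_on_site

def safer_score(post: Post) -> float:
    """Same ranking, but with a penalty on anger-inducing content. This is
    the kind of tweak Haugen says Facebook avoids, because it costs a
    little time on site."""
    return engagement_score(post) * (1.0 - 0.5 * post.predicted_anger)

# From ~1,000 candidates, pick the ~100 posts the user actually sees.
candidates = [Post(i, i % 7, (i % 10) / 10, i % 13) for i in range(1000)]
feed = sorted(candidates, key=engagement_score, reverse=True)[:100]

Swapping engagement_score for safer_score in that last line is the entire “fix” in this toy model; the point of Haugen’s leaks is that Facebook measured a version of that swap and declined it.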

It has gotten so bad that Facebook can’t be relied on for rational, level-headed advertising. European politicians were surprised to find that, if they wanted their campaign ads to spread organically, they had to go negative. Hateful, anger-inducing campaign ads that attacked their competition rather than proposing solutions spread far more than measured advertising did. The result is an increasingly polarized electorate, and that can lead to civil unrest over elections.

Sound familiar?

Destroying Teens’ Mental Health

Part of Haugen’s earlier leaks showed that Facebook executives know Instagram is damaging to teen girls’ mental health. According to the internal documents Haugen leaked, 13.5% of teen girls say Instagram makes their thoughts of suicide worse, while 17% say it makes their eating disorders worse. This is because Instagram pushes content that is often heavily manipulated to make women’s bodies look impossibly thin. These posts get a lot of attention, so Facebook shows them to users often. The problem is that those bodies are made in Photoshop, not in a gym or the kitchen. They’re impossible beauty standards that make girls feel worse about their own bodies.

But, once again, it’s profitable, so Facebook hasn’t done anything to stop promoting these posts or to take down obviously manipulated images. To Facebook, the small boost in profits is always worth whatever the cost.

Stop. Using. Facebook.

I wish it were as easy as just deciding not to use Facebook. I do. Facebook may be the only way you keep in contact with certain people, especially older generations. Still, you can limit your usage of Facebook to the times when you need to directly reach out to others. Managing a business is tougher: you can try to push your users over to another platform, but many won’t go, and you won’t get the same number of followers on other networks. Still, if you use other social networks, like Twitter or Tumblr, please consider following Leaf and Core on one of those instead.

The best thing you can do is stop scrolling through the news feed. If you must keep an account, consider these tips to help you continue using Facebook without letting Facebook profit from your use of the platform. Facebook has interwoven itself into our lives, and that makes it hard to quit. But you can still send the company a message: people’s lives are more important than profits.


Oh, and you did stop using Facebook today, didn’t you?

Maybe try to make a habit out of that?

