Facebook Played Role in Capitol Insurrection, Like Parler. Where’s the Outrage?

Insurrection at the Capitol, January 6, 2021. Photo credit: Tyler Merbler, via Wikipedia, CC 2.0 license.

Before Trump was ever president, his rallies were famous for one thing: violence. Protesters upset with his hate speech and encouragement of violence were treated with hostility. Trump himself invited supporters to rough up protesters, saying he'd pay their legal bills. From the beginning, Trump riled up a mob. On January 6th, 2021, he gave that mob a target: the United States Capitol building, where the country's elected officials were gathered to certify the election. Senators, representatives, and even the Vice President, Mike Pence, were inside that building. Trump, knowing his crowd had turned to violence, told them to march on the Capitol. They did. Even after learning the march had turned violent, Trump kept tweeting incendiary encouragement.

In the fallout from the Trump insurrection, his followers have been arrested. They could be charged with crimes ranging from criminal trespassing to assault, terrorism, and murder. A social network where the alt-right convened went under: no one was willing to support a network used by terrorists to coordinate their attacks. Parler became the pariah of the tech world. A large portion of Parler's users had supported and planned the insurrection. Soon, Parler was just a distant, sour memory.

But what about the other networks guilty of helping Trump’s followers plan and carry out a terrorist attack at the U.S. Capitol? What about those who helped a mob try to overthrow the government? Trump tweeted vitriol, leading to the attack. Insurrectionists used Telegram, a chat app, and Zello, a walkie-talkie app, to communicate. And then there’s Facebook. Perhaps no company besides Parler shares more of the blame than Facebook. Facebook is where many people were radicalized in the first place. It’s where they gathered to plan the attack.

It’s possible the terrorist attack in Trump’s name on January 6th, 2021 would never have happened if not for Facebook.

So why aren’t we tearing Facebook down?

“There’s a Facebook Group for Everyone”

Armed conservative protesters rally at the state capitol in Lansing, Michigan on April 15, 2020. Photo: Jeff Kowalsky

“All members are in the tunnels under capital seal them in. Turn on gas.”

That's the message received by Thomas Caldwell, a supposedly "ranking" member of the Oath Keepers, a far-right extremist militia with cells all over the country. Its members gathered on Facebook under the "Oath Keepers" name until Facebook banned groups with that name. Now, those same groups simply go by different names. The Appalachian Oath Keepers Facebook group became the "Appalachian Mountain Patriots" after Facebook instituted the ban.

The name changed, but the people stayed. They formed new groups and stayed in contact, sending each other encouragement, like the message above, telling insurrectionists to kill elected officials.

Birds of a Feather…

Facebook advertised their Groups feature a few years ago with an ad about people playing the kazoo. Their tagline, "There's a Facebook group for everyone," was right. They really do have a group for everyone, including violent insurrectionists.

When Facebook cracked down on groups with a militarized purpose, they removed groups that mentioned phrases like "boogaloo," "Oath Keepers," and "3%ers." However, these are just names. The same people still gathered in groups under different names. At least 128,000 people used the "#StopTheSteal" hashtag on Facebook, and at least 70 active groups were still using the hashtag, according to watchdog organization Media Matters.

“If you took Parler out of the equation, you would still almost certainly have what happened at the Capitol. If you took Facebook out of the equation before that, you would not. To me, when Apple and Google sent their letter to Parler, I was a little bit confused why Facebook didn’t get one.”

– Angelo Carusone, President & CEO of Media Matters

By Any Other Name…

With groups simply changing their names, a shocking 40% of Facebook's top ten performing posts on any given day between the election and the January 6th attack came from far-right sources trying to undermine the results of the election. An additional 15% came from Donald Trump himself, instigating his followers. Over half of Facebook's most popular posts were undermining democracy in the United States. That led to an attack on the U.S. Capitol in January.

These were the same people who were in Oath Keepers or boogaloo groups. Facebook forced the groups to change their names, but little else. Even when they'd ban a group, the members would still have access to Facebook. They'd just regroup. It's like lifting an old box in your basement and watching insects crawl out from under it. Sure, they're not under the box anymore, but they're still elsewhere in your basement. You didn't get rid of them, you scattered them. Eventually, they'll regroup.

Official Sources

It wasn't just far-right militia groups pushing for an insurrection, either. Donald Trump undermined the election, which got him impeached and permanently banned from Facebook. Other Republicans organized caravans and even bus trips to the Capitol that day. The Polk County Republican Party of North Carolina posted this:

“BUS TRIP to DC … #StoptheSteal. If you passions are running hot and you’re intending to respond to the President’s call for his supporters to descend on DC on Jan 6, LISTEN UP!”

They've since removed the incendiary post, but nothing is ever permanently gone from the internet. They used Facebook to get a riled-up crowd to D.C. to stop the vote count, and it was obvious how that crowd was expected to do it. They did it. They stormed the U.S. Capitol, exactly what their leaders had asked of them over social networks and in media appearances.

Facebook wasn't meaningfully enforcing its rules anywhere. What few rules it did enforce only caused violent insurrectionists to reorganize under new names. Facebook didn't get in their way at all.

Great Success!

Over half of Facebook's most popular posts leading up to the violent coup attempt favored undermining democracy in America. That's exactly what those users tried to do on January 6th. Many of them are still on the platform, in their groups, planning their next "rally."

This is a great thing for Facebook's platform. Financially, anyway. Keeping these controversial voices active increases engagement. By not permanently banning members of these extremist groups, Facebook ensures it can continue to profit from the controversy they stir up. When they cause arguments online, more people stay on Facebook. When they create groups to storm buildings, kidnap politicians, or kill civilians and police, they use their Facebook groups to organize and their Facebook Messenger rooms to coordinate.

There’s a group for everyone, especially if that means you’ll use Facebook more. Radicalized users are the most profitable users.

Radicalized by Suggestion

Every social network has a secret formula: its suggestions. You don't view Facebook's news feed in chronological order; it hasn't worked that way in over 10 years. Why did Facebook remove the chronological feed? Your friends' posts aren't all equally engaging. Some, you might scroll right past; you'd get bored and leave the site. Others will keep you reading and commenting, possibly wasting hours of your day on Facebook. All that time, Facebook gathers more data on you to show you more ads, and more relevant ads.
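To make that incentive concrete, here's a minimal sketch of what engagement-ranked feed ordering looks like in principle. Everything here, the weights, the time decay, the names, is a hypothetical illustration, not Facebook's actual system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    likes: int
    comments: int
    shares: int
    age_hours: float

def engagement_score(post: Post) -> float:
    # Hypothetical weights: comments and shares keep people on the site
    # longer than likes do, so they count for more.
    raw = post.likes + 4 * post.comments + 8 * post.shares
    # Decay by age so the feed always surfaces something fresh to react to.
    return raw / (1.0 + post.age_hours)

def rank_feed(posts: list[Post]) -> list[Post]:
    # No chronological order: whatever provokes the most reactions rises.
    return sorted(posts, key=engagement_score, reverse=True)
```

Notice that nothing in a scorer like this asks whether a post is true or safe. To the ranking, outrage and engagement are indistinguishable.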

As it turns out, if you join groups or show certain interests, Facebook will suggest other groups. Sometimes, those suggested groups can be extremist. The Tech Transparency Project (TTP) found that its monitoring account, which had joined militia groups, was assigned "interest categories" that were surprisingly benign. Well, for the most part. Facebook labeled the account with these interests despite it not following anything directly related to them: pages for Donald Trump and Donald Trump Jr., the Republican Party, "American Football," and "politics." This might seem harmless, but the problem goes both ways.

Someone who fits this profile in every way except the militia memberships could see suggestions for those militia groups. Meanwhile, members of those groups, like the fake account created by TTP to track these ads and suggestions, are pushed into unrelated groups, where they can radicalize others who already hold similar, but perhaps not yet violent, beliefs.
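Here's a hypothetical sketch of how interest-overlap recommendations cut both ways. The scoring method and group names are invented for illustration; none of this is taken from Facebook's code.

```python
from collections import Counter

def suggest_groups(user_interests: set[str],
                   groups: dict[str, list[set[str]]],
                   top_n: int = 3) -> list[str]:
    """Score each group by how much its members' interests overlap the user's."""
    scores: Counter = Counter()
    for name, member_profiles in groups.items():
        for member in member_profiles:
            scores[name] += len(user_interests & member)
    return [name for name, _ in scores.most_common(top_n)]

# A user with only mainstream interests...
user = {"Donald Trump", "American Football", "politics"}
# ...scored against groups whose members share those benign interests.
groups = {
    "Appalachian Mountain Patriots": [{"Donald Trump", "politics", "tactical gear"}],
    "Kazoo Enthusiasts": [{"kazoo", "music"}],
}
print(suggest_groups(user, groups))  # the militia group ranks first
```

Overlap on benign interests is all it takes: the same mechanism that suggests a kazoo group to a music fan suggests a militia group to a Trump supporter.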

Even Ads

Facebook doesn't stop at suggesting potentially dangerous groups to these people. The TTP account, meant to simulate a militia member or extremist to see how Facebook handles such users, also received some interesting ads. Right beside stories about the terrorist insurrection at the Capitol, Facebook was helping them gear up for their next riot: ads for tactical gear and rifle attachments adorned those very stories.

Advertisers have a lot of control over who they target. They can choose to target “politically active” far-right users. That’s how these militia members ended up seeing ads for gear to arm them for their next riot. Facebook has done little to stop these companies from advertising on their platform or targeting potentially violent users.
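As a rough illustration of how little stands between an advertiser and a radicalized audience, here's a hypothetical targeting filter. Real ad platforms are far more elaborate, and none of these names come from Facebook's actual ads API.

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    interest_categories: set[str]

@dataclass
class Campaign:
    product: str
    target_categories: set[str]

def eligible_audience(users: list[User], campaign: Campaign) -> list[User]:
    # Anyone matching at least one targeted category sees the ad. Nothing
    # here asks whether "politically active" plus militia-adjacent interests
    # is a combination that should be sold rifle attachments.
    return [u for u in users if u.interest_categories & campaign.target_categories]

ad = Campaign("rifle attachments", {"politically active", "tactical gear"})
viewers = eligible_audience(
    [User("monitoring account", {"politically active", "Donald Trump"})], ad)
print([u.name for u in viewers])  # ['monitoring account']
```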

YouTube Too

YouTube is well known for their radicalizing suggestions. Cornell University researchers studied YouTube's users and recommendations. They found that users who consume content across a variety of services simply spent more time on YouTube than on the other services; the researchers described this as "stickiness." YouTube sucks its users in better. It also has a long history of suggesting videos that may interest the user but also may "redpill" them, that is, lead them further toward extremist beliefs. A self-help channel could lead to a philosophy channel. That philosophy channel could quickly lead to a video about the history of western civilization. Then it'll discuss western "virtues." Soon, users are watching misogynistic or white nationalist YouTubers, having been slowly introduced to them through YouTube's algorithm. YouTube and Facebook remain the two largest sources of radicalization on the web.

YouTube has removed 1.8 million channels for hate speech, though they haven't removed some of their most famous offenders. Still, if YouTube could remove that much hate and still have a problem with hate speech and radicalization on the platform, how could Facebook, who have removed only a small fraction of that, possibly be any better?

Facebook Took Steps

So what steps has Facebook taken? Parler took no steps and found no support from the rest of the tech industry. As a result, it died, and, despite attempts to come back, may never regain legitimacy. But Facebook has still enjoyed almost no criticism. What did they do differently?

Well, something. That alone is more than Parler can claim.

Facebook created a new rule in 2020 to crack down on hate groups and militia groups on the platform. By the end of November, before the riots and height of Trump’s push for #StopTheSteal, Facebook removed:

  • 3,200 pages
  • 18,800 groups
  • 100 events
  • 23,300 Facebook profiles
  • 7,400 Instagram profiles

That's nothing compared to the millions of channels YouTube shut down, but it's not nothing. It still represents tens of thousands, if not over 100,000, users. Still, there are billions of Facebook users, hundreds of millions of them in the United States alone. Removing hundreds of thousands of accounts, many of them fake profiles, really isn't much when you consider that these people were still able to gather on Facebook and plan a largely successful terrorist attack on the U.S. Capitol.

Of course it was successful: the leaders who invited the mob there faced no repercussions. They terrorized senators, and nothing has been done to prevent those leaders from organizing another attack. Facebook helped terrorists coordinate an attack, and they'll do it again if Facebook and legislators don't act.

Facebook looks like they did a lot to stem extremism on their platform, but it clearly isn’t enough.

Just Not Enough

“By Bullet or Ballot, Restoration of the Republic is Coming.”

– One of the arrested rioters who posted on Facebook. His message wasn’t flagged or removed, despite the threat of violence.

The above quote sat on Facebook well after Facebook removed those thousands of groups, pages, and profiles. Lot of good that did. Facebook largely relies on AI to remove hateful or potentially violent content. The problem? It misses a lot, especially text within images. The rest is handled by too few Facebook employees, who say the job is emotionally taxing.

The fact is, Facebook needs far more content moderators, and those moderators need more time off between exposure to this filth. Facebook also needs to respond to violations of its policies, from hate speech to attempted coups, with lifetime bans for all involved users. Facebook may look like they're doing something, but the scale of the problem on Facebook is much larger than the company is willing to admit. If YouTube could remove millions of channels (not accounts of regular users, but channels), then Facebook surely should also have millions of deactivated accounts. They don't.

DOJ: Facebook is By Far the Leading Source Cited by Insurrectionists

The Department of Justice (DOJ) put out a report on the people already arrested for the Capitol insurrection. Of 223 individuals cited by the DOJ, 73 referenced Facebook and 20 referenced Instagram. That's 93 of 223 people, roughly 42%, referencing a Facebook property, since Instagram is part of Facebook. A photo-sharing app isn't the best way to organize a hateful protest, but 20 of those 223 people used it anyway. In comparison, the second most referenced company was YouTube, cited by just 24 people, against Facebook's 93. Clearly, the problem is far larger than Facebook wants us to realize, because if we realized it, we'd force Facebook to take greater steps to prevent this in the future.

Of those 223 individuals, only 8 referenced Parler. Parler is a much smaller website, though, and most of its users had already been radicalized. Facebook was a better platform for amassing a crowd, and it had more tools for actually planning an attack and coordinating over chat. Parler simply wasn't a good enough tool for this. The site also did nothing to stop future attacks like this, and that's the real problem. Facebook at least made it look like they were trying to do something.

Nothing Will Change: This Happened Before

"Donald J. Trump is calling for a total and complete shutdown of Muslims entering the United States until our country's representatives can figure out what is going on."

Facebook allowed Trump to stay despite hate speech like this example, which he originally posted on Facebook. It spread.

I know you must be thinking that this will lead to Facebook stepping up its efforts. But it won't. This is the company that caused a genocide in Myanmar. Genocide. Facebook caused genocide. Even that didn't change Facebook's enforcement of hate speech. In 2017, the Unite the Right rally was largely planned on Facebook. You may remember it as the white nationalist event that united KKK members and Nazis. Facebook actually did remove some events and groups… owned by counter-protesters. These were people standing against white supremacy.

The Charlottesville riot that started as the "Unite the Right" rally is memorable because it also included a terrorist attack: one of those right-wing terrorists drove his car through a crowd of counter-protesters, killing Heather Heyer and injuring many others. Facebook promised changes then, yet the largest insurrection against an American institution in modern history, the first time the Capitol has been overrun since the War of 1812, was largely planned and coordinated on Facebook and Facebook Messenger.

Nothing changed then; nothing will change now. Around the world, Facebook usage correlates with an increase in hate crimes, a correlation significant enough that German researchers actually attribute an increase in anti-refugee hate crimes directly to Facebook. Around the world, Facebook is a source of discord, upheaval, violence, and misery. All in the name of driving up engagement for profits.

Violence is extremely profitable. You sharing cute photos of your “fur baby” isn’t.

Why No Outrage? Because Facebook Did Something

… or at least it looks like it did.

Mark Zuckerberg testifying before a joint hearing of the Senate Commerce and Judiciary Committees on Capitol Hill, April 10, 2018. Photo: Jim Watson/AFP/Getty Images

Parler made no attempts to prevent extremist views on their platform. Facebook, on the other hand, took actions prior to the election and before Trump’s riot. Facebook removed thousands of events, groups, and profiles before the end of November. These accounts and groups were all in violation of a rule Facebook only put in place in August of 2020 banning “militarized social movements.”

Of course, August 2020 is about 10 years too late for such a rule. Why did Facebook need to wait over 15 years after its founding to ban violent groups? Facebook has known for years that violence and extremism are problems on its platforms. They haven't tried to change because these voices are controversial, and controversy drives user engagement. The more people come back to argue on Facebook, the more profitable Facebook is. Hate fuels the company as much as sharing, connection, or socializing. Maybe even more so.

Facebook did the bare minimum to stay in business and to direct blame toward Parler and Gab. While most insurrectionists may have used Facebook to plan their attack and communicate with other terrorists, Facebook can assure us that at least they're trying to squash the problem. Yet despite those efforts, Facebook remains the platform most often cited in connection with the violence, by a wide margin, according to the DOJ.

To believe Facebook is to believe that humanity is unimaginably bigoted, violent, and hateful. That the problem is just too large for them to handle, and that they're doing the best they can. Taking Facebook's word means believing that no measures Facebook could take would be great enough to stem the tide of hatred that is humanity's true face. That the few moderators Facebook has hired, and AI built by people who have never experienced online harassment for their gender, race, sexuality, or religion, are simply the best we can do. Maybe they're right. Maybe humanity truly is lost. Considering how little Facebook has done to permanently ban these bad actors, though, I think Facebook's full of shit.

