Facebook Played a Key Role in Rohingya Genocide


Rohingya refugees seeking shelter in Bangladesh. Photo: Sergey Ponomarev for the New York Times

In the United States, fake news spread on Facebook may have led to the election of Donald Trump. In Germany, anti-refugee posts on Facebook led to an increase in attacks on refugees. However, Facebook’s inaction in Myanmar contributed to one of humanity’s darkest acts. By refusing to take quick action on violent posts and the spread of fake news, Facebook played a part in what the UN is calling genocide.

Facebook accepted its role in the spread of fake news, xenophobia, Islamophobia, and attacks on a country’s most vulnerable citizens. However, they’re not doing enough, and, for thousands of lives lost and hundreds of thousands of lives affected, it’s too late.

Worse, reports indicate that Facebook may prefer to allow hate speech to flourish on the site because it drives profits.

A Brief Explanation of Rohingya in Myanmar

Satellite footage of Rohingya villages burning.

The Rohingya are Muslims who called Rakhine State, Myanmar, their home. Because of their religion and heritage, Myanmar, a primarily Buddhist nation, denied them citizenship with a law passed in 1982. Still, the Rohingya continued to live in the land where they were born: over 1 million Rohingya lived in Rakhine State. Now, mobs and military forces have driven 700,000 Rohingya out of Myanmar, and countless more are dead.

Rohingya Muslims living in Bangladesh

Photo: Reuters

How did this happen? It followed decades of dehumanization of a group the UN refers to as one of the most persecuted minorities in the world. After Rohingya rebels killed 12 Myanmar soldiers, the Myanmar military retaliated: burning villages, murdering civilians (more than 3,000 Rohingya were attacked and killed), and raping young women. The soldiers were joined by mobs of Buddhist Myanmar citizens. Many Rohingya died on the march out of Myanmar into Bangladesh, either from their injuries or by drowning in the river. Not one Rohingya life has been spared extreme hardship.

A Survivor’s Account

Hundreds of Rohingya packed under small makeshift tents, seeking food and shelter.

Photo: Reuters

Below, I’ve embedded a segment from The Daily, a NY Times podcast. Fast forward to the 9:09 mark for the segment on a Rohingya woman who survived brutal treatment at the hands of the Myanmar army.

However, I urge great caution. This is a story that is truly horrifying. Do not listen to it if you are extremely sensitive to content involving abuse, rape, and murder. I share it only because people should understand what hate speech does.

https://radiopublic.com/TheDaily:RXhwbG9yZS0x/ep/s1!2fd3e

The UN released a report calling for military leaders to face charges of genocide and crimes against humanity.

 

Facebook’s Role in Violence

Facebook “like” thumb holding a Molotov cocktail

You’re likely thinking: what could Facebook possibly have to do with these atrocities? This crisis is the result of many years of hate. However, Facebook has made the situation worse, not better. Facebook was created to bring people together. Instead, it’s tearing us apart.

This is what hate speech does. Hate speech dehumanizes a group of people. It makes fearful, xenophobic people more likely to lash out against a group of people they see as “others.” In the U.S., it tells people to fear Muslims, Mexican immigrants, and LGBTQ people. In Myanmar, it told people that Rohingya were less than human, that they were animals, savages, murderers, and rapists. Myanmar’s people turned a blind eye to the atrocities committed by its military because they believed it to be just. Many joined in on attacks. This extremism is born from the spreading of hate speech, and Facebook was a key platform for that hate speech.

The UN specifically mentioned Facebook while calling for charges against the Myanmar military:

“The role of social media is significant. Facebook has been a useful instrument for those seeking to spread hate, in a context where for most users Facebook is the Internet. …The extent to which Facebook posts and messages have led to real-world discrimination and violence must be independently and thoroughly examined.”

A Long History of Ignorance

Aela Callan, a documentarian, warned Facebook about hate speech on the platform in November 2013, describing how dangerous its fake news and hate speech problem had become. Unfortunately, Callan discovered that, in 2013, Facebook didn’t consider hate speech a danger.

In 2014, a false story went viral claiming that a Muslim shopkeeper had raped his Buddhist employee. This led to an angry mob and a riot. Many people were injured, and rioters looted businesses and burned buildings. In Myanmar, Facebook is the news for many people: they trust the word of mouth of the people they follow. Many communities around the world are no different.

In response to the ongoing conflict, Facebook promised to speed up the translation of its community guidelines into Burmese. Even so, the translation took 14 months.

Hate’s Place on Facebook

Posts accusing Muslims of murder, rape, and bestiality spread like cancer in Myanmar. Individuals are attacked, their lives often uprooted due to photoshopped images and accompanying text. Islamophobes will often use memes to spread and soften the blow of their hate speech as well. This isn’t unlike what I’ve personally seen from U.S. Facebook users.

The Myanmar army is also using Facebook to share its own narrative, using the platform to distort the truth in the country. It’s selling the idea that the Rohingya burned their own homes and fled, a story that many find hard to believe, even among audiences with an appetite for fake news. However, people are comforted when they’re told their suspicions are correct. A story that supports prejudice spreads rapidly.

The Facebook Effect

Facebook logo made up of people displaced in Myanmar

From Reuters’ report on hate on Facebook. Photo: Reuters/Soe Zeya Tun

It might be easy for some to deny the link between hate speech on Facebook and hateful acts of violence. After all, people have been arguing on the internet for decades. Toxic users utter threats online with no desire to follow through on them. How could a meme on Facebook lead to riots and genocide?

Moving our focus from Southeast Asia for a moment, let’s look to Europe, specifically Germany. A recent study by Karsten Müller and Carlo Schwarz of the University of Warwick, entitled “Fanning the Flames of Hate: Social Media and Hate Crime,” set out to isolate the effects of social media, specifically Facebook. They succeeded in finding a connection: Facebook interactions led to an increase in hate crimes. Facebook use was the top predictor of violent acts.

By examining Facebook interactions, internet and Facebook outages, and every hate crime committed against Muslim refugees in a selected region during those periods, the researchers were able to demonstrate a link between hate crimes and social media posts. It shouldn’t be a shocking revelation: hate speech leads to hate crimes. But the researchers came to a startling conclusion: Facebook use in general led to higher rates of hate crimes. Facebook exposes people who wouldn’t otherwise be driven to commit a crime to violent content. When Facebook goes down, hate crimes go down.

Facebook’s Reluctant Efforts

Facebook reactions warped. Thumbs down, broken heart, screaming, shock, sad, a gun emoji, then angry. A scene of nightmares, too real for too many.

Facebook’s hidden reactions

Facebook has been excruciatingly slow to change. People’s lives are on the line; this should be Facebook’s only priority right now. Not cute navigation icons, not suggestions in Messenger, not reactions, but the elimination of hate speech on their platform. Facebook has been an accomplice in violence, and they need to do better.

Facebook banned 18 accounts and one Instagram account, and deleted 52 Facebook Pages that were spreading hate on the platform. It will also increase its number of Burmese-language moderators from 60 to 100 by year’s end. But there are over 18 million Facebook users in Myanmar, with 12 million following the recently banned hate groups. One hundred people moderating that volume of messages for hate speech will not be sufficient.

The Difficulty of Eliminating Hate

Many angry reactions from Facebook’s animated “angry” emoji

In Myanmar, hate is the majority view. In the U.S., nearly half of voters chose Trump despite his hateful rhetoric. In Germany, many support the far-right Alternative für Deutschland party and its attacks on refugees. Hate spreads because it is attached to fear. Fear that someone is coming for your family, your job, or your health can make people search for their “tribe.” They find that tribe in highly active racist ideological bubbles on social networks, never realizing how hateful and radicalized they’ve become inside a hate-filled bubble.

When hate is seen as a legitimate political platform, critics call efforts to stop it censorship. In the U.S., we have a president who put white supremacists on the same level as the people protesting against that hate. Hate speech is not like any other speech. Its purpose is to dehumanize and incite violence.

Volunteers are Few

A man holds up a poster from team Honey Badger. He notes that "It's fulfilling to know we are silent heroes in the everyday life of users."

Project Honey Badger: an operation contracted by Facebook to stop hate speech

Because violent people believe that anti-hate-speech measures are attacks on their free speech, they go after the so-called censors. Facebook hires local contractors to help remove hate speech and fake news from the platform, and these people paint a bleak picture. They’re unable to keep up with the sheer volume of hate, and Facebook pushes them toward removing only the worst content. Despite this, the far right considers them evil censors, and their vitriolic attacks aren’t without teeth. Employees tasked with reducing hateful attacks are doxxed and attacked themselves. Doxxing can have deadly consequences: victims have been “swatted,” harassed as they go about their day, and sent suspicious packages. Being a Facebook fact-checker is neither rewarding nor safe.

Algorithms for Money, not Peace

There’s another problem, and it has to do with how Facebook’s business operates. Facebook relies on an actively engaged user base. Profits are directly tied to how much time you spend on the service: the longer you stay, the more Facebook learns about you for advertisers, and the more ad space it can sell.

The News Feed used to be in chronological order, but Facebook discovered that, by serving you content you’re more likely to engage with, they can keep you on their site longer. To do this, they give you things you’ll like, of course, but also posts that stir up controversy. Fake news and hate speech keep you on the site longer, either because you agree with them or because you want to correct their falsehoods. Facebook is profiting from hate speech and fake news. That’s why they haven’t worked hard to eliminate it. For Facebook, as long as they can’t directly be blamed for violence, the hate speech is acceptable.

Our Part

Report hate speech on Facebook

When I’m done writing this article, I’m going to share it on Twitter, Google+, Tumblr, and… Facebook. I get many of my views from Facebook. People see my site and follow it on Facebook. In 2018, it’s almost impossible to run any kind of business without Facebook.

However, that doesn’t mean we’re powerless. We can report posts by clicking the three-dots button on a post and sending feedback, along with a report. If a friend posted it, you can also leave them a comment, something like, “This is hate speech. I really thought you were better than this.” Don’t chicken out and unfriend or unfollow them. People who feel attacked online don’t retreat; they dig in. So be polite and caring, giving them space to improve.

Force Facebook’s Hand

Facebook thumbs down

We can also demand more of Facebook. Share articles and tell people what Facebook is doing. Perhaps take part in a Facebook boycott day. A day of terrible profits and reduced activity sends a huge message to a company, and boycotts have worked to bring attention to issues on Twitter in the past.

Our politicians must do more. We can demand legislation holding Facebook liable for hate rallies like the “Unite the Right” rally, which led to an alt-right extremist killing Heather Heyer. We can hold Facebook responsible for fueling riots in Myanmar, or for forcing journalists out of their homes. If there were a financial cost to their inaction on hate speech and fake news, Facebook would act. But Facebook will not do anything until hate speech is no longer profitable.

Pop Radicalized Bubbles

Be an influencer that fights hate. If you speak out against hate, you can pop the radicalized bubbles on the internet. When people go online and only see hateful comments and fake news, they believe it’s normal and acceptable. You can shatter that image by speaking out against hate. For too long, we’ve been quiet, but those spreading hate speech and fake news have been louder than ever. It’s our turn to speak up.

