Facebook Admits but Downplays Role in Rohingya Genocide

Rohingya refugees seeking shelter in Bangladesh. Photo: Sergey Ponomarev/New York Times

Government forces and civilians pushed Rohingya Muslims out of their homes in Myanmar. They killed people, burned babies, gang-raped women, and beat children. The UN report on the abhorrent violence ruled it a genocide.

How could people do such terrible things to someone else? How could people treat other human beings with such disdain, such cruelty, such bigotry? They learned it from Facebook.

“The role of social media is significant. Facebook has been a useful instrument for those seeking to spread hate, in a context where for most users Facebook is the Internet. …The extent to which Facebook posts and messages have led to real-world discrimination and violence must be independently and thoroughly examined.”

– From the UN’s Report on the violence in Myanmar, emphasis added

The UN explicitly called out Facebook in its report. The platform allowed fake news, hate speech, and calls to organize violent demonstrations to spread unchecked. We've seen the same issues worldwide, but to a much lesser extent; Facebook has attempted to stem the nationalist hate speech so prevalent in Europe and the U.S., with limited success. In Myanmar, though, Facebook turned a blind eye, for a multitude of reasons. As a result, Facebook had a direct hand in genocide.

Facebook’s hidden reactions

Facebook commissioned and released an independent assessment of their actions. It found that the spread of violence in Myanmar was due in large part to Facebook's inaction, and that Facebook has a lot of work to do to minimize their negative impact. Despite this, Facebook has not promised the sweeping changes necessary to squash hate speech.

Facebook allowed Islamophobia and nationalism to thrive in Myanmar, issues prevalent in the U.S. and Europe right now as well. Despite the resulting violence, Facebook is still unwilling to make the effort necessary to put a stop to this in Myanmar or anywhere else.

The Report Activists Wanted

Satellite footage of Rohingya villages burning.

Activists and even the United Nations demanded accountability from Facebook. They wanted an independent worldwide public audit, as well as promises from Facebook to maintain transparency and enforce the same rules everywhere. In true corporate fashion, Facebook has done none of these things. However, they did meet us halfway on the report: Facebook commissioned one from the independent nonprofit Business for Social Responsibility (BSR). It was not the worldwide assessment we demanded, but it is a detailed look into Facebook's wrongdoings in Myanmar.

Results of the Report

Cover of BSR's report on Facebook in Myanmar.

Myanmar is in a state of upheaval. The country's civilian leadership is at odds with its military, and the military is spreading hate speech, on Facebook, yes, but through a variety of other means as well. Deep racial and religious tensions already existed in the country. Freedom of speech is new there, and real news organizations did not spring up as quickly as citizens gained speech rights, leaving few trusted sources for news.

Simply put, Myanmar was primed for a fake news epidemic. Facebook just happened to be the platform it was carried out on.

The report is careful to point out that this does not absolve Facebook of guilt. Had the company taken some basic precautions, the platform would not have become the haven for hate speech that it did. If Facebook had taken fake news and hate speech seriously, they could have stopped them. Would that have been enough to stop genocide in Myanmar? We'll never know. But the platform could have been a source of real news, helping people shed prejudice and hate. Instead, it fueled hate. That, without a doubt, made a huge difference.

People died because of Facebook’s inaction.

The Report’s Suggestions

The changes BSR requested.

The report makes suggestions for how Facebook can improve. The company should create a standalone human rights policy, one that balances free speech with protection from hate speech. Hate speech silences its targets, who self-censor to avoid violence; this has already happened on Facebook and Twitter. Allowing hate speech therefore works against protecting free speech. Facebook also needs clear definitions of hate speech and terrorism, definitions that should align with international standards (and currently don't).

BSR's report also demands a larger team at Facebook: a cross-functional team based in Myanmar, one with a deeper understanding of the country's cultural and linguistic issues. Facebook doesn't fully understand hate speech in general, let alone the specific forms found in Myanmar. They need to do better research.

BSR also asks for transparency to go along with system-wide changes. Facebook will only grow more popular in the country, and WhatsApp and Messenger could become tools for spreading hate as well. The company needs to prepare for that: make plans for the future, make those plans public, and let outsiders monitor their progress. A public feedback loop both keeps Facebook honest and ensures the changes actually work.

Finally, Facebook needs to work to increase journalistic and technological literacy in the country. In Myanmar, “Facebook is the internet.” Many users don't know about fake news, or that some people spread hate speech for political and financial gain. They trust the news they get from Facebook rather than the journalistic standards of established organizations. Because Facebook has not worked hard enough to explain its platform and rid it of hate speech and fake news, Myanmar's citizens had no clue that hateful members of their own society were manipulating them.

The Worldwide Problem

U.S. Nationalists at the Unite the Right Rally. Photo: Edu Bayer/New York Times

Some of these problems are unique to Myanmar, but they reflect issues found all over the globe. In Germany, researchers have found that Facebook outages coincide with a drop in hate crimes: when Facebook is down, fewer people are attacked. Analysts believe the hate speech and fake news that spread on Facebook before the 2016 election led voters to select Donald Trump for president, and Russia was able to influence that election through fake news and ads purchased on Facebook. All over the world, fear, nationalism, and hatred are spreading, and Facebook is a willing participant in all of it.

To Facebook, these are things that drive usage on the platform. Facebook profits from hate speech. They make money when politicians preach Islamophobia or when your racist aunt posts a derogatory meme. Facebook benefits from homophobia and the dehumanization of transgender people. Why would they want to stop it?

Nationalism, hate speech, Islamophobia, homophobia, transphobia, racism, and xenophobia are becoming core beliefs of political platforms. They’re not just beliefs of small hate groups wearing hoods and burning crosses. They’re becoming the beliefs of your neighbors. Hate is spreading like a cancer through our planet. Facebook, Twitter, and even our mainstream news organizations are not doing enough to combat it.

Thanks to companies like Facebook and Twitter, hate has never spread so quickly. They managed to build a business model that directly benefits from hate speech. Of course they let it spread.

Facebook’s Promised Actions

Facebook promised to hire 100 native Myanmar-language speakers by the end of 2018 to moderate posts. That's just 100 people to cover upwards of 18 million Facebook users in Myanmar, people who get their news almost exclusively from Facebook. When Facebook took down notable hate speech pages, those pages had nearly 12 million followers; roughly two-thirds of Myanmar's Facebook users followed or engaged with hate speech on the platform. 100 people cannot handle that.

Facebook also points to a different number. Of the hate speech posts that were ultimately confirmed and removed, the company's automated algorithms now flag 63% before any human reports them. That's still not very much, but it is a remarkable improvement over the 13% those algorithms caught in late 2017.
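The scale mismatch is easy to see with a little arithmetic on the figures above (a quick sketch; the numbers are the ones cited in this article, the rounding is mine):

```python
# Back-of-the-envelope math using only the figures cited above.
users = 18_000_000            # approximate Facebook users in Myanmar
moderators = 100              # native-speaker moderators promised for 2018
hate_followers = 12_000_000   # followers of the removed hate speech pages

print(f"Users per moderator: {users // moderators:,}")        # 180,000
print(f"Followed hate pages: {hate_followers / users:.0%}")   # ~67%

# "Proactive rate": the share of removed hate speech that algorithms
# flagged before any user reported it.
proactive_2017, proactive_2018 = 0.13, 0.63
print(f"Still surfaced only by user reports: {1 - proactive_2018:.0%}")  # 37%
```

180,000 users per moderator, with more than a third of confirmed hate speech still reaching users before anyone at Facebook sees it.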

Difficulties for Facebook include a dearth of developer literacy in Myanmar's languages; few of Facebook's developers can write software against them. Worse, much Burmese text online is written in Zawgyi, a font-based encoding that predates widespread Unicode support in Myanmar and maps Burmese characters to code points in ways that conflict with the Unicode standard. Software built for standard Unicode Burmese misreads Zawgyi text, so Facebook's algorithms can't parse it. While Facebook previously allowed posts in Zawgyi, they're blocking the encoding going forward, as they simply can't run those posts through their algorithms yet.

It’s easier to list what Facebook should do, but has not promised to do.

Where Facebook Falls Short

Facebook failed to follow the BSR report closely. In doing so, they failed to take the genocide in Myanmar, and their own role in it, seriously.

No Local Office, Few Employees

Facebook will not open a local office in Myanmar. Perhaps that follows from the staffing: with only 100 people hired to look after the country's 18+ million Facebook users, there would be hardly anyone to put in an office. That is not enough, considering 12 million people followed the few hate speech accounts Facebook did remove. Facebook should maintain a local presence in the country for a few reasons. First, it increases the visibility of Facebook's efforts and lends the company legitimacy. Second, it gives Myanmar's citizens employment opportunities. Finally, a physical office matters for social interaction, planning, and cross-pollination of ideas; that's why, even in industries where working from home is prevalent (like software engineering), we still spend considerable time in the office.

Zawgyi and Encoding

Facebook no longer allows new accounts to sign up using the Zawgyi encoding for Burmese. This is a good first step, but it's not everything they should be doing here. Facebook's machine learning models can't interpret Zawgyi-encoded text, so those posts require manual review. That drastically limits Facebook's ability to flag content quickly, before it has a chance to spread. Even by Facebook's own admission, their AI flags objectionable content before a human does only 63% of the time. That's not enough.

Facebook should remove Zawgyi from the platform completely. Unless they build a reliable pipeline to detect Zawgyi text and convert it to standard Unicode, a messy problem since the two encodings overlap in the same code points, Facebook is left with one option: remove Zawgyi. That would likely reduce usage of the platform, and Facebook, unfortunately, is not willing to sacrifice profits to stop hate speech or genocide.
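For what it's worth, open-source tooling for this problem exists: Google's myanmar-tools project ships a Zawgyi detector, and recent ICU releases include a Zawgyi-to-Unicode transliterator. Here is a minimal sketch of such a pipeline, assuming the Python ports of those libraries; the APIs shown (`ZawgyiDetector`, the `'Zawgyi-my'` transform) are my reading of their documentation, not anything Facebook has said it uses:

```python
# Sketch: detect probable Zawgyi text and normalize it to standard Unicode
# before classification. Assumes the myanmartools package (Google's
# myanmar-tools) and PyICU built against ICU >= 63, which added Zawgyi-my.
from myanmartools import ZawgyiDetector
from icu import Transliterator

detector = ZawgyiDetector()
converter = Transliterator.createInstance('Zawgyi-my')

def normalize_burmese(text: str, threshold: float = 0.95) -> str:
    """Convert text to standard-Unicode Burmese if it looks like Zawgyi."""
    score = detector.get_zawgyi_probability(text)  # ~0.0 (Unicode) to ~1.0 (Zawgyi)
    return converter.transliterate(text) if score > threshold else text
```

A sketch like this shows conversion is possible in principle; detection is probabilistic precisely because the encodings share code points, and making it reliable at Facebook's scale is the hard part.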

Human Rights and Terrorism Policies

BSR specifically said that Facebook needs to create their own internal human rights policy and a definition of terrorism that conforms to international standards. Facebook tried to sidestep this, saying they have policies in each country that reflect that country's needs, but that is exactly the stance BSR takes issue with. Without a global policy, Facebook can hold some countries to a lower standard than others. Hate speech is not something that should be acceptable based on geographic location.

WhatsApp

“WhatsApp’s design makes it easy to spread false information. Many messages are shared in groups, and when they are forwarded, there is no indication of their origin. The kidnap warnings have often appeared to come from friends and family.”

– Vindu Goel, Suhasini Raj, and Priyadarshini Ravichandran, New York Times

WhatsApp is Facebook's popular chat app. In India, people have been murdered over fake news that spread through it. Because WhatsApp chats are private and end-to-end encrypted, they're difficult, though not impossible, to monitor. People have already died from Facebook's lack of action, and if Facebook begins cracking down on hate speech on Facebook proper in Myanmar, that speech could migrate to Messenger or WhatsApp. Local authorities struggle to stop rumors and false news from spreading on the platform; they ask citizens to be mindful of what they read online, but their efforts are in vain.

Despite these communities and BSR asking Facebook to do something about WhatsApp, Facebook has promised no further action. WhatsApp labels forwarded messages, and in India Facebook took out advertisements warning users to think about what they read online, but the company could do more. They could check messages before they're sent; they could decline to re-encrypt forwarded messages and read the contents if a user tries to forward them again; they could put warnings in the app itself. And they should extend the efforts made in India to Myanmar, where hate speech spreading over the chat app will soon be common.
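None of these interventions would be exotic engineering. As a purely illustrative sketch (every name and threshold below is hypothetical, invented for this article, not WhatsApp's actual code), a client could carry a forward count as message metadata and escalate friction as a message spreads:

```python
# Hypothetical client-side forward tracking; names and thresholds are
# invented for illustration, not WhatsApp's real implementation.
from dataclasses import dataclass

FORWARD_LIMIT = 5  # max chats one message can be forwarded to at once

@dataclass
class Message:
    body: str               # the body itself can stay end-to-end encrypted
    forward_count: int = 0  # carried as plaintext metadata alongside it

def show_warning(text: str) -> None:  # stand-in for a client UI popup
    print(f"[warning] {text}")

def show_label(text: str) -> None:    # stand-in for a chat-bubble label
    print(f"[label] {text}")

def forward(msg: Message, chats: list[str]) -> Message:
    """Forward a message, enforcing a fan-out cap and labeling its spread."""
    if len(chats) > FORWARD_LIMIT:
        raise ValueError(f"Forwarding limited to {FORWARD_LIMIT} chats")
    out = Message(msg.body, msg.forward_count + 1)
    if out.forward_count >= 5:
        show_warning("Forwarded many times. Verify before you share.")
    else:
        show_label("Forwarded")
    return out
```

Because the counter travels as metadata, none of this requires reading message contents; it only requires caring enough to slow viral forwarding down.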

No Transparency Plans

Facebook has made no promise to remain transparent. The company will not commit to progress reports at regular intervals. Unless something big happens again, this is likely the last we'll hear from Facebook on the issue.

No Community Outreach or Fact Checking

Fake news outperformed real news on Facebook

Facebook is one of the largest advertising platforms in the world. Still, the company has not committed any ad space toward reminding users of the dangers of fake news. Ads would be an easy tool for educating users in Myanmar about digital literacy, online hoaxes, and hate speech designed to dehumanize. Yet Facebook, again, has done nothing, and won't commit to change.

In the United States, fake news contributed significantly to the election of Donald Trump. Pro-Trump fake news spread by his biggest fans and by Russian bots (often one and the same) outperformed real news about the election. Facebook's news feed algorithms were tuned to help popular posts go viral; nobody thought to block hate speech, fake news, or foreign influence until it was too late.

Popular fake and legitimate news that spread on Facebook

In the wake of the election, Facebook partnered with the Poynter Institute, home of the International Fact-Checking Network, to fight fake news on the platform. Part of that effort is verifying sources of news on Facebook, so that the news feed can favor verified, trusted organizations and label stories that are potentially fake. Myanmar, however, has no Poynter-approved journalism sites.

Facebook's own Facebook Journalism Project could help find and elevate legitimate sources of news. In fact, with Facebook's budget and reach in Myanmar, the company could easily become a source of verified news itself, publishing vetted stories, or sharing them from across the web, on an official page within Facebook. Facebook is uniquely positioned to fight fake news in Myanmar directly. Instead, they choose not to take even this simple step.
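Mechanically, favoring verified sources over debunked ones is ordinary ranking work. A hedged sketch of the idea follows; the names and weights are illustrative assumptions for this article, not Facebook's actual News Feed code:

```python
# Illustrative only: scale a post's engagement score by a fact-check verdict.
# Names and weights are assumptions, not Facebook's actual ranking code.
VERDICT_WEIGHT = {
    "false": 0.2,      # cut a debunked story's reach to 20%
    "disputed": 0.5,
    "unrated": 1.0,    # no certified fact-checker has reviewed it
    "verified": 1.2,   # modest boost for vetted publishers
}

def ranking_score(engagement: float, verdict: str) -> float:
    """Down- or up-rank a post based on its fact-check verdict."""
    return engagement * VERDICT_WEIGHT.get(verdict, 1.0)

# A viral hoax now ranks below a verified story with less raw engagement:
print(ranking_score(1000.0, "false"))    # 200.0
print(ranking_score(400.0, "verified"))  # 480.0
```

The hard part is not the math; it's building the network of fact-checkers who supply the verdicts, which is exactly what Myanmar lacks.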

Moving the Goalposts

Finally, a criticism of Facebook's attitude and messaging. No one has painted Facebook as the sole reason Myanmar's Buddhists targeted the Rohingya; violence and prejudice against the Rohingya have been prevalent in Myanmar for generations. You wouldn't know that from Facebook's blog post, which seems intent on making sure no one blames Facebook alone for the violence. But no one is placing the blame entirely on Facebook. Facebook contributed to genocide; it did not cause it. Don't let Facebook move the goalposts: "contributed to a genocide without directly causing it" is not an achievement.

Facebook Hasn’t Learned

The UN and now an independent organization have joined activists in calling out Facebook for its inaction, yet Facebook continues to do little to prevent the spread of hate speech. In Myanmar, that inaction led to genocide. In Germany, researchers have linked Facebook use to acts of violence. In the U.S., the platform may have fueled xenophobic attacks and nationalism and helped elect Donald Trump. As long as fake news and hate speech are profitable, Facebook will not work hard to eliminate the threat. The Rohingya were driven from their homes, raped, and murdered. This was ethnic cleansing, and while it may have happened without Facebook, Facebook helped ensure it did. Rather than do everything they can to prevent the next atrocity, Facebook is doing the bare minimum to appease activists and improve public relations before moving on.

It's social networking 101. Twitter has done the same with its hate speech and harassment problems: wait for the outrage to die down, and nothing has to change.

