AP News Report: Despite Genocide, Facebook Still a Source of Hate in Myanmar


Facebook on an iPhone with a dark background

In 2017, hate that had been spreading in Myanmar, largely over Facebook (parent company: Meta), spilled over into violence. It led to private and military action against Myanmar’s Muslim population, the Rohingya. During the genocide, at least 700,000 Rohingya people were either driven from their homes or murdered. One harrowing report told of babies ripped from mothers’ arms and thrown into fires. Mobs raped women and girls while killing boys and men. Many fled, but many others never made it out of the country.

A United Nations report pointed much of the blame at hate speech that spread on Facebook. Facebook pledged to do better. Since then, however, they haven’t added enough native Burmese speakers for moderation, and those insufficient efforts are no match for the rifts that have grown on Facebook since the military coup in the nation this year. More violence is likely on the horizon.

An AP News report looked into leaked internal Facebook documents and found that even Facebook employees admit their efforts to stop hate are nowhere near enough. So how does a multi-billion dollar company continue to struggle to fix the problems of hate speech and spreading violence on their platform?

Just as in nations like the United States, Facebook has to balance profitable hate against safety, and they seem to always pick hate over protecting vulnerable groups.

Facebook in Myanmar

Facebook has always occupied an unusual place in Myanmar. The country, long under authoritarian military rule, only gained access to the internet in 2000. Facebook (now Meta), which started expanding globally before the end of that decade, came to Myanmar with an enticing proposition. The company made deals with carriers so people in Myanmar could access Facebook without having to pay for data. Data was extremely expensive at the time, so this was a game changer. But it also shaped social networking culture in Myanmar. Anything on Facebook was free. Step outside of those bounds, and you’d pay for it, literally.

This shaped how people used the platform and made them more susceptible to disinformation spread on it. If stepping off Facebook’s gilded path means paying up, a person looking to save money is less likely to fact-check anything. If someone sends them a photo of something terrible with some text to go with it, they can’t easily search for the story and find legitimate news sources that might contradict it. In this way, people became trapped on the platform. Most news spread directly on Facebook, rather than through links to external sites. Facebook set out to be the internet in Myanmar. They succeeded.

In Myanmar, Facebook and the internet are largely synonymous. That means all of Facebook’s most common issues, from misinformation and fake news to hate speech, became a core part of Myanmar’s online culture. That culture, obviously, spills out into the real world. Facebook helped terrorists plan a coup attempt in the United States. In Myanmar, they pushed people and a military towards genocide. Hate that started building up in 2013 came to a head in 2017, when the genocide of the Rohingya began.

The Rohingya Genocide and Facebook’s Promises

In 2017, a coordinated attack by civilians and the military left 700,000 or more Rohingya displaced or killed. A U.N. report pointed the blame at hate speech that spread like wildfire on Facebook. Hate and misinformation outnumbered real information, much as they did in the U.S. prior to the 2016 election and again in 2020. With the world’s blame on Facebook, the company promised to take action.

It was clear what Facebook’s problem was: investment. They had far too few first-language moderators, and their automated hate speech detection tools didn’t even know the slang and slurs used for the Rohingya and other Muslims. It was as though Facebook was doing nothing to fix the problem until it turned into genocide. Facebook promised to hire more native speakers and improve their systems. However, over the next three years, employees would repeatedly raise concerns: Facebook wasn’t doing enough. Now the nation is once again led by the military after a successful coup, and support for its violent authoritarianism spreads through disinformation on Facebook like never before.

Facebook’s Ongoing Myanmar Problem

Facebook logo with red glow on dark background

You’d think being a leading cause of a genocide would be enough to convince Facebook they need to make drastic changes. However, according to employees at Facebook, the company didn’t do enough. While they made some changes, most were unmaintained, unused, or ineffective. Employees made formal complaints about Facebook’s inactive tools. In May of 2020, one employee pointed out that their hate speech text classifier wasn’t being used or maintained. Just a month later, an employee reported finding “significant gaps” in their misinformation detection, specifically for Myanmar. Facebook’s efforts were seemingly for PR purposes only. According to Facebook whistleblower Frances Haugen, Facebook frequently acted “only once a crisis has begun.”

Copious Hate and Calls to Violence

This year, the violent, nationalistic, and authoritarian military in Myanmar once again took control of the country. On Facebook, you can see both remnants of the hate that got them to this point and disinformation campaigns from the military and their supporters. There, anyone opposed to military rule is framed as an extremist, with fake news stories and images to back up the false narrative.

For example, a 2018 video of the Sinaloa cartel butchering bodies spread on Facebook. However, the description claimed the violence was actually opposition forces killing Myanmar soldiers this year. Another video, with over 56,000 views, called for the death of protesters, saying of the opposition, “So starting from now, we are the god of death for all (of them). Come tomorrow and let’s see if you are real men or gays.” The video called for violence and featured homophobic hate speech, yet racked up tens of thousands of views before Facebook took action.

Another account doxxed a military defector and his wife, posting their home address along with a photo of soldiers bound and blindfolded, saying, “Don’t catch them alive.”

The Myanmar Social Media Insights Project found coordinated efforts by the military and their supporters (or the military posing as supporters) to misinform the public. They specifically attempt to drive hate against activists, journalists, and, of course, ethnic minorities. The same hate that led to genocide in the country also drives the military’s populism. Despite knowing the military is doing this, Facebook hasn’t outright banned accounts that participate in these disinformation campaigns. As a result, the disinformation continues to spread.

Facebook’s Measures for Myanmar

Facebook seems to take offense at criticism like this, but when it’s coming from both watchdog groups and their own employees, it’s clear the problem isn’t the criticism, but the (lack of) actions taken. Facebook has taken action since 2018; it just hasn’t been enough to stop the problem. Perhaps that’s because, even according to their own employees as recently as 2020, Facebook’s not actually using or maintaining many of these tools.

“Facebook’s approach in Myanmar today is fundamentally different from what it was in 2017, and allegations that we have not invested in safety and security in the country are wrong.”

– Rafael Frankel, Facebook

No one’s saying Facebook has done nothing, only that they haven’t done enough. But what has Facebook done? They created a list of “at-risk countries” for the “critical countries team.” This allowed them to identify nations with problems that Facebook had a hand in and could work towards fixing, like rampant hate speech. Facebook gave Myanmar a “tier one” rating.

Facebook made a list of countries that didn’t have appropriate language support, including Burmese in Myanmar, as well as Ethiopian languages, Bengali, Arabic, and Urdu. Along with that, and in response to the genocide, they finally added hateful slang and slurs targeting the Rohingya and other Muslims to their detection systems. This should have been part of the program to begin with, especially since hate started ramping up on Facebook years before the genocide. Facebook, as Haugen pointed out, only reacts after a crisis, never to prevent one.

Facebook improved their automated systems in other ways as well. They taught the system to differentiate between multiple accounts created by the same person to spread hate, and to track repeat offenders. They then used this to “demote” content from those accounts, so it’s not seen as often. However, in a nation where people follow Facebook pages rather than RSS feeds, this doesn’t cut off the flow of misinformation as much as it does in nations where demotion works better. They’ve also demoted reshared content: by tracking “reshare depth,” they can slow items that spread virally despite being untrue.
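Facebook hasn’t published how this demotion scoring actually works, so the sketch below is only a rough illustration of the idea in Python. Every name and number in it (the Post fields, the reshare-depth threshold, the demotion factors) is an assumption made up for illustration, not Facebook’s real system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    reshare_depth: int              # how many reshare "hops" from the original post
    author_is_repeat_offender: bool # flagged by prior hate speech violations
    base_rank_score: float          # score from the normal ranking system

# Illustrative thresholds; the real values and logic are not public.
MAX_RESHARE_DEPTH = 2
DEEP_RESHARE_DEMOTION = 0.5
REPEAT_OFFENDER_DEMOTION = 0.3

def demoted_score(post: Post) -> float:
    """Return a ranking score with demotions applied.

    Content reshared far from its original source, or posted by accounts
    flagged as repeat offenders, is shown less often rather than removed.
    """
    score = post.base_rank_score
    if post.reshare_depth > MAX_RESHARE_DEPTH:
        score *= DEEP_RESHARE_DEMOTION
    if post.author_is_repeat_offender:
        score *= REPEAT_OFFENDER_DEMOTION
    return score

# Example: a fourth-hand reshare from a repeat offender drops from 1.0 to 0.15.
print(demoted_score(Post("p1", reshare_depth=4, author_is_repeat_offender=True, base_rank_score=1.0)))
```

The point of an approach like this is that nothing gets taken down; demoted posts simply appear lower and less often in feeds, which is why it does little in a country where people actively follow the pages spreading the content.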

Facebook did take tools they made for Myanmar and apply them elsewhere. According to a Facebook spokesperson, Ethiopia is seeing more success with their demotion of hate spreaders than Myanmar.

Along with automated efforts, Facebook also said they’d bring their employment of native Burmese speakers up to 100. Staffing for the Burmese language, according to Facebook as well as whistleblower Frances Haugen, is still a leading problem in Myanmar.

The Language Problem

Facebook seems to have a language problem in many nations currently in turmoil. In interviews, whistleblower Frances Haugen raised concerns about Ethiopia, where she believes Facebook doesn’t have the language capabilities to moderate its platform as the nation faces a civil war. Facebook has the same problem in Myanmar, where they employ somewhere over 100 native-language speakers to moderate the platform.

In the United States, hate speech and disinformation helped drag out a pandemic and helped right-wing terrorists plan a coup. However, Facebook’s moderation is far worse in nations where Facebook doesn’t understand the local language or customs. Facebook tries to throw automated solutions at everything, but these are usually insufficient. These automated systems need native speakers to set them up, monitor them, and keep them updated.

A person unfamiliar with the political landscape simply can’t keep up with slang and memes associated with movements. For example, someone without knowledge of the political climate in the U.S. wouldn’t know to connect “Let’s go Brandon” with vaccine misinformation. That’s why Facebook needs native-language speakers not only to moderate content, but to train language models, teach them slang, and help them find connections that computers miss. In many nations where Facebook is prevalent, they simply haven’t allocated the resources to properly moderate their platform.

A Refusal to Invest in Improvement

It’s true that a lack of native-language speakers is a serious problem for Myanmar. The Burmese language is dissimilar to most other languages, meaning a native speaker is best equipped to detect its nuances and slang. Fortunately, there are 33 million native Burmese speakers, and at least 10 million others who report speaking the language. Surely, then, Facebook has thrown thousands of native speakers at the problem? They’d be able to train Facebook’s automated systems better, and would themselves be able to flag more hate speech.

They hired 100.

In 2018, Facebook promised to hire 100 native Burmese speakers to cover the 18 million users it had in Myanmar at the time. When removing pages spreading hate, Facebook found that 66% of their users in Myanmar followed those hateful pages. With a majority of their users following spreaders of hate speech, surely Facebook saw fit to hire many more than 100 people over the next three years, right? As their user base grew, surely they added hundreds, if not thousands, of native-speaking employees, right?

“[Facebook] has built a dedicated team of over 100 Burmese speakers.”

– Rafael Frankel, Facebook director of policy for APAC Emerging Countries, to AP News, 2021

Wrong.

Somewhere over 100 (but likely significantly fewer than 200). That was pathetic in 2018, when Facebook announced the program. Now, in 2021, there may be around 28.7 million Facebook users in Myanmar. Even accepting Facebook’s flawed logic that 100 people would be enough to support 18 million users, how could they stand by that number after adding 10 million more? Especially considering that at least one employee has reported that Facebook isn’t using or maintaining their hate speech detection in Myanmar.

They’re not automating, and they’re not hiring enough people. So what’s the problem?

Facebook is a multi-billion dollar company. Surely they could spend a tiny fraction of that to stop genocide, right?

Well, it’s just not enough of a priority, apparently. Either Facebook is okay with people dying due to their inaction, or they’re just in denial about their role in violence. Facebook has been at the center of a resurgence in authoritarianism, hate, and political distress all over the world; how could they still deny that?

Inadequate Efforts for a Dire Situation

“Can you imagine having your entire adult identity wrapped up in something? Like, this is the only thing you’ve done? Can you imagine how hard it would be to admit that it has hurt people?”

– Frances Haugen, on her empathy for Mark Zuckerberg


Facebook whistleblower Frances Haugen may have nailed the core of the issue while trying to show empathy for Zuckerberg’s position. Zuckerberg has one creation: Facebook. It’s possible that a sort of denial keeps him from identifying and going after its problems. That could be a worse issue than the cost associated with fixing hate on the platform. To spend a great deal to fix your platform of hatred, bigotry, and violence, you have to admit it has become nothing more than a vile place of toxicity. Perhaps Zuck’s just not ready to do that. Once again, the world suffers to protect another rich man’s ego.

Gif of the logos for Facebook, Instagram, WhatsApp, Messenger, and Oculus rolling into the Meta logo

Facebook is desperately trying to rebrand with Meta and a product nowhere near completed… or even started.

Facebook has more than enough money to hire local moderators. They could work to improve fact-checking on the site, knowing that most people won’t do it themselves. The company could stop the violence it helps spread. Facebook is a multi-billion dollar company. They have the assets to fix these problems. But is it greed or Mark Zuckerberg’s refusal to admit there’s a problem with his brainchild that keeps Facebook at the forefront of global violence?

