Former Facebook Employee: “I Have Blood on My Hands”


I turned down a recruiter from Facebook once. I told my friends that, as much as I needed a job, I wasn’t going to sell my soul for six figures and a fantastic cafeteria. Others may not think Facebook is as bad as they’ve heard. On the surface, when you visit the offices, the place is actually quite pleasant. Employees are kind, the food is amazing, and the offices are immaculate. The recruiters are understanding and honest. Everything I saw made Facebook seem like a fantastic place to work. But, according to leakers and former Facebook employees, that surface hides what working there is actually like, and it hides the effect Facebook has on the world.

A former Facebook employee fessed up in an internal memo. She lost her job for trying to protect democracy, freedom, and journalism around the world. Facebook fired her for asking for more resources to control the growing number of bots and troll farms using the platform to influence elections and stir civil unrest in countries like the United States.

“I have blood on my hands.”

False information and troll farms on Facebook are far worse than we realize, and Facebook is both ignoring the issue, and making sure people and governments never know how large the problem really is.

Author’s Note

I’ve redacted the full name of the Facebook data scientist who wrote this memo. BuzzFeed obtained it and shared it without her permission, and by not naming their actual source, they falsely led readers to believe she was it. BuzzFeed actually obtained the memo through someone else at Facebook.

Though many would agree with her actions, I do not want to potentially damage her career in the future. Whistleblowers shouldn’t face hardship. Her friend claims BuzzFeed posted this without consulting her, so I’ll leave her name out of it. However, her name appears in my sources below, as I can’t control them, and I still want my readers to know where the information in my posts comes from. Be responsible with this information when sharing or discussing this story.

Who Wrote This Memo?

“In the three years I’ve spent at Facebook, I’ve found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions.”

– Memo Author, a former Facebook data scientist who worked on Facebook’s Site Integrity fake engagement team

She was a data scientist on Facebook’s Site Integrity fake engagement team, which means she analyzed large amounts of collected data to distinguish real users from fake ones. It might seem impossible to tag fake accounts on Facebook, but for many bots, it’s not. Because Facebook tracks users across the web and even in real life, it can often tell the difference between real users and fake users.

Not all fake users are so easy to spot. Some may be real people who don’t even realize they’re a tool of a government, spreading misinformation for political reasons. In other situations, hackers use an existing account to share misinformation. It’s not an easy job, but Facebook has the data; they just need to figure out how to use it.

That’s where the woman who wrote this memo comes in.

What’s In Her Memo?

“I have blood on my hands…”

The former data scientist sacrificed a sizable severance package ($65,000) in order to discuss these problems with her coworkers, so her memo is worth a read. If you’re not aware, many companies force employees to sign non-disclosure or non-disparagement agreements when they fire or lay them off. These ensure former employees won’t speak poorly of the company, but they also often mean a person cannot mention anything bad the company has done. It’s a way to silence an industry so no one can report on its racism and sexism, and a way to keep former employees from airing the company’s dirty laundry.

In her time at Facebook, the memo’s author was everything from a “part-time dictator” of some countries to the only person holding back a disinformation campaign threatening the freedom of another. She held too much power, with no oversight, and a workload so intense, she actually offered to continue doing it after Facebook fired her.

Facebook fired her because she asked for more human resources. She was analyzing Facebook interactions in her spare time, outside of work, because she had so much to do and the work was so important. Facebook has become a tool for despots all over the world, and without careful moderation, the problem spirals.

Now, one of Facebook’s most dedicated guardians against “Coordinated Inauthentic Behavior” (bot traffic) has left the company, and she did so because Facebook refused to take her advice for improvement. Facebook can only get worse from here.

Millions of Accounts and Interactions Removed Without Notifying the Public of Their Impact

If Facebook found something harmful on their platform and removed it, you’d want to know about it, right? Maybe not if it’s one or two individuals posting hateful content, but surely you’d want to know about bots or troll farms with over 10.5 million interactions on Facebook. Knowing about them could help you avoid such fake sources of news and hold Facebook accountable for removing them. They’d have to make a habit of cleaning up such nonsense, rather than ignoring it when convenient. So, given the importance of knowing about large-scale attacks on democracy, you’d expect Facebook to tell you about them, right?

Well, they don’t.

“We ended up removing 10.5 million fake reactions and fans from high-profile politicians in Brazil and the U.S. in the 2018 elections – major politicians of all persuasions in Brazil, and a number of lower-level politicians in the United States.”

– Memo Author, Former Data Scientist at Facebook

Facebook hides their dirt so they don’t have to build a system for cleaning it up regularly. So many incidents in so short a time prove that Facebook has a much larger problem with troll accounts influencing elections than they’ve led governments and users to believe. Worst of all, they fire people for pointing out that Facebook isn’t doing enough to prevent the problem.

Facebook knows it’s harming democracy around the world; it just doesn’t care enough to do anything about it.

Only Cleaning Up Serious Issues if Press or Lawmakers Catch On

Facebook CEO Mark Zuckerberg arrives to testify before a joint hearing of the US Senate Commerce, Science and Transportation Committee and Senate Judiciary Committee on Capitol Hill, April 10, 2018, in Washington, DC.
Photo: JIM WATSON/AFP/Getty Images

“It’s an open secret within the civic integrity space that Facebook’s short-term decisions are largely motivated by PR and the potential for negative attention… It’s why I’ve seen priorities of escalations shoot up when others start threatening to go to the press, and why I was informed by a leader in my organization that my civic work was not impactful under the rationale that if the problems were meaningful they would have attracted attention, became a press fire, and convinced the company to devote more attention to the space.

Overall, the focus of my organization – and most of Facebook – was on large-scale problems, an approach which fixated us on spam. The civic aspect was discounted because of its small volume, its disproportionate impact ignored.”

– Memo Author, Former Data Scientist at Facebook (emphasis added)

Facebook kept its employees focused mostly on spam, leaving little room for work to prevent civic upheaval. Basically put, unless a member of the press or a politician found proof of a bot network, Facebook wouldn’t prioritize it.

“… It became impossible to read the news and monitor world events without feeling the weight of my own responsibility.”

– Memo Author, Former Data Scientist at Facebook

This is where things get particularly nasty. Facebook didn’t give this data scientist time to fix the global issues she was seeing, and by ignoring attacks on democracy and truth, it was directly contributing to the campaigns of despots around the world. So she took it upon herself. This lone data scientist, working in her own time, was often the only person who could protect a country. And when she couldn’t dedicate the time to a problem, that country was left with no protection at all.

“Most of the world outside the West was effectively the Wild West with myself as the part-time dictator.”

– Memo Author, Former Data Scientist at Facebook

Taking Down the Wrong People

Facebook wasn’t just slow to react when they had an international crisis on their hands, one of their own making; they also stood in the way of positive change. Real people, journalists and activists, became targets for harassment. Troll campaigns would report a person for spam, and if enough accounts did it, Facebook would automatically ban the activist or journalist. In the memo, Facebook’s former data scientist stated that it could take weeks to restore someone’s access, all while bots continued to spread fake news on the platform.

Fired for Pushing for More Resources


“I have personally made decisions that affected national presidents without oversight, and taken action to enforce against so many prominent politicians globally that I’ve lost count. … With no oversight whatsoever, I was left in a situation where I was trusted with immense influence in my spare time. A manager on Strategic Response mused to myself that most of the world outside the West was effectively the Wild West with myself as the part-time dictator – he meant the statement as a compliment, but it illustrated the immense pressures upon me.”

– Memo Author, Former Data Scientist at Facebook

The memo author had little to no oversight and worked on civic issues during her own time, as Facebook didn’t care enough about them to let her work on them during business hours. This meant one person sometimes held the fate of entire nations in her hands; misinformation campaigns could change elections or pandemic responses. She carried enormous responsibility, responsibility she didn’t want. In fact, Facebook fired her for asking for more coworkers and managers. They fired her for asking for more resources to tackle an issue so big that she was working on it in her spare time and still couldn’t keep Facebook safe.

The problem is, Facebook just didn’t care.

“Local policy teams confirmed that President JOH’s marketing team had openly admitted to organizing the activity on his behalf. Yet despite the blatantly violating nature of this activity, it took me almost a year to take down his operation.”

– Memo Author, Former Data Scientist at Facebook

After taking down a particularly large bot operation, Facebook bragged about it on their blog. However, the operation restarted a short time later, and Facebook never informed anyone. Employees would tackle the problem when they could, but Facebook never dedicated resources to it, so operations would often stay up for months at a time. The memo author commented on this, stating, “Perhaps they thought they were clever; the truth was, we simply didn’t care enough to stop them.”

Preventing bot traffic and election interference was never Facebook’s goal. Appearing to stop this behavior is the only reason they do anything about civic bot traffic at all.

Guilt

Being the only person in the room standing up for democracy worldwide is a challenge. With so many people attacking truth and democracy around the world, it’s like staring down an army by yourself. That’s exactly what she had to do, and she did it. But time is limited; there are only so many issues you can confront, especially when you’re using your spare time. She had to leave some international problems on the back burner.

Those issues included disinformation in Ecuador supporting the ruling government, which she decided not to prioritize. Later, the COVID-19 pandemic would ravage the country, harm she could have helped limit had she taken down the troll farms. The same thing happened in Bolivia, where a candidate used a troll army to bolster their position. Later, the president would resign, mass protests broke out, and multiple people died. All because of disinformation spread on Facebook.

Disinformation she could have stopped, had Facebook given her the resources and time to do so.

Though the fault lies with Facebook, the memo author expressed a great amount of guilt when discussing these issues. Realistically, though, there was little she could have done. This was Facebook’s problem, and despite her warnings, the company refused to take appropriate action.

Toxic Work Culture


Mark Zuckerberg, Dan Rose, and Sheryl Sandberg of Facebook
Photo: Drew Angerer/Getty

“In the office, I realized that my viewpoints weren’t respected unless I acted like an arrogant asshole.”

– Memo Author, Former Data Scientist at Facebook

Facebook’s disgruntled data scientist painted a nasty picture of Facebook’s working environment. Unable to get the resources her department needed, she was forced into some regrettable prioritization. She continued to fight for those resources, but found progress came only when she mirrored the worst in the people around her. Abrasiveness and arrogance became the native languages at Facebook, and the only things that got anything done.

Even after losing her job, she offered to stay on as an unpaid volunteer, at least through the U.S. election. Disinformation campaigns from both within the U.S. and from Russia helped get President Trump elected, and Facebook is on the path to letting it happen again. Still, the memo author urged her fellow employees not to quit trying to make Facebook better. They can’t take down the company, but they may be able to reduce the damage it does.

“But you don’t – and shouldn’t – need to do it alone. Find others who share your convictions and values to work on it together. Facebook is too big of a project for any one person to fix.”

– Memo Author, Former Data Scientist at Facebook

No Clean Hands

Reading through the excerpts of her memo, I see someone conflicted: a person with a strong sense of duty and a sound moral compass. She feels guilty about the work she couldn’t get done, the prioritization she had to do, and the lives that prioritization may have cost. She sounds like someone trapped in an abusive relationship, and I know exactly how she feels. I’ve offered to help with or document work even after being laid off. It’s easy to blur those lines and offer to do a little work for free when your everyday job so frequently requires unpaid overtime. When an abusive working relationship becomes normal, it’s hard to tell when it ends.

The person who wrote this memo believes she has blood on her hands, but I’m not so sure she does. To ask one person to hold back an army is to ask too much. She had an understaffed team tasked with an impossible job and performed it admirably. Above and beyond, in fact.

There’s blood on someone’s hands, but it’s not the person who recognized the problem and asked Facebook to solve it. That blood is on the hands of the Facebook executives and management who allow bots and trolls to dismantle democracy and bolster hate and ignorance around the world.

But, perhaps even those Facebook employees can’t take all the blame. By continuing to use Facebook, we enable this as well. We give them a reason to cut costs on safety because it doesn’t affect the bottom line. Facebook’s users have created a system where Facebook can do great acts of evil without repercussions.

I have blood on my hands. You probably do too.


Sources: