
Facebook’s Deception Defines the Social Network

Facebook CEO Mark Zuckerberg arrives to testify before a joint hearing of the US Senate Commerce, Science and Transportation Committee and Senate Judiciary Committee on Capitol Hill, April 10, 2018 in Washington, DC.

Photo: JIM WATSON/AFP/Getty Images

The New York Times recently published an article titled “Delay, Deny and Deflect: How Facebook’s Leaders Fought Through Crisis.” It is damning. Facebook knew Russian agents had used the service to influence the 2016 election. Its leaders knew foreign actors had undermined our privacy and democracy long before they ever announced it, and they kept it out of their reports because disclosure wouldn’t be profitable.

They covered this up through some truly dastardly means. Not the least of which was hiring a far-right opposition research group—typically used in politics to get dirt on political rivals—to drive negative news about Facebook’s competition. A sea of denial and deception followed. Facebook behaved as though they, as a company, not only had business rivals, but were in competition with the politicians elected by U.S. citizens.

Facebook waded into dirty politics and now it’s drowning in its murky depths.

Cambridge Analytica

We should start with some background on a previous Facebook scandal. In 2014, Aleksandr Kogan, a Russian-American psychology researcher at Cambridge University, released a personality quiz app on Facebook. Personality quizzes were extremely popular on social networks at the time, and users had to sign in with Facebook to take this one. The quiz was incredibly detailed, but users felt safe: the person asking was a Cambridge University academic, and he said the data would be used strictly for his work at Cambridge University’s Psychometrics Center.

Steve Bannon, one of Cambridge Analytica’s founders and former Trump advisor. Photo: AP

He wasn’t honest about the data he collected or his intentions. Kogan gave the data to Cambridge Analytica, a conservative political-data firm backed by Robert Mercer and Stephen K. Bannon. Donald Trump’s campaign used Cambridge Analytica’s services extensively, and Trump would eventually bring Bannon onto his staff.

Trump’s campaign managers used the data to compile voter profiles on nearly every eligible voter. They knew how you’d likely vote. Using this, they were able to map out locations for rallies and even talking points to bring up at those rallies. As a result, Trump’s fanbase was energized, and some moderates heard exactly what they wanted to hear. None of those voters had any idea their information had been scraped without permission to shape those talking points.

Cambridge Analytica was also tied to its parent company, SCL Group, which helped the Brexit campaign in a similar fashion. Thanks to that treasure trove of personal data, deceptively (though not technically illegally) obtained, another election upset happened across the pond—Brexit. The U.K. voted to leave the European Union largely over nationalistic ideals and racism regarding Muslim immigrants and refugees.

Facebook Knew Then, Too

Facebook knew of the leak before The New York Times, The Observer of London, and The Guardian went public with their findings. In 2015, Facebook changed its rules in response to the not-yet-public intrusion: app developers could no longer access the data of your friends when you used their app. However, for roughly 250 million people, the overwhelming majority of American adults, it was already too late. Our data had been used to help undermine both American and British democracy.

Allowing Racism

Photo: AP Photo/J. Scott Applewhite

Jewish families around the world have a different connection to the past than other families. While the Holocaust may be a lesson in history books—a damning view into the nature of human cruelty—for most of us it’s just that, a lesson. For Jewish families, it’s something different. It’s an experience etched into their personal history. It’s not just a harrowing picture in a textbook; it’s a story your grandfather tells you. About how hate started small, and grew. About how they were “tolerated,” disliked, hated, and eventually hunted. It’s a tattoo on your grandmother’s forearm, a tear in her eye, and a tragedy in her mind. For families who exist today only because an ancestor survived or fled that horror, the hate of the Holocaust is personal.

Seeing that kind of hate in the modern day is gut-wrenching and horrifying.

The Muslim Ban

Trump announced his Muslim ban on Facebook. It was a clear violation of Facebook’s rules against hate speech.

In December of 2015, Donald Trump sat down in front of a computer or picked up his phone and shared a link to his website on Facebook calling for a “total and complete shutdown” of Muslims entering the United States. It was a racist and Islamophobic statement. It was a candidate for president of the United States calling for religion-based discrimination. No hesitation, no guise, no veiled wording, no dog whistles. It was clear hate speech. Trump vilified the people of an entire religion.

15,000 people shared Trump’s anti-Muslim statements on Facebook.

Mark Zuckerberg is of Jewish descent, and has a personal connection to religion-based hate speech like this. He has worked hard on immigration reform. Zuckerberg was hurt that his platform was used to spread hate. He spoke with several employees, asking if there was anything he could do. He went to his Chief Operating Officer, Sheryl Sandberg, author of the feminist book Lean In, and asked her whether Trump had violated Facebook’s terms of service. She, herself a Jewish woman, wouldn’t weigh in directly. Instead, she turned the decision over to others.

Joel Kaplan, a Harvard alum who went to school with Sandberg, came to Facebook in 2011. He was a Republican and had served in George W. Bush’s administration. Top brass at Facebook discussed the issue at great length. Many at Facebook were disgusted by the idea of anyone using their platform to spread hate. But when it came down to the final decision, Kaplan’s warning not to “poke the bear” weighed heavier in their minds than any moral compass.

Removing Trump’s post would cost Facebook the hateful users who agreed with his message. According to Kaplan, leaving it up would help Facebook retain conservative and far-right users. Facebook, eager to grow as large as possible, left Trump’s message of hate on the platform.

In the end, Facebook had to decide between its soul and its growth. Facebook grew, but not without paying a terrible price.

Facebook’s only interested in selling your information, not in you.

Russian Influence

Many of us were shocked by the election results in 2016. Donald Trump, a man with a history of racist comments, a platform that was explicitly racist, Islamophobic, homophobic, and transphobic, and a habit of bragging about sexually assaulting women, was the new president. No one expected this. Except, perhaps, those at Facebook who had learned, months before the election, of Russian efforts to sway it with fake and misleading news to help Donald Trump win the presidency.

From a joint report on Russian influence over the 2016 election
If you weren’t aware, a Russian misinformation campaign set out to disrupt American politics and install Donald Trump as president of the United States. You can read more about their efforts here. Unfortunately, we’re still uncovering ways the Russians disrupted our democracy and may still have influence over American politicians.


Russian Intrusions

Alex Stamos led Facebook’s security team in 2016. That spring, his team discovered evidence of Russian hackers attempting to pull information from accounts connected to presidential campaigns. This information could be used in phishing attacks, like the one that breached the Democratic National Committee. These Russian accounts were also attempting to message and share information with journalists. Russians were using Facebook as a primary means to probe journalists for information, make deals, and phish data from campaign staffers.

Russian Misinformation

Alex Stamos, former CSO of Facebook. Photo: Brendan Moran / SPORTSFILE / Web Summit

After realizing that Russians were spreading fake news on the platform, Stamos contacted Colin Stretch, Facebook’s general counsel, about his group’s findings. Stretch informed Stamos that, unfortunately, Facebook had no policy that could stop Russia’s disinformation campaign. So, without anyone’s permission, Stamos had his team investigate Russia’s intrusions and misinformation campaign.

In December of 2016, Mark Zuckerberg publicly brushed off the idea that Russia could have influenced the U.S. election using Facebook. Stamos, giving Zuckerberg the benefit of the doubt, and perhaps more credit than he deserved, was shocked. How could the CEO not know about this? He called a meeting with Zuckerberg, Sandberg, and other top Facebook leaders. Stamos believed they’d want to know about such malicious use of their service, and want to put a stop to it.

He was mistaken.

Denial, Anger, and… Acceptance?

Sheryl Sandberg met Stamos’ news with anger. Not at Russia, or at Facebook’s policy writers for enabling this kind of behavior. She was mad at Alex Stamos, the man who brought the problem to their attention. Her frustration with him only grew after he detailed the problem to company board members. After one meeting she yelled at him, “You threw us under the bus!” She said that looking into Russian interference without approval could put Facebook in legal jeopardy, though no source could explain what jeopardy she meant.

Judging by her anger and her accusation that Stamos had thrown her and others under the bus, she may have been more worried that knowing about the problem made her responsible for fixing it, especially in the eyes of lawmakers. After all, the board members just wanted to know why they hadn’t been told sooner.

Stamos dropped a huge responsibility at the feet of Facebook’s leadership, and no one wanted to pick it up. That’s when Facebook’s denial and deflection truly began.

Fear of Republicans

Joel Kaplan and Mark Zuckerberg. Photo: Tom Brenner/The New York Times

No, this bit isn’t about how marginalized groups fear Republicans because of what they tend to do to our rights; this is about how Facebook decided to hide truths that would be triggering to delicate Republican snowflakes.

Project P

Despite their anger, Mark Zuckerberg and Sheryl Sandberg agreed to continue Alex Stamos’ investigation, which was given the code name “Project P,” for “Propaganda.” In January of 2017, just a month after Stamos revealed his initial findings to Facebook leadership and board members, the team realized something truly horrifying: Facebook’s security team had drastically underestimated the scope of the Russian issue. Facebook had been a major tool—if not the primary tool—of Russians looking to dismantle the American democratic process.

Remember Joel Kaplan, the former George W. Bush staffer? He believed that, by presenting evidence that supported the U.S. intelligence community’s assessment of Russian influence over the 2016 election, Facebook would be picking a side. There were two undeniable facts here: the U.S. intelligence community had concluded that Russians had influenced the U.S. election, and Facebook had inadvertently been one of their tools. But facts weren’t what Republicans wanted to hear.

iStock illustration

The U.S. government’s investigation into Russian meddling had become partisan. Republicans refused to admit Trump had done anything wrong or that he had won with help from nefarious parties. That included Trump’s pal Vladimir Putin, a man who sidestepped Russian constitutional term limits to become president a third time.

Once more, Kaplan warned that if Facebook pointed out the truth, that it had evidence Russians used the network to influence the U.S. election, it would lose the support of conservatives. He therefore recommended that Facebook, once again, choose profits over morals and keep Russia’s tactics an internal secret.

Zuckerberg and Sandberg agreed.

Russians at Facebook? Nyet.

No Russian influence here!

In April of 2017, Facebook published the findings of its internal investigation. The word “Russia” wasn’t even in the report. Some 126 million people, over half the eligible voters in the United States, had seen Russian propaganda posts on Facebook. A majority of American voters had been exposed to Russian disinformation. Facebook didn’t acknowledge this until October of 2017, after Congress forced it to hand over documents related to the Russian attacks.

Alex Stamos left Facebook in 2018 after fighting, in vain, to get Facebook to fess up to their wrongdoings and make a change for the better.

Facebook, as it stands, is still a source for fake news, hate speech, and harassment. While it’s not as unregulated as Twitter, its universal reach and more personal touch make its subtler influence far more dangerous. Facebook fears alienating its conservative users, and therefore panders to the far-right influencers using Facebook to spread hate and extremism.

Facebook started as a platform to connect us. Now it’s the ultimate tool of those seeking to divide us.

Definers

From the Definers website

After Congress finally stepped in and forced Facebook to fess up about Russia’s use of the platform as a delivery system for attacks on our democracy, Facebook decided it was time to fight back. Yes, rather than admit fault, move forward, and resolve to do better, Facebook decided to fight back against the court of public opinion and American politicians.

They did this by expanding their involvement with a political consulting group, Definers Public Affairs, often just called “Definers.” Definers takes dirty politics and applies it elsewhere, in business and other public relations. They work to ensure good news about your company or campaign bubbles to the surface, while bad news about your competitors goes viral.

The Good News: SESTA

First, Facebook focused on getting positive news out. To do this, it latched on to the Stop Enabling Sex Traffickers Act, or SESTA. I won’t go too far into the details of SESTA or FOSTA, as they’re a topic all their own, but, if you ask anyone but a sex worker or the people who care about sex workers, they were a good thing. These laws, meant to curb sex trafficking online, would hold websites responsible for the content published on them. This means that police could target the owners of websites, rather than the sex traffickers themselves. Websites would have to ensure the content on their sites was not objectionable, though sex traffickers would simply find other means of selling and trading in slaves. Meanwhile, sex workers lost a method of finding safe work online.

Interpret that how you wish.

SESTA/FOSTA protestors. Photo: Wiktor Szymanowicz / Barcroft Media via Getty Images

Lawmakers considered the law an easy win. They could appear to protect women with minimal effort. Facebook, typically not a business that deals in sex trafficking (let’s save that for another day), also saw the law as an easy way to push out good publicity. By siding with it, Facebook could present itself to lawmakers not only as a company open to regulation, but as their ally as well.

Facebook followed Definers’ playbook on good PR. The next step was to pull their opponents down with them.

If You’re Already Dirty, Force Them to Play in the Mud Too

Photo: PAUL J. RICHARDS/AFP/Getty Images

In politics, it’s not about how dirty you are, it’s about how dirty the voters think your opponent is. Have a long history of shady business deals, money laundering, racism, sexism, and sexual assault? Talk about your opponent’s private email server. The goal isn’t to make yourself look good all the time, just increase voter apathy within your opponent’s base. That way, while it’s still true that fewer people will vote for you, you can also ensure that fewer people vote for your opponent, leveling the playing field.

If you’re already covered in mud, sling some at your opponents, and, eventually, no one will be able to tell the difference. Instill apathy in a populace, and you can get away with anything.

Remember when conservative politics devolved into a discussion about “hand size” and other thinly veiled innuendo? Photo: REUTERS/Jim Young

That’s how it works in politics, but how could this work in business? You don’t win just by getting more business than your competitors; you win by beating your margins and making a profit. How does mudslinging benefit a company, and how would it even work?

Apathy. Facebook’s goal isn’t to get you to use its competitors less; it’s to make you apathetic toward privacy, hate speech, and even infringements on your own civil rights. The goal is to make you feel hopeless: “They’re all bad, I guess I’ll just keep using this then.”

Generating Apathy: Facebook Targets Google and Apple

Google and Apple might not seem like obvious targets for Facebook, but consider the intention. Facebook, which owns Instagram, doesn’t directly compete with any other service in the U.S. Sure, you can go to Twitter, Reddit, or 4chan, but you’ll still find all the things you left Facebook to escape: hate speech, misogyny, racism, and Russian agents or bots. You can head over to Pinterest, but that’s really not the same as Facebook. Really, there’s no direct competitor to Facebook. The last company that tried, Google, failed miserably. No one used Google+, and Google has announced plans to discontinue the service.

So why Apple and Google? Because they’re seen as inescapable. Apple makes the most popular smartphones in the U.S., and Google powers your search engine, perhaps your phone, your home automation, and a whole lot else. In fact, if you’re using Google for anything, you’re likely giving away more information than you do on Facebook.

Google

Google’s the great data boogeyman. They collect far more information than anyone realizes, and they use that information to make predictions, sell ads, and sell business insight. Basically, they sell you. In that way, they actually do compete directly with Facebook.

NTK Network, a conservative news website whose articles are picked up by Breitbart and other, less popular, far-right websites, posted articles attacking Google. They pointed out that Google collects swaths of data on users. This wasn’t a coincidence. NTK Network is an affiliate of Definers. The two share offices and employees, and Definers employees even write articles for NTK Network. NTK Network is the mudslinging and disinformation wing of Definers and, through their partnership, of Facebook.

Apple

Apple has always valued privacy more than its direct competition. While Google collects swaths of data from users to later sell, analyze, or use to improve machine learning, Apple anonymizes the data it collects, considers data like location off limits, and does not store personal information. Apple also doesn’t sell user data or analytics.

When the Cambridge Analytica scandal broke, Tim Cook, Apple’s CEO, spoke up about Facebook’s reluctance to protect user data. He said that, while he typically believes the best regulation is self-regulation, “we’re beyond that here.” He also pointed out that he “wouldn’t be in this situation.” Cook implied that Facebook’s mishandling of user data was its own fault, and that companies that actually respect their users—like Apple—wouldn’t find themselves there.

Mark Zuckerberg was furious.

No iPhones Allowed

Zuckerberg demanded that all senior Facebook staff switch to Android devices. He claims it was not out of spite, but because Android is the most popular mobile phone OS in the world. Still, this doesn’t explain why you’d want all of your senior staff to switch to Android. As a mobile developer myself, I find it’s best to have both major operating systems at every level in the company. That’s how you can use your entire organization for quality assurance.

Zuckerberg’s move wasn’t strategic, it was a temper tantrum.

Facebook didn’t stop there. It also used NTK Network to write articles about the fact that Apple collects information, making Apple seem as bad as everyone else. And while it’s true that Apple collects user data to improve its products (spoiler: every app developer does this), Apple anonymizes that data using differential privacy and other methods. This is how right-wing media outlets work, though. They start with a grain of truth, hide the details, and write sensationalist articles that lead you to your own biased conclusions.
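If you’re wondering what “differential privacy” actually means in practice, here’s a minimal, hypothetical sketch in Python. It is not Apple’s implementation, and the function names are mine; it just shows the core idea of one common approach, the Laplace mechanism: add calibrated random noise to an aggregate statistic so the overall trend survives while any individual’s contribution is blurred.

import random

def laplace_noise(scale):
    # Laplace(0, scale) noise, sampled as the difference of two exponentials.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(flags, epsilon=0.5):
    # Count how many entries are True (e.g. users who enabled a feature),
    # then add noise scaled to 1/epsilon. Smaller epsilon = more noise = more privacy.
    true_count = sum(1 for flag in flags if flag)
    return true_count + laplace_noise(1.0 / epsilon)

# The aggregate answer stays useful, but no single user's contribution
# can be confidently recovered from the noisy result.
print(private_count([True, False, True, True], epsilon=0.5))

The noise hides each individual’s data while leaving large-scale trends intact, which is roughly the trade-off Apple points to when it says it collects usage data without building profiles of specific people.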

NTK Network and Definers are professionals in this field, and now their sights were trained on Facebook’s peers. With all the mud slinging around, it became difficult to tell where it had come from. Was Facebook uniquely guilty, or was everyone engaging in this kind of behavior?

We now know: yes, Facebook is exceptionally guilty.

Deny, Deny, Deny

Mark Zuckerberg and Sheryl Sandberg both claim they had no knowledge of Facebook’s work with Definers prior to the NY Times article. Obviously, this is difficult to believe. How could the CEO and COO of Facebook have no idea that the company had entered into an agreement, and a disinformation campaign, with another firm? Those are absolutely the kind of details a CEO or COO would be involved in. Sandberg has since admitted that the deal may have come across her desk, but still claims she only learned of Facebook’s mudslinging work when the NY Times revealed it. And despite claiming that Facebook did nothing wrong by working with Definers, the company has since ended the contract.

Illustration: Sam Jayne / Axios

Zuckerberg and Sandberg are lying. Not only would it have been impossible to make this kind of deal without the CEO and COO involved, Facebook’s work with Definers was already public knowledge. In October of 2017, Axios reported on Facebook’s work with Definers. In fact, in his press conference, Zuckerberg inadvertently admitted that “the relationship with Definers was well-known by the media.” How could Zuckerberg and Sandberg claim to have no knowledge of the work with Definers, while also claiming it was public knowledge anyway?

The book 1984 featured a modified version of English called “Newspeak,” built for its dystopian future. Newspeak has a word for this: “doublethink.” It’s the act of holding two contradictory ideas in your head and accepting both as true, even though only one possibly can be. We have a record of the media reporting on Facebook’s work with Definers in 2017. Therefore, the untrue thing they want us to believe is that they knew nothing about the company’s top-level decisions.

Lean in or Step Back?

Neither Mark Zuckerberg nor Sheryl Sandberg has made any plans to step down over these revelations. Despite seemingly being caught in a lie, both will remain at the company. Zuckerberg controls around 60% of Facebook’s voting shares, so even if the board were unanimous against him, it couldn’t remove him. Sheryl Sandberg, meanwhile, has a fantastic personal brand that makes her a celebrity COO. It gives her name recognition among feminist circles and inside access to politicians. Despite her dishonesty, Zuckerberg would lose a powerful ally if he ever decided to replace her.

Sheryl Sandberg and Jack Dorsey. Photo: Tom Brenner/The New York Times

Feminist groups, though, have begun to turn against Sandberg. Facebook’s support of FOSTA-SESTA put her at odds with feminist circles that support sex workers’ safety. Furthermore, she has allied with far-right conservatives who enable disinformation campaigns and has helped cover up Facebook’s involvement in the Russian campaign to undermine U.S. democracy. When hate speech targeting Muslims ran rampant in the U.S., she sided with her conservative pals and allowed it to stay up, despite the fact that similar hate speech targeting Muslims in Myanmar led to mass rape and genocide.

Sandberg has used her power and influence with politicians and women—some of whom found her book Lean In empowering—to push regulation that helped Facebook and hurt its competition. Many in feminist circles have turned against her. Mark Zuckerberg was the boogeyman of Facebook; Sandberg was the intelligent woman who kept his sometimes creepy ambitions under control. Now we see her engaging in the same lies, deception, and mudslinging.

Facebook: Not to be Trusted

Mark Zuckerberg in 2005.

Mark Zuckerberg started Facebook in a Harvard dorm room. From there, it exploded, with billions of users all around the world. Below is a conversation Zuckerberg had with a friend in the early days of Facebook:

ZUCK: yea so if you ever need info about anyone at harvard
ZUCK: just ask
ZUCK: i have over 4000 emails, pictures, addresses, sns
FRIEND: what!? how’d you manage that one?
ZUCK: people just submitted it
ZUCK: i don’t know why
ZUCK: they “trust me”
ZUCK: dumb fucks

 

At least he was honest back then.



