
Facebook’s Trying to Halt a Study into Their Political Ad Targeting

Ad spending charts reflecting data from NYU’s Ad Observatory (via NYU Ad Observatory)

A group at New York University (NYU) is looking into political advertising on Facebook. The goal is to monitor the service and provide transparency, as Facebook doesn’t do this themselves. It could help us understand what drives Facebook to show some users certain ads, and whether or not Facebook is pushing an agenda with their advertising. Unsurprisingly, Facebook is upset.

Facebook is trying to force an end to the study. They also want the NYU researchers to delete all of the data they’ve collected. Just how bad could it be?

The Need for Such a Study

via NYU Ad Observatory

Facebook is, like many large tech companies, a “black box.” We don’t know what makes it work. How does Facebook decide what to show in the news feed? We don’t know. Why does Facebook show certain ads to some people but not others? We don’t know. So, like any scientists, researchers have to run experiments to figure out what Facebook is doing. That means monitoring usage habits to find patterns, controlling inputs, and measuring outputs. Researchers can do this with browser extensions that record activity on Facebook and analyze what users see.
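To make the mechanism concrete, here’s a minimal sketch of how such an extension’s content script could record the sponsored posts a volunteer sees. This is not the Ad Observatory’s actual code; the CSS selector, the “Sponsored” label check, and the collection endpoint are all hypothetical stand-ins.

```typescript
// content-script.ts — hypothetical sketch of an ad-observing browser extension.
// The selector and upload URL are illustrative, not NYU's real ones.

const COLLECTOR_URL = "https://example.org/observed-ads"; // hypothetical endpoint

// Record an ad's advertiser name and text the moment it appears in the feed.
function reportAd(post: HTMLElement): void {
  const record = {
    advertiser: post.querySelector("[data-ad-advertiser]")?.textContent ?? "unknown",
    body: post.innerText.slice(0, 500),
    seenAt: new Date().toISOString(),
  };
  // Fire-and-forget upload of the observation to the research collector.
  void fetch(COLLECTOR_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(record),
  });
}

// Watch the feed for newly inserted posts labeled as sponsored.
const observer = new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    for (const node of mutation.addedNodes) {
      if (node instanceof HTMLElement && node.innerText.includes("Sponsored")) {
        reportAd(node);
      }
    }
  }
});

observer.observe(document.body, { childList: true, subtree: true });
```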

Facebook allows political advertising on their site. This is where things get dangerous. Since we don’t know how Facebook decides how to display ads, who will see them, or how frequently people will see them, there’s too much room for manipulation. How can we trust Facebook to remove bias from their platform if they won’t show us how their advertising system works? That’s where researchers at NYU come in.

NYU’s Study

via NYU Ad Observatory

Researchers at NYU are working on a project called the NYU Ad Observatory. It relies on volunteers who install a browser extension that gathers the same kind of information Facebook gathers on its users. From there, the researchers also measure what Facebook displays to those users. With this, they can figure out how Facebook’s algorithms “decide” what to show people, making it easier to spot motivations and biases.

You can view their ongoing data collection on their website. Some of it comes directly from Facebook’s publicly available information. The rest may come from the Ad Observatory browser extension.
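For the publicly available portion, Facebook does expose political ads through its Ad Library API. The sketch below shows roughly what a query against that endpoint looks like; the API version, the exact field names, and the placeholder access token are assumptions here, and the real parameters vary by Graph API release.

```typescript
// Hypothetical sketch: querying Facebook's public Ad Library API (ads_archive).
// Requires a developer access token; field names and version vary by API release.

const ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"; // placeholder, obtained via Facebook's developer tools

async function fetchPoliticalAds(searchTerm: string): Promise<void> {
  const params = new URLSearchParams({
    search_terms: searchTerm,
    ad_type: "POLITICAL_AND_ISSUE_ADS",
    ad_reached_countries: "US",
    fields: "page_name,spend,impressions,ad_delivery_start_time",
    access_token: ACCESS_TOKEN,
  });

  const response = await fetch(`https://graph.facebook.com/v8.0/ads_archive?${params}`);
  const { data } = await response.json();

  // Spend and impressions come back as ranges rather than exact numbers.
  for (const ad of data ?? []) {
    console.log(ad.page_name, ad.spend, ad.impressions);
  }
}

void fetchPoliticalAds("election");
```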

Facebook’s Fear

Facebook has given the NYU researchers until November 30th to delete all of their data and shut the project down. If the NYU Ad Observatory researchers refuse, Facebook says they will “be subject to additional enforcement action.” Facebook says it may also change its code to block the researchers from collecting data, though it’s likely they’d just find another way to gather this information.

Scraping and Cambridge Analytica

“Scraping tools, no matter how well-intentioned, are not a permissible means of collecting information from us. We understand the intent behind your tool. However, the browser plugin scrapes information in violation of our terms, which are designed to protect people’s privacy.”

– Allison Hendrix, Facebook Privacy Policy Official

It’s easy to think Facebook is selfishly trying to protect their own interests. After all, this is the company that continuously permits hate speech and violence in exchange for profits. If Facebook’s “secret recipe” got out, they could lose profits or users. They could even face legal scrutiny. However, they may also want to protect users, even if, again, only for profit-driven reasons.

A while back, Cambridge Analytica, a conservative-aligned political consulting firm, used data scraping to mine large amounts of information from Facebook’s users. Those analytics were later used by the Trump campaign and other conservative politicians to win races. Facebook didn’t tell anyone about the security problem until much later, after it had leaked. Facebook lost users, saw boycotts and protests, and had to answer to politicians for their mistakes. As a result, we got a peek behind the curtain, and Facebook was never the same. Suddenly, the whole world knew just how untrustworthy the platform was.

For the sake of protecting their users’ privacy, Facebook tightly controls how user data leaves the platform. But NYU’s project shows that control isn’t perfect, and it still depends on everyone following Facebook’s rules. For anyone willing to break those rules, troves of Facebook data are there for the taking.

Facebook’s Bias

Of course, outside of the privacy issues, Facebook has many other reasons to want to hide how they treat political advertising on the platform. For example, according to NYU’s data, the Trump campaign spends far more on Facebook advertising than the Biden campaign. This is likely because Facebook, the great spreader of misinformation that it is, is more popular among conservatives and older users than younger people. However, what if your dollar goes further on Facebook if you’re a conservative, so liberals use it less often? Alternatively, what if liberals find that spreading their posts doesn’t require as much money as it does for conservatives? If there’s any disparity, even one that’s accidental or due to demographics and targeting alone, politicians could go after Facebook’s “bias,” even if it’s entirely unintentional. And why shouldn’t they? Does accidental bias have any functional difference from intentional bias? Not to the election.

Advertisers Gaming the System

Another potential issue is the possibility of advertisers gaming Facebook’s ad network. They could get a peek into the data used and collected by other companies, gaining access to data Facebook doesn’t sell. An advertiser could use this to maximize the spread of their ads to interested buyers without spending as much. They could also discover how their competition is advertising. Facebook doesn’t want advertisers to have more information than Facebook chooses to give them. At least, not for free, and not with consumers watching their personal information closely.

User Revolt

Facebook could get in a lot of trouble with their users if they’re leaking personal information again. Users may leave, advertisers and politicians may again lose trust in the company, and lawmakers may decide to take action against it. The fact is, Facebook has a lot to lose if their advertising information gets out.

Unfortunately, we have a lot more to lose if we don’t have this information.

