Demonetization Doesn’t Work: How Problematic YouTubers Still Make Money

Reading Time: 4 minutes.

The YouTube logo with a dark, ominous look

If you’re on YouTube much, you’ve surely heard the typical, “Like and subscribe, and don’t forget to click that bell!” or something to that effect. You’ve likely also heard of other ways YouTubers try to make money. The most common is likely Patreon, often called “the best way to support the channel.” YouTubers may also push merch, like printed T-shirts, or partnerships with companies. The latter, especially, turns up in videos. When was the last time you saw a YouTube video from a popular streamer that didn’t mention a VPN service, Raid Shadow Legends, Raycons, Helix Mattresses, or Skillshare? Some companies advertise with nothing more than an affiliate link, so anyone can be “sponsored” by them with a link that earns them a portion of sales.

It turns out that for problematic users on YouTube, that is, the alt-right, these alternative payment methods are the primary means of making money. Often, they’re the only means, and they’re quite profitable. Dangerous YouTubers spread misogyny, bigotry, and hate on the platform, and YouTube’s moderation tools have little effect. If YouTube’s only tool is demonetization, and bad actors have long since given up on getting money from YouTube, what’s the point? Has YouTube created a perfect storm of rule breaking? By not paying content creators enough, they’ve made demonetization a joke and pushed their platform towards content they would rather not have. Demonetization isn’t the threat it once was, and that could push YouTube further to the right.

Popular with the Problematic

YouTube doesn’t pay enough to cover the bills. That’s why many content creators on YouTube eventually turn to alternative financing options, especially if they want to make content creation a full-time job. These methods include collecting donations through PayPal or in cryptocurrency, or setting up a Patreon with exclusive content for subscribers. Creators will also often turn to affiliate marketing. You’ve definitely seen this: a mention in the video of a product or service, with a special link for a discount or some bonus. These ads basically pay per purchase, so they’re only successful on larger channels. Still, they can provide a revenue stream. Some creators turn to branded merchandise, making a logo or some T-shirt designs and selling them online.

While many creators do turn to these methods, more casual creators are more likely to just accept YouTube’s ad money. However, YouTube uses demonetization as a poor substitute for actual moderation. YouTube will often remove monetization from videos or users for lousy reasons, like sharing LGBTQ+ content, or for legitimate reasons, like sharing hateful alt-right content. When YouTube demonetizes, creators find another way to make money. Perhaps that’s why alt-right and misogynist content is disproportionately likely to feature external links and financing.

Cornell Tech, in collaboration with the Swiss Federal Institute of Technology Lausanne, studied revenue sources for different types of channels. They found that 68% of channels use at least one external financing method, and the breakdown skews far to the problematic. While only 28% of normal accounts use off-platform financing, problematic creators, that is, alt-right, alt-light, or misogynist channels, use these external sources of funds in 48% of cases. Perhaps because YouTube uses demonetization as moderation, the worst of YouTube’s platform has found ways to make money elsewhere.

An Inability to Moderate

The people YouTube seeks to punish with demonetization aren’t relying on YouTube’s ads in the first place. If YouTube doesn’t ban bad actors, nothing changes: because these bad actors aren’t making money through YouTube ads, losing them isn’t a motivator.

“We argue that moderation through demonetization is not likely to be an effective tool in disincentivizing the production of problematic content, and may even result in a shift of content produced towards committed audiences”

– from the study

The study found that after a channel finds an external income source, it generates 43% more content. In other words, YouTube demonetizing a problematic user may actually push them to create more problematic or hateful content and make more money from non-YouTube sources.

YouTube does have an external links policy, but doesn’t seem to use it for moderation. After demonetizing a problematic user, YouTube doesn’t go back and ban external linking. That would be hard to do, as promo codes are often read out in the video itself, not just placed in the description. Still, YouTube could make it harder for far-right users to take advantage of the platform to spread hate and encourage violence.

YouTube needs to strengthen their moderation. Demonetization combined with the removal of external links is a good first step that YouTube could implement with little effort. The next step would be banning users and, when an infraction isn’t enough for an outright ban, noting on the video why the creator was demonetized. This would help viewers figure out which creators to avoid.

Patreon, Sponsors, & Merch Retailers Share Responsibility Too

Google’s YouTube isn’t the only company to blame here. External websites are still funneling money into these extremist projects. Part of the issue is YouTube’s lack of transparency: they could make it more obvious that a creator has broken their rules so that linked websites can follow suit. Still, if reported, Patreon, merch retailers, sponsors with affiliate links, and other external sources of income could cut ties with hateful content creators. That doesn’t always happen. Some of these income sources may not have a way to report a problematic affiliate, and some may not care about reports. Others might want to ban hateful users from their platform, but don’t know what’s happening off of it. This is why the bulk of the blame still comes back to YouTube, which chooses to let these creators stay on the platform rather than banning them indefinitely.

YouTube holds the blame and the power here. They could choose to moderate their platform with effective means. By not doing so, however, they retain their monopoly on user-generated video content: they haven’t forced creators to find or create more hate-friendly platforms. Because of that, YouTube is still a one-stop spot for cute videos, instructional videos, relaxing painting videos, project guides, and, of course, violence-inspiring bigotry in all of its many forms.


Sources:
  • Yiqing Hua, Manoel Horta Ribeiro, Robert West, Thomas Ristenpart, Mor Naaman, via Arxiv.org
  • Mia Sato, The Verge