Musician Has Her Own Work Stolen by AI Grifter

And Spotify, YouTube, and Vydia only made things worse
A Spotify-like robot sings a song. It looks bad. It's about money?

You can tell I don’t use AI because my doodles suck, and they will always suck.

Murphy Campbell plays folk music with a focus on the sound of the Appalachian mountains in Western North Carolina. She plays classic folk songs and has four songs—as of this writing—on her Bandcamp profile. You can find more live recordings of her on YouTube, but those four songs are the only ones she’s released in full.

You wouldn’t have known that from checking her out on Spotify a few weeks ago. There you could find more songs supposedly by her.

Sometimes this happens. Distributors might help an artist release on platforms like Apple Music, Qobuz, and, yes, Spotify, while the artist keeps Bandcamp to themselves. However, that’s not what happened here. Instead, someone uploaded music to streaming platforms in Murphy Campbell’s name. It wasn’t her music. She did, however, have live performances of those songs on YouTube, and the AI fakes were used to file copyright claims against her own recordings.

This began a months-long journey to reclaim ownership of her own music and get the fake songs taken off her profile. When I started researching this article, I could still find AI tracks in her name on Spotify. The popularity of articles about the scam seems to have finally motivated Spotify to take them down, but not until weeks after the news started to spread.

We’ve all had to worry about AI replacing the work of humans, or at least about our oligarch bosses using it as an excuse to lay us off when projects slow down and pay us less when they need us. But an automated system that allows trolls and bots to steal ownership of your own work feels like yet another escalation in AI’s war against mankind.

Murphy Campbell Locked Out of Her Own Music

Screenshot showing the fake Murphy Campbell next to the real one

Spotify has removed the fakes since this screenshot was taken in early April. Screenshot via spotify.com

AI inserting tracks into an artist’s collection on various platforms isn’t anything new. Many artists have had their own issues with AI, and streaming platforms, especially Spotify, have been less than helpful in removing them. This is a consequence of allowing AI “music” on your platform at all, let alone not labeling it automatically. It’s also a problem with platforms that don’t give artists the final say on what ends up on the platform in their name. Right now, that’s every streamer.

In January of this year, Murphy Campbell found music uploaded to streaming platforms that almost matched her sound, but something was off. The instruments, the vocals, everything sounded slightly wrong, and in some cases horrible, like it wasn’t even a real instrument being played. She suspected AI, which The Verge was able to verify with AI detection software. Unfortunately, distributors and streamers alike are not interested in doing the same to filter out AI.

Some asshole decided to train AI on her live performances and upload AI-generated copies of those songs. Obviously they sounded terrible, both because they used a live performance as a base instead of a studio recording and because they were AI-generated. But that didn’t stop anyone from uploading them, marking them as Murphy Campbell’s, and claiming copyright over them.

Vydia had uploaded the AI-generated music and videos on behalf of the fraudster impersonating Murphy Campbell. In doing so, they claimed copyright over them, and YouTube did the rest. Suddenly, Campbell’s live performances were declared no longer her own, belonging instead to the fraudster. AI had not only stolen her work and posed as her, it had made it difficult for her to profit from her work in the future. It’s a major setback for musicians who already struggle to make ends meet in a society that undervalues music and the people who make it, treating it like a commodity instead of one of the most important defining characteristics of humanity.

Vydia has since banned the creator and released all ownership back to Murphy Campbell, far more than Spotify had done in that time. Still, this should never have been possible in the first place. The claims were made over public domain songs; no one should have been able to seek ownership of anything related to them, but the grifter who made the AI songs likely mislabeled them. Furthermore, none of this would be possible with proper regulations and protections against such AI fraud. We can prevent this; companies just choose not to.

Months into this ordeal, Spotify still had one of the fake AI songs, “The Four Marys,” up under a new artist profile with the same name as Murphy Campbell. It showed up in search results even days after this story broke. You can find all of Murphy Campbell’s actual music on Bandcamp or Apple Music, though, so maybe it’s time to stop using Spotify and try a platform that isn’t adversarial towards the artists who make it possible?

Nearly All Musicians Dealing with AI Issues

Mannequin Pussy is a punk band from Philadelphia. They’re good, you should check them out. They’re currently touring, so if you can, try to see them live. A few weeks ago, during Women’s History Month, Spotify highlighted frontwoman Missy Dabice on a Times Square billboard for “Women of Punk.” In true punk fashion, Dabice took the opportunity to point out the issues with AI in music, specifically Spotify’s role in that.

“I’d love to start having a real conversation with whoever I can at the company about what you plan to do about AI fraud on the platform, the proliferation of how non artists can take advantage of the lack of regulation on the platform and how they are contributing to the increased potential that music streaming sites such as yours are targets of cultural grifting.”

– Missy Dabice, of Mannequin Pussy, on Instagram

Grace Mitchell’s been on a few of my playlists for some time. She’s a singer-songwriter with an alternative sound. She points out that “it does make sense to me why they’re targeting small artists, because we have less resources. But it doesn’t make any sense because it’s not lucrative.” These smaller artists can’t survive on their own music. Spotify pays artists around half of what Apple and others pay, and none of them pay a livable wage for artists’ work.

Liso Hrescko is the COO of the indie record-label trade group A2IM. She says AI fraud cases are “incredibly prevalent,” going further to say that artists are “fortunate if you have not been subject to it at some level.” It’s disturbing that this kind of fraud, especially fraud targeting indie artists who don’t have a team supporting them, has become so pervasive.

“It takes weeks to get this stuff down, but even in those weeks, the streams that it generates wouldn’t even be enough for — it’d be a dollar!”

– Grace Mitchell, musician

That’s exactly why this kind of fraud is so prevalent. An AI slop maker can’t make meaningful money off any one track, whether it’s their own slop or a fraudulent track posing as another artist. That’s why they make so many of them. They spread them around on dozens of artists’ pages, and it’s all passive income for them. A well-designed AI agent wouldn’t even require much upkeep as it generates slop based on indie deep cuts, knowing the indie artists will never be able to take the fakes down before the fraudster profits from them.

Mike Smith, a man in North Carolina, pleaded guilty to streaming fraud, having allegedly made $8 million with fake streams and hundreds of thousands of tracks. Once you get a machine up and running generating and uploading slop, it’s easy to just let it run 24/7.

A Pale Imitation, a Damaged Reputation

Many artists found that the fake music AI fraudsters uploaded sounded similar to their genre, but nothing like their actual signature sound. Veronica Swift is a jazz musician who has been fighting to remove “Sweet Smile,” a track uploaded under her name that she says is not her song. It used an AI-generated image that looks similar to her, as well as her name, but the music was a lousy AI imitation of jazz. At that point, it feels like defamation. To claim an artist would make such junk must be a crime. Instead, she’s been fighting to have it removed for months. As of this writing, it was still up.

Clover Country had an entire album of slop released under her name. She was embarrassed to see it. Paul Bender, bassist for Hiatus Kaiyote, worked for weeks to remove slop imitating his side project, The Sweet Enoughs, from streaming services. When he’d finally get one taken down, it would immediately be back up. Whack-a-slop.

Deezer says they receive 50,000 suspected AI-generated tracks daily. No wonder artists can’t keep up. The problem isn’t fixable from inside these networks; the fix has to come from keeping this garbage off the platform to begin with. The problem is, streaming companies just don’t care. In a bid to test just how easy it is, artists banded together for a series of uploads called “Operation Clown Dump.” They uploaded garbage songs credited to other artists in the collective. Not one got turned away, despite being obvious fraud. No one was trying to stop it. Nowhere else is it so easy to steal someone’s identity.

Distributors and Streaming Companies Not Incentivized to Make Change

For distributors and streamers alike, the goal is to accept as much music as possible. That’s why they’ve been reluctant to block AI-generated “music.” But that reluctance is short-sighted. The market is moving against streaming. iPod sales are on the rise, same with vinyl, CDs, and, yes, even cassettes. People are turning to music they can find on ethical streaming services. If these distributors and streamers become loaded with slop, people won’t just sit idly by and enjoy the trash shoved into their ears; we’ll listen to music elsewhere.

But companies often make short-sighted decisions. Investors want infinite growth, and they only respond to responsible decision-making after it’s already too late. We will need more than empathy or even prudent decision-making to save music; we’ll need laws.

Distributors like DistroKid, TuneCore, Vydia, and others need to put AI protections and fraud prevention in place. This means having artists make verified accounts and ensuring uploaded music doesn’t use the name or details of an existing verified artist without that artist signing off on it. Surprisingly, Spotify is working on such a system, and may start rolling it out to more artists soon. All distributors and streamers should have systems like this in place to prevent fraud, and humans, not bots, need to respond to fraud claims.
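
To make that concrete, here’s a minimal sketch of the core rule in Python. To be clear, this is my own hypothetical model, not any real distributor’s system: the `VerifiedArtist` record, the uploader allow-list, and the `review_upload` function are all assumptions about how such a check might work.

```python
from dataclasses import dataclass, field

@dataclass
class VerifiedArtist:
    """Hypothetical verified-artist record a distributor might keep."""
    name: str
    approved_uploader_ids: set[str] = field(default_factory=set)

@dataclass
class Upload:
    uploader_id: str
    claimed_artist_name: str
    track_title: str

def review_upload(upload: Upload, registry: dict[str, VerifiedArtist]) -> str:
    """Gate any upload that names a verified artist.

    A real system would also need fuzzy name matching, audio
    fingerprinting, and human review; this shows only the core rule:
    nothing goes live under a verified artist's name without sign-off.
    """
    artist = registry.get(upload.claimed_artist_name.casefold())
    if artist is None:
        # Doesn't name a verified artist: normal review path.
        return "normal_review"
    if upload.uploader_id in artist.approved_uploader_ids:
        # The artist (or their authorized label) uploaded it themselves.
        return "publish"
    # A stranger is uploading in a verified artist's name: hold the
    # track and notify the artist instead of publishing it.
    return "hold_for_artist_approval"

# Example: a fraudster uploads a track under a verified artist's name.
registry = {"murphy campbell": VerifiedArtist("Murphy Campbell", {"u_campbell"})}
fake = Upload(uploader_id="u_stranger", claimed_artist_name="Murphy Campbell",
              track_title="The Four Marys")
print(review_upload(fake, registry))  # -> hold_for_artist_approval
```

The point of a gate like this is that it flips the default: instead of artists spending months proving a track is fake after it goes live, an impostor has to get past the artist before anything is published.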

On top of that, they need to take a stand against fully AI-generated music. Yes, some AI will go into the production of music. Using AI as a tool to make your art is questionable, but not something we should outright punish. However, songs that are nearly entirely generated by AI, perhaps more than 50% AI, should require further verification and specific labeling, or, preferably, be banned outright.

Get that shit out of my headphones.

Platforms need to allow artists to label their music as being free of AI. That label would allow artists to forbid any AI-generated slop from being labeled with their name. It would allow artists to permanently opt out of slop and show their listeners they’re dedicated to making real music. And if platforms are too cowardly to ban AI music, they should at least automatically label suspected slop.
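
Sticking with the same hypothetical model as the sketch above, the opt-out could be a one-line gate: any upload flagged as AI-generated, whether by self-reporting or a detector (both assumptions on my part), that names an opted-out artist simply never goes live.

```python
def passes_ai_opt_out(claimed_artist: str, flagged_as_ai: bool,
                      no_ai_artists: set[str]) -> bool:
    """Hypothetical gate: reject AI-flagged uploads naming opted-out artists."""
    # `flagged_as_ai` would come from self-reporting or an AI detector;
    # `no_ai_artists` holds artists who have set the "no AI" label.
    return not (flagged_as_ai and claimed_artist.casefold() in no_ai_artists)

no_ai_artists = {"murphy campbell"}
print(passes_ai_opt_out("Murphy Campbell", True, no_ai_artists))   # False: rejected
print(passes_ai_opt_out("Murphy Campbell", False, no_ai_artists))  # True: allowed
```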

Spotify’s “Artist Profile Protection” is in beta and will roll out to more artists. Other streamers will likely do the same, requiring a verified artist to sign off on any new songs uploaded in their name. However, Bandcamp is the real shining example here. They turn away all AI. That’s the goal we should head towards. Verifying artists alone isn’t enough, because our services and playlists could still be flooded with AI slop. We need a full ban of this shit.

Companies may not do it on their own, but if they were held liable for hosting fraudulent tracks they could have prevented, perhaps it would hurt their bottom line enough that they’d start listening to consumers. We need laws requiring services to verify the source of anything uploaded to their platforms in someone else’s name, and we need to hold them liable for hosting fraud or for attacking artists with copyright notices over their own music.

AI Music Generators Make This Too Easy

The Verge investigated just how easy it is to use AI music generators like Suno to make fraudulent copies of others’ art. While Suno claims to have copyright protections, they clearly aren’t applied to the final output, nor are they strong enough during the generation steps. With just a few tweaks, The Verge was able to make lousy covers of Beyoncé’s “Freedom,” a gibberish version of Black Sabbath’s “Paranoid,” a sloppy clone of the Dead Kennedys’ “California Über Alles,” and others. They found it was easy to get the AI to reproduce the sound of the original without even asking it to, though newer models do tend to take “liberties with the source material.” They needed to change only a word or two to get the song generator to make clones.

These cloned songs could be played royalty-free in stores and venues, stripping the artists of the money they’d make from those plays, and fraudsters are already using them. These tools must be regulated as well. AI can’t have the freedom to violate copyright while the companies that make it possible and easy escape responsibility.

Unfortunately, the law has not only lagged behind common sense and decency, allowing AI companies to pilfer our creations for profit without consequence; it has also been slow to punish them when their output violates copyright or is used for fraud. If a company doesn’t take appropriate measures to prevent copyright abuse and fraud, it needs to be held accountable to the fullest extent of the law. We need new laws to ensure that extent is an appropriate punishment for the crime.

Streaming Platforms and Distributors Need to Be Held Accountable

“I’m in this weird limbo where I’m telling robots to take down music robots made”

– Murphy Campbell, musician

Obviously, what has to happen is that platforms and distributors take a stand against AI-generated music. There are detectors that could make this easy, and, frankly, many AI songs are obviously AI-generated. We have proof that negligence and greed alone lead to this problem. That should make them liable, but we may need clearer laws to make it so. On top of that, if they do allow AI music, they could ask everyone who uploads to self-report its use, and ban those who hide it. Finally, they need to immediately report fraudsters to authorities so they can be punished quickly. If the risk of defrauding artists and platforms were higher, people wouldn’t do it.

These companies need to be faster at taking down fake music. If an artist reaches out and asks to have music imitating them taken down, not doing so should be considered akin to defamation and aiding the fraud. Putting out AI music in someone else’s name feels like leaving a flaming bag of shit on someone’s porch and blaming your neighbor. It seems libelous. Claiming a real artist created that garbage is offensive. No one should have their name besmirched like that.

It should never take more than an email to remove deepfakes, fraud, and slop in someone’s name. These artists have had to spend months arguing with bots to get their imitators taken down. No one should have to defend themselves against fraud that these companies have enabled, and the companies need to be held responsible for making it easier to commit fraud than to remove it.

YOU Are Accountable

“There’s just no fucks given. Can this industry get any more grim? It’s just a constant avalanche of disappointments.”

– Paul Bender, Hiatus Kaiyote bassist

Yes, you are accountable for this too. Are you using AI? Generating anything? Having “conversations” with a sophisticated autocorrect that stole the work of billions of people, wastes millions of gallons of water from drought-stricken communities, dries wells, requires slave labor, and makes communities unlivable just to churn out some worthless drivel? Then you’re to blame every time shit like this happens.

You created demand for a product that hurts people and steals our work to function. You continue to use a product that damages the environment, destroys communities, demands what workers are calling near-slave labor, and likely creates the conditions for exploitation, like slave and child labor, in the countries mined for the extreme levels of resources needed to build AI infrastructure.

You had a hand in doing this. That’s what AI is; it forces everyone to contribute. Hell, even I contributed to it, though I never use GenAI. That’s because our works of writing, art, music, and code were all stolen from us to make these soulless machines possible. But if you continue to use them, you’re more responsible for the atrocities they commit than anyone else, because you made them viable.

It also means you have the power to stop it. Abandon platforms that make artists’ lives hell with low pay and numerous preventable AI issues. Stop using AI yourself wherever you can, and especially don’t use generative AI or chatbots.

