Proposed Bill Offers Some Protection from Generative AI

A robot behind bars

I’m playing with different styles. My “art” is… a work in progress, but it’s better than AI! I learn by doing, by making, by asking questions and getting feedback. Collaboration. Work. AI, on the other hand, learns through theft.

To me, it seems so obvious when you break it down: AI is theft. Take a work, use other works to reconfigure it, biased towards a result, and produce something similar to the original. Countless small acts of automated copyright infringement, spread so vast and so wide you have to break out a microscope (or a long log) to work out which pieces were stolen to make the output. Unfortunately, the law hasn’t caught up to the technology. Cases haven’t led to rulings against AI companies taking data, artistic creations, leaked emails, and other information without permission or compensation.

But a new bipartisan bill in Congress could give American creators a small amount of additional protection against AI theft. AI users will also be able to better flag the content they have AI generate so that others will know it’s AI-generated, and potentially even who generated it and where it’s from. From there, no one will be allowed to share content with its attached metadata, attribution, or watermarks removed.

Seeing as AI has literally attempted to copy watermarks, that last part will certainly matter to many. However, the law may not go far enough.

COPIED Act

“The bipartisan COPIED Act I introduced with Senator Blackburn and Senator Heinrich, will provide much-needed transparency around AI-generated content.”

– Senator Maria Cantwell (D-WA)

The Content Origin Protection and Integrity from Edited and Deepfaked Media Act is a mouthful. It’s abbreviated as the “COPIED Act.” The bill would force AI companies to provide tools that let users attach watermarks, metadata, or other provenance information to generated content. They’d have to offer this optional watermarking within two years of the bill passing. It would not require that this metadata be attached to anything AI generates, only that users could add it if they choose to. The bill would also prevent anyone from removing this provenance information.

Essentially, it’s a bill to allow and protect forms of digital watermarks. The bill would empower the Federal Trade Commission to enforce “unfair and deceptive practices” laws if a company or individual altered provenance information. It would make removal of this information akin to plagiarism, just like stripping attribution does in any other medium.

The National Institute of Standards and Technology would create guidelines for this provenance information, and companies making generative AI tools would have to allow users to add the data from those guidelines to any generated content they request from the service.

What It’s Missing

“The COPIED Act takes an important step to better defend common targets like artists and performers against deepfakes and other inauthentic content.”

– Senator Marsha Blackburn (R-TN)

A creepy robot head with sunken features. Digital painting on digitally produced static

Okay, I am a little proud of this creepy robot I painted with Procreate though. Can AI make a creepy dismembered robot head? Sure. Can it do it with the raw anxiety of a human worried about a future where humanity itself is obsolete? Probably not.


The biggest flaw is that this does not enforce the labeling of AI. It only allows people who wish to add provenance information to do so, and then protects that information. However, most people using generative AI don’t want attribution. They know anyone with the ability to type a prompt into a box could get the same thing. They’re no artists, no creatives. They don’t care about what they asked some copy machine to generate, just as we don’t care about the output.

Do you really think the person making potentially illegal deepfakes of Taylor Swift or Joe Biden (likely for very different reasons) wants their name attached to it? No! That’s why this metadata must be enforced. Companies should be forced to sign the crap they’re making. It would be easy to track down people making deepfakes of politicians or AI-generated non-consensual porn of celebrities if their username or actual name, and the tools they used to generate those deepfakes, were intrinsically linked to the data. We need tools that embed this data in the image itself (a technique called steganography), as well as metadata tagging. Force companies to claim the garbage they’re forcing on us. Maybe then they’ll have to take more responsibility for the misuses of AI they allow.
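The difference matters because metadata fields can simply be deleted, while steganographic data rides along inside the pixel values. Here’s a minimal sketch of the simplest version, least-significant-bit (LSB) steganography, written against raw pixel bytes. The function names (`embed_message`, `extract_message`) and the flat-byte pixel model are illustrative assumptions on my part; real provenance systems use far more robust, tamper-evident schemes than this.

```python
def embed_message(pixels: bytes, message: bytes) -> bytes:
    """Hide message bits in the least significant bit of each pixel byte."""
    # Prefix a 4-byte big-endian length header so extraction knows when to stop.
    payload = len(message).to_bytes(4, "big") + message
    # Flatten the payload into individual bits, most significant bit first.
    bits = [(byte >> (7 - i)) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small for message")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        # Clear the lowest bit of the pixel byte, then set it to the payload bit.
        out[i] = (out[i] & 0xFE) | bit
    return bytes(out)


def extract_message(pixels: bytes) -> bytes:
    """Recover a message hidden by embed_message."""
    def read_bits(n_bytes: int, offset: int) -> bytes:
        value = bytearray()
        for b in range(n_bytes):
            byte = 0
            for i in range(8):
                byte = (byte << 1) | (pixels[offset + b * 8 + i] & 1)
            value.append(byte)
        return bytes(value)

    length = int.from_bytes(read_bits(4, 0), "big")  # header: 32 pixel bytes
    return read_bits(length, 32)
```

Each pixel byte changes by at most 1, so the image looks identical to the eye, but stripping the data requires re-encoding the pixels rather than just wiping a metadata field. That asymmetry is exactly why the bill’s ban on removing provenance information would have real teeth if the information were embedded this way.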

And, of course, this bill doesn’t even begin to touch on consent. Artists, journalists, software engineers, social media users, YouTubers, we’re all still contributing to AI, whether we like it or not, because our creative works are being stolen from us to generate this shit.

What Happens Next

“The capacity of AI to produce stunningly accurate digital representations of performers poses a real and present threat to the economic and reputational well-being and self-determination of our members.”

– Duncan Crabtree-Ireland, national executive director and chief negotiator of SAG-AFTRA

The bill has a better chance than most do in our inefficient and partisan two-party system of governance. It starts with bipartisan support, from two Democrats and a Republican. Its sponsors include the Senate Commerce Committee chair and a committee member, as well as a member of the Senate AI Working Group. Furthermore, it has broad support outside of Congress. SAG-AFTRA, the RIAA, the News/Media Alliance, the National Newspaper Association, and many other groups currently being harmed by AI have come out in support of the bill. It may not do enough, but it does more than nothing, and it’s a good first step towards protecting real works of human-made art.

The bill has not had support from companies making AI products using our works without our permission or compensation. Crooks rarely support the bills that make their unethical activities more difficult. Unfortunately, they also have significant lobbying power, and their ranks include giants like Microsoft, Google, Meta, and Apple.

These laws should be simple to write and enforce. This is not a complicated matter. However, politicians have been unprepared to tackle machine learning and generative AI issues. They are either incapable of dealing with the problem, or willfully ignorant of it. It’s easy to stay ignorant of an issue with lobbyists enticing you to look the other way. This bill is a small first step, but it could be the first serious step taken to undermine the power AI companies have right now, the first step towards holding them accountable for stealing our labor for their own profit. Let’s hope Congress follows through and takes that first step, the first of many we’ll need to protect human contributions to art and business.

