The Law Is Finally Coming For AI Art

I have a theory about AI and creative work: the only people excited about the prospect of AI taking over, or even getting a foothold, in artistic fields are the ones with absolutely no capabilities in these fields whatsoever.

Why the hell would I want to read or look at something an algorithm created? Our society is already becoming isolated and antiseptic: we look at screens all day long, color is being stripped out of our shared spaces, and there’s a sameness to modern life. Everything is starting to feel as if it’s controlled by an algorithm. Taking away the human component of the last holdout of society—art—seems truly nefarious to me. We need more human connection, not less.

So with that in mind, I’m very happy to tell you that a massive class action has been filed against Stability AI, Midjourney, and DeviantArt for DMCA violations, Right of Publicity violations, Unlawful Competition, and Breach of Terms of Service. YES! If I could jump into a vat of this and swim around in it like Scrooge McDuck, I would.

So what exactly does this all mean? Basically, Stable Diffusion is an AI tool alleged to have been built off the copyrighted work and unpaid labor of the artists it studied to train its algorithm. It’s further claimed that the above-named platforms knowingly allowed access to their databases of artists’ work for the purpose of building out the tool, without artists’ consent or compensation. From the lawsuit press release:

As alleged in the Complaint, Stable Diffusion is an artificial intelligence product used by Stability AI, DeviantArt, and Midjourney in their AI image products. It was trained on billions of copyrighted images contained in the LAION-5B dataset, which were downloaded and used without compensation or consent from the artists. If Stable Diffusion and similar products are allowed to continue to operate as they do now, the foreseeable result is they will replace the very artists whose stolen works power these AI products with whom they are competing.

Artists need to be paid for their work. It’s extremely messed up when an online community for artists like DeviantArt strikes a backdoor deal with an AI product that gives it free rein over artists’ intellectual property, without compensating the artists or allowing them to opt out of giving the AI tool access to their art. This is akin to a bakery renting a shop from a landlord, and overnight, the landlord lets someone come in and look at the bakery’s top-secret recipes because they want to create the world’s best bread recipe. Just because this all happened digitally doesn’t make the theft any less real!

Three badass women are leading the charge on this lawsuit as plaintiffs: Sarah Andersen, Kelly McKernan, and Karla Ortiz.

Genuinely can’t wrap my head around what the Silicon Valley techies were thinking on this one. Did they not think they’d get caught?! Make this make sense to me, please.

To give you a sense of the magnitude of the theft at play here, from the StableDiffusionLitigation.com website:

Even assuming nominal damages of $1 per image, the value of this misappropriation would be roughly $5 billion.

That’s Twitter-buying-after-Elon-finally-realizes-it’s-over money right there! The magnitude of what these tech dummies stole is breathtaking. Fundamentally, Stable Diffusion could not exist without the copyrighted art. From the StableDiffusionLitigation.com website:

Stable Diffusion belongs to a category of AI systems called generative AI. These systems are trained on a certain kind of creative work—for instance text, software code, or images—and then remix these works to derive (or “generate”) more works of the same kind.

Having copied the five billion images—without the consent of the original artists—Stable Diffusion relies on a mathematical process called diffusion to store compressed copies of these training images, which in turn are recombined to derive other images. It is, in short, a 21st-century collage tool.

If someone out there were using sentences I wrote to string together new content, trying to pass it off as something innovative and new, and making money off of it, I would first get a restraining order because that’s really weird, and then I would be out for blood.

There’s something deeply icky about creating a tool that literally needs an artist’s labor in order to exist and, rather than getting consent from the artist, going to their digital landlord and striking a backdoor deal. Furthermore, this is a tool designed to replace the artist—and yet art created by an artist is the literal fuel that feeds this zombie AI tool’s engine. I don’t know what to make of this, or why it needed to exist in the first place. The only problem it solves is for some Silicon Valley tech weirdo who got rejected by someone who went to art school and never got over it—perhaps this is their revenge.

The lawsuit is in its infancy, so there will not be a quick resolution here, but the easiest way to support the artists is not to use Stable Diffusion or, you know, AI art in general. Embrace the fact that, for now, there is a human who has created the art you enjoy or the words you are reading (even if that person is me and I’ve had to edit out a lot of four-letter words in order to get this piece to publication). You’re welcome. That’s my gift to you, friend.

(featured image: DeepMind on Unsplash)

