I have a theory about AI and creative work: the only people excited about the prospect of AI taking over, or even getting a foothold, in artistic fields are the ones with absolutely no capabilities in those fields whatsoever.
Why the hell would I want to read or look at something an algorithm created? Our society is already becoming isolated and antiseptic: we look at screens all day long, color is being stripped out of our shared spaces, and there's a sameness to modern life. Everything is starting to feel as if it's controlled by an algorithm. Taking away the human component from the last holdout of society—art—seems truly nefarious to me. We need more human connection, not less.
So with that in mind, I’m very happy to tell you that a massive class action has been filed against Stability AI, Midjourney, and DeviantArt for DMCA violations, Right of Publicity violations, Unlawful Competition, and Breach of Terms of Service. YES! If I could jump into a vat of this and swim around in it like Scrooge McDuck, I would.
So what exactly does this all mean? Basically, Stable Diffusion is an AI tool that is alleged to have been built off the copyrighted work and unpaid labor of the artists it studied for its own algorithm. It's further claimed that the above-named platforms knowingly allowed access to their databases of artists' work for the purpose of building out the tool, without artists' consent or compensation. From the lawsuit press release:
As alleged in the Complaint, Stable Diffusion is an artificial intelligence product used by Stability AI, DeviantArt, and Midjourney in their AI image products. It was trained on billions of copyrighted images contained in the LAION-5B dataset, which were downloaded and used without compensation or consent from the artists. If Stable Diffusion and similar products are allowed to continue to operate as they do now, the foreseeable result is they will replace the very artists whose stolen works power these AI products with whom they are competing.
Artists need to be paid for their work. It’s extremely messed up when an online community for artists like DeviantArt has a backdoor deal with an AI product that gives it free rein to artists’ intellectual property and does not compensate the artist or allow them to opt out of giving the AI tool access to their art. This is akin to a bakery renting a shop from a landlord, and overnight, the landlord lets someone come in and look at the bakery’s top-secret recipes because they want to create the world’s best bread recipe. Just because this all happened digitally doesn’t make the theft any less real!
Three badass women are leading the charge on this lawsuit as plaintiffs: Sarah Andersen, Kelly McKernan, and Karla Ortiz.
Genuinely can’t wrap my head around what the Silicon Valley techies were thinking on this one. Did they not think they’d get caught?! Make this make sense to me, please.
To give you a sense of the scale of the theft at play here, from the StableDiffusionLitigation.com website:
Even assuming nominal damages of $1 per image, the value of this misappropriation would be roughly $5 billion.
That’s Twitter-buying-after-Elon-finally-realizes-it’s-over money right there! The magnitude of what these tech dummies stole is breathtaking. Fundamentally, Stable Diffusion could not exist without the copyrighted art. From the StableDiffusionLitigation.com website:
Stable Diffusion belongs to a category of AI systems called generative AI. These systems are trained on a certain kind of creative work—for instance text, software code, or images—and then remix these works to derive (or “generate”) more works of the same kind.
Having copied the five billion images—without the consent of the original artists—Stable Diffusion relies on a mathematical process called diffusion to store compressed copies of these training images, which in turn are recombined to derive other images. It is, in short, a 21st-century collage tool.
If someone were out there using sentences I wrote to string together new content and trying to pass it off as something innovative and new and make money off of it, I would first get a restraining order because that’s really weird, and then after that, I would be out for blood.
There’s something deeply icky about creating a tool that literally needs an artist’s labor in order to exist and, rather than getting consent from the artist, going to their digital landlord and striking a backdoor deal. Furthermore, this is a tool designed to replace the artist—and yet art created by an artist is the literal fuel that feeds this zombie AI tool’s engine. I don’t know what to make of this, or why this needed to exist in the first place. The only problem it solves is for some Silicon Valley tech weirdo who got rejected by someone who went to art school and never got over it—perhaps this is their revenge.
The lawsuit is in its infancy, so there will not be a quick resolution here, but the easiest way to support the artists is not to use Stable Diffusion or, you know, AI art in general. Embrace the fact that, for now, there is a human who has created the art you enjoy or the words you are reading (even if that person is me and I’ve had to edit out a lot of four-letter words in order to get this piece to publication). You’re welcome. That’s my gift to you, friend.
(featured image: DeepMind on Unsplash)
Published: Jan 19, 2023 07:15 am