
The first few suits against the vast data-vacuuming used to train platforms like Midjourney and Stable Diffusion could signal a difficult road ahead for generative AI.

Can a string of lawsuits slow down generative AI?


By Chris Stokel-Walker · 3 minute read

The rise of generative artificial intelligence tools has wowed the world with their ability to conjure up images and text from little more than a simple user prompt. The tech has also caused consternation among artists, who fear their work is being used to supply the powerful neural networks with a library from which they might pull.

Now the artists, and the companies that host their work, are starting to fight back.

On January 17, stock and editorial image platform Getty Images announced it was threatening to launch legal proceedings against Stability AI, the company behind a text-to-image generator, claiming in a letter before action that the firm “unlawfully copied and processed millions of images protected by copyright and the associated metadata owned or represented by Getty Images absent a license.” Should Stability AI continue distributing Getty content, the photo platform could take the tech firm to court in the highly litigious U.K. system.

The threat is the second legal action launched against the budding AI tech this week. On January 13, Stability AI was one of three defendants—alongside Midjourney and DeviantArt—named in a class action suit alleging that their generative AI tools “violated the rights of millions of artists” by using their artwork to train their models without any compensation or recognition of their rights. (Crucially, that’s different from outright copying the content.) The complaint, filed in the Northern District of California, seeks a jury trial on behalf of its three plaintiffs, all artists.

Could this be the start of a potential reckoning for generative AI? “In these cases, I think there are some quite interesting challenges being posed to existing copyright law,” says Lilian Edwards, professor of law, innovation and society at Newcastle University, who has been studying the early development and adoption of generative AI tools like Midjourney, Stable Diffusion, and ChatGPT. “I’m not sure that generative AI is posing any particularly profound challenge to the law in the areas of fairness, discrimination, bias, and transparency”—which could make it subject to the EU’s long-planned AI Act—“but it is kind of upsetting the applecart a bit with copyright.”

The risk AI poses to artists has long been known. Artists have been annoyed by people using generative AI tools to develop children’s books and to churn out schlocky fictional screencaps of movies by their favorite directors. Artists have even launched an organization designed to protect their rights in the AI era. But the courts are new territory, and a potentially stronger weapon to tamp down on any co-opting of their work.

But Edwards, for one, is pessimistic about the class action suit’s likelihood of success, saying that it effectively seeks to rewrite copyright law. “Copyright law is the right to stop people making copies, right?” she says. “It’s not the right to stop people imitating your style.”

That’s dangerous, says Andres Guadamuz, reader in intellectual property law at the University of Sussex, because the case, like the battle brewing between Getty and Stability AI, has the potential to become a test case for AI training. “While there is some related case law, there is nothing specifically on the training of an AI,” he says.

Guadamuz adds that Getty would stand a stronger chance of succeeding—not least because it could point to the fact that a competitor, Shutterstock, has agreed to a partnership with OpenAI that shows AI-generated content and stock image sites can get along. Yet Guadamuz believes that the case is not likely to make it to court, saying it seems designed more to bring Stability AI to the negotiating table to discuss licensing fees. (A Stability AI spokesperson says the company is reviewing the documents and will respond accordingly. “Please know that we take these matters seriously,” they add. DeviantArt and Midjourney did not immediately respond to a request for comment about the class action suit in which they are named.)

That hits on a broader rationale behind the cases, and their ultimate purpose. “So much of these class actions is going to be about strategy,” says Edwards. “What court you get, what evidence you bring, and at what point you settle. Is it intended to produce a remedy or is it just saber-rattling?”

Even if these first two cases don’t go anywhere, they’re likely just the beginning. “This is probably not going to be the only lawsuit,” says Guadamuz. And such legal action could winnow down, at least to an extent, the generative AI competition: Smaller competitors, bogged down by lawsuits and lawyer fees, could close up shop. (It’s notable that OpenAI has avoided such litigation, perhaps owing to its Microsoft connection.) 

That’s not to suggest these suits pose any sort of existential threat to the tech. “The genie is out of the bottle,” says Guadamuz. “Generative AI isn’t going anywhere.”


ABOUT THE AUTHOR

Chris Stokel-Walker is a freelance journalist and Fast Company contributor. He is the author of YouTubers: How YouTube Shook up TV and Created a New Generation of Stars, and TikTok Boom: China's Dynamite App and the Superpower Race for Social Media.
