For months, Getty Images has grumbled about its photos being used to train AI image generators. Now, the stock image site has finally turned up the heat on one of the companies building these AI systems.
In a lawsuit filed last Friday and made public Monday, Getty alleged Stability AI, the company behind popular AI image generator Stable Diffusion, has stolen 12 million of Getty’s copyrighted images, along with their captions and metadata “without permission” in order to “train its Stable Diffusion model.”
“Stable Diffusion at times produces images that are highly similar to and derivative of the Getty Images proprietary content that Stability AI copied extensively in the course of training the model,” the lawsuit reads.
Getty Images is asking the court to order London-based Stability AI to remove the infringing images and pay up to $150,000 “for each infringed image,” alongside other damages for violating copyright law. Of course, if Stability AI were found to have infringed all 12 million images and was fined the maximum amount, the damages would exceed $1.8 trillion, though Getty would be hard-pressed to prove that many violations. The company did include a list of more than 7,000 images, plus metadata and copyright registrations, that it said were used to train Stable Diffusion.
Gizmodo reached out to Stability AI for comment, but we did not immediately hear back. Getty Images has already started similar legal proceedings in UK court.
Stable Diffusion is trained on the LAION dataset, an open source project that scraped billions of images from the internet for use in these AI generators. Although LAION totals over 380 TB, the trained image generators themselves are much smaller. While these systems are supposed to create novel images from their training data, that doesn’t mean they’re incapable of outright copying the style of photographers and artists, or even copying the work itself. The lawsuit references a study published last week by researchers at Google, DeepMind, and in academia, who showed that diffusion model AI image generators are capable of recreating images from their training data.
The Seattle-based stock image site notes that several images generated by Stable Diffusion contain the Getty watermark seen when viewing a photo before licensing it for use. This is a consequence of how diffusion models work. These systems are trained by adding noise to images and then learning to reverse the process, reconstructing the original from the noisy version. In doing so, the model absorbs recurring visual features into what it has learned. If many images of a soccer match contain the same Getty watermark, the system comes to treat that watermark as an integral part of what such an image looks like, and reproduces it when a user types in a prompt for a soccer match.
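The noising step described above can be sketched in a few lines. This is an illustrative toy example of the standard diffusion training setup, not Stability AI’s actual code; the function name, the linear noise schedule, and the tiny 8×8 “image” are all assumptions made for the sketch:

```python
import numpy as np

def add_noise(image, t, betas, rng):
    """Corrupt `image` with Gaussian noise at timestep t (DDPM-style).

    The cumulative product of (1 - beta) controls how much of the
    original signal survives: early timesteps keep the image mostly
    intact, late timesteps leave almost pure noise.
    """
    alphas = 1.0 - betas
    alpha_bar = np.prod(alphas[: t + 1])  # cumulative signal retention
    noise = rng.standard_normal(image.shape)
    noisy = np.sqrt(alpha_bar) * image + np.sqrt(1.0 - alpha_bar) * noise
    return noisy, noise

rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 1000)  # illustrative linear schedule
image = rng.standard_normal((8, 8))    # stand-in for a training image

# Early timestep: image mostly intact; late timestep: mostly noise.
slightly_noisy, _ = add_noise(image, 10, betas, rng)
very_noisy, _ = add_noise(image, 999, betas, rng)
```

During training, a neural network learns to predict the added noise and undo it. Because that network is rewarded for reconstructing whatever recurs across training images, a watermark shared by thousands of them gets baked into what it reconstructs.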
The lawsuit further claims that Stability AI knows its AI art generator creates this distorted Getty watermark and other watermarks “but it has not modified its model to prevent that from happening.”
As noted by The Verge, UK-based intellectual property researcher Andres Guadamuz wrote on Twitter that logos aren’t a matter of copyright, but, more importantly, the stock image site can claim these distorted watermarks hurt Getty’s brand. While Stable Diffusion 2 is much better at not reproducing watermarks, the still-available earlier version was far more prone to doing so.
Last year, Stability AI raised $101 million in funding from major venture capital firms and has been valued at $1 billion, according to an anonymous source quoted by Bloomberg. Getty alleged this success came “on the back” of Getty’s copyrighted materials, alongside millions upon millions of other copyrighted works.
Getty isn’t the first to sue AI image generator companies over alleged copyright infringement. Three artists have also filed a proposed class action lawsuit against Stability AI, as well as fellow AI art generator Midjourney and DeviantArt, over claims their AI art generators violated copyright laws. Experts have told Gizmodo that generative AI will continue to expand over the course of this year, and these lawsuits have the potential to steer where the technology may go.