How artists are fighting back against generative image AI



Summary

OpenAI’s latest image AI, DALL-E 3, is a big step forward: the system follows prompts more faithfully and generates well-matched images in many styles. That deepens the existential fears of some graphic designers and artists.

OpenAI does allow artists to have their images and graphics removed from the training material. However, the removal only takes effect with the training of the next model.

Moreover, the opt-out would only be effective if so many artists withheld their work that the quality of the models suffered significantly.

Artists have complained to Bloomberg that the opt-out process is cumbersome and resembles a “charade.” OpenAI will not say how many artists have opted out so far. It is too early for that, a spokesperson says, but the company is gathering feedback and looking to improve the process.


What is left for artists: lawsuits or sabotage

Artists who want to fight generative image AI have only two options. The first is to hope that courts around the world uphold the numerous copyright claims and hold the model providers accountable. But those lawsuits could drag on for years, and the outcome is anyone’s guess; there are no short-term solutions in sight.

The second is sabotage: a new movement focuses on corrupting AI models. Tools like Glaze embed invisible pixel-level perturbations in original images that trick AI systems into perceiving the wrong style. A hand-drawn illustration, for example, registers as a 3D rendering, which shields the artist’s actual style from imitation.
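Glaze’s actual algorithm is more sophisticated, but the core idea, a feature-space style cloak, can be sketched in a few lines of PyTorch. The sketch below is a minimal illustration under simplifying assumptions, not Glaze’s code: it uses a pretrained VGG-16 as a stand-in for the image model’s feature extractor, and the file names are hypothetical. A small, strictly bounded perturbation is optimized so that the cloaked image’s features move toward those of an example in the target style.

```python
# Minimal sketch of feature-space style cloaking, NOT Glaze's actual code.
# A small perturbation (invisible to humans) pulls the image's features
# toward those of an example image in a different style (a 3D render).
import torch
import torch.nn.functional as F
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretrained VGG-16 as a stand-in for the generator's feature extractor.
extractor = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.to(device).eval()
for p in extractor.parameters():
    p.requires_grad_(False)

prep = T.Compose([T.Resize((224, 224)), T.ToTensor()])
original = prep(Image.open("hand_drawn.png").convert("RGB")).unsqueeze(0).to(device)  # hypothetical file
style_ex = prep(Image.open("render_3d.png").convert("RGB")).unsqueeze(0).to(device)   # target-style example

with torch.no_grad():
    target_feats = extractor(style_ex)

delta = torch.zeros_like(original, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-2)
eps = 0.03  # L-infinity budget that keeps the change imperceptible

for _ in range(300):
    cloaked = (original + delta).clamp(0, 1)
    loss = F.mse_loss(extractor(cloaked), target_feats)  # pull features toward the 3D-render style
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-eps, eps)  # project the perturbation back into the budget

result = (original + delta).clamp(0, 1).detach().squeeze(0).cpu()
T.ToPILImage()(result).save("hand_drawn_cloaked.png")
```

Because the perturbation never leaves a tight L-infinity budget, the saved file looks unchanged to a human viewer, while a feature extractor reads it as the other style.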

Nightshade, named after the highly poisonous deadly nightshade plant, works similarly. But here the manipulated pixels are meant to actively damage the model by confusing it: instead of a train, for example, the AI system sees a car.
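Conceptually, the poisoning step reuses the same perturbation machinery, only aimed at a different concept rather than a different style. The sketch below is again an illustration, not Nightshade’s method (which targets the diffusion model’s own encoders): it uses OpenAI’s public CLIP model via Hugging Face transformers as a stand-in and a hypothetical file name, nudging a train photo toward the text embedding of “a photo of a car” while the caption paired with the image still says train.

```python
# Conceptual sketch of concept poisoning, NOT the actual Nightshade tool.
# The image is nudged toward the text embedding of "a photo of a car",
# but the caption paired with it still says "train" - a model trained on
# such pairs learns the wrong word-image association.
import torch
import torch.nn.functional as F
import torchvision.transforms as T
from PIL import Image
from transformers import CLIPModel, CLIPTokenizer

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32").eval()
tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-base-patch32")
for p in model.parameters():
    p.requires_grad_(False)

# CLIP's published preprocessing statistics.
normalize = T.Normalize(mean=(0.48145466, 0.4578275, 0.40821073),
                        std=(0.26862954, 0.26130258, 0.27577711))
prep = T.Compose([T.Resize((224, 224)), T.ToTensor()])
image = prep(Image.open("train.png").convert("RGB")).unsqueeze(0)  # hypothetical file

tokens = tokenizer(["a photo of a car"], return_tensors="pt")
with torch.no_grad():
    target = model.get_text_features(**tokens)  # the concept we poison toward

delta = torch.zeros_like(image, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-2)
eps = 0.03  # keep the perturbation visually negligible

for _ in range(200):
    poisoned = (image + delta).clamp(0, 1)
    img_feats = model.get_image_features(pixel_values=normalize(poisoned))
    loss = 1 - F.cosine_similarity(img_feats, target).mean()  # pull toward "car"
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-eps, eps)

# The poisoned training pair: car-like features, train caption.
sample = {"image": (image + delta).clamp(0, 1).detach(),
          "caption": "a steam train crossing a bridge"}
```

A text-to-image model trained on enough such pairs starts associating the word “train” with car-like features, which is exactly the kind of corruption the researchers describe.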

Poison Pill for AI Models

Fewer than 100 of these “poisoned” images can be enough to corrupt an image AI model like Stable Diffusion XL. The Nightshade team plans to integrate the tool into Glaze, calling it the “last defense” against web scrapers that ignore scraping restrictions.

Nightshade, which currently exists only as a research project, could be an effective tool for content owners to protect their intellectual property from scrapers that disregard copyright notices, do-not-scrape/crawl directives, and opt-out lists, the researchers write. Movie studios, book publishers, game producers, and individual artists could use systems like Nightshade to create a strong deterrent against unauthorized scraping.
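The “do-not-scrape/crawl directives” mentioned here are, in practice, robots.txt and response headers such as X-Robots-Tag, where some sites place values like “noai” or “noimageai”. The sketch below shows what honoring these opt-outs could look like for a well-behaved crawler; the user agent name is hypothetical and the URL is an example.

```python
# Minimal sketch of the opt-out checks a well-behaved crawler can honor
# before downloading an image (example URL, hypothetical user agent).
import requests
from urllib.parse import urlsplit
from urllib.robotparser import RobotFileParser

USER_AGENT = "ExampleImageBot/1.0"  # hypothetical crawler name

def may_scrape(image_url: str) -> bool:
    # 1. robots.txt: the classic crawl opt-out.
    parts = urlsplit(image_url)
    robots = RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    robots.read()
    if not robots.can_fetch(USER_AGENT, image_url):
        return False

    # 2. X-Robots-Tag header: some sites signal AI opt-outs here.
    head = requests.head(image_url, headers={"User-Agent": USER_AGENT}, timeout=10)
    tag = head.headers.get("X-Robots-Tag", "").lower()
    if "noai" in tag or "noimageai" in tag:
        return False
    return True

print(may_scrape("https://example.com/images/artwork.png"))
```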

