Turning AI Against Itself: Nightshade Is A Free AI 'Poisoning' Tool That Aims To Protect Artists

Zinger Key Points
  • Nightshade is a free tool that “poisons” data for AI image generators, preventing them from replicating artists’ work.
  • The alterations Nightshade makes to artwork are invisible to the human eye.

Artists now have a way to fight back against AI thanks to a new free tool called Nightshade, which "poisons" data to make art irreproducible by image generators.

What Happened: Artists can now deter AI image generators from making unauthorized use of their copyrighted images by employing a technique known as "data poisoning."

This method, facilitated by a free tool called Nightshade, subtly alters the pixels of an image such that it remains unchanged to the human eye but causes chaos for computer vision.
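Nightshade's real perturbations are optimized against a specific model's feature space, so the image "reads" as a different concept to a machine. As a rough sketch of only the first half of that idea (pixel changes too small for humans to notice), here is an illustrative Python snippet; the function name and the random-noise approach are assumptions for demonstration, not Nightshade's method:

```python
import numpy as np

def add_bounded_perturbation(image: np.ndarray, epsilon: float = 4.0,
                             seed: int = 0) -> np.ndarray:
    """Add a tiny per-pixel perturbation, bounded by `epsilon` on a
    0-255 intensity scale -- small enough to be invisible to humans.

    Illustrative only: Nightshade does not use random noise; it
    optimizes the perturbation so the image's features resemble a
    different concept to a vision model.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    poisoned = np.clip(image.astype(np.float64) + noise, 0, 255)
    return poisoned.astype(np.uint8)

# A flat gray image changes by at most 4 intensity levels per channel,
# far below what the human eye can detect.
original = np.full((64, 64, 3), 128, dtype=np.uint8)
poisoned = add_bounded_perturbation(original)
max_diff = np.abs(poisoned.astype(int) - original.astype(int)).max()
```

The point of the bound is that the poisoned image is visually indistinguishable from the original, even though every pixel may have shifted.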

See Also: Could This Be The Future Of Entertainment? Disney Shows Off ‘HoloTile’ Floor For A 360-Degree VR Experience

When tech firms scrape these "poisoned" images to train their AI models, the tainted data leads the algorithm to misclassify images.

This produces unpredictable, unintended output from the AI model, disrupting the generator's functioning.

The more "poisoned" images there are in the training data, the greater the disruption. The tool's creators hope this will push tech companies to respect copyright. However, there are concerns the tool could be misused to deliberately disrupt these generators' services.

Nightshade has been developed by computer scientists at the University of Chicago, led by Professor Ben Zhao.

Why It Matters: This retaliation comes in the wake of increasing concerns about the misuse of AI image generators.

In November, actress Scarlett Johansson took legal action against an AI app over the unauthorized use of her likeness and voice in an AI-generated advertisement.

In December, Meta Platforms Inc. META launched a new AI image generator, Imagine with Meta, that generates high-resolution images based on text prompts.

This development, along with similar offerings from Microsoft Corp.-backed OpenAI, has raised concerns about the potential misuse of such technologies.

"Data poisoning" could be one of the tools in the armory of artists to disrupt the functioning of these AI models.

Check out more of Benzinga's Consumer Tech coverage by following this link.

Read Next: Elon Musk Tells Tim Cook He’s Looking Forward To ‘Trying’ Apple Vision Pro Hours After Taking A Swipe At The Mixed Reality Headset

Disclaimer: This content was partially produced with the help of Benzinga Neuro and was reviewed and published by Benzinga editors.

Representational photo created using Dall-E 3

Posted In: News, Tech, artificial intelligence, ChatGPT, Consumer Tech, DALL-E, Dall-E 3, Nightshade, OpenAI