New Tool Defends Artists by “Poisoning” AI Image Generators



Nightshade AI Art Tool

Photo: Olivier26/Depositphotos

In history, the nightshade plant was used to poison kings and emperors. So it's only fitting that a new tool used to poison AI art generators is named Nightshade. Created by Ben Zhao, a computer science professor at the University of Chicago, the tool is designed to help artists combat copyright infringement by AI art generators that are trained using their artwork.

Nightshade allows artists to make invisible, pixel-level changes to their artwork before they upload it online. If that artwork is then scraped into an AI training set, it will corrupt the AI model and cause it to break. The ingenious tool, which is currently under peer review but was previewed by MIT Technology Review, could be a saving grace for artists who are rightly concerned about AI infringing on their copyright.

So what happens when an image is injected with Nightshade? Based on tests by the developers, the poisoned data manipulates AI models. It can fool a system into thinking an image of a cat is an image of a balloon, or that a cake is a toaster. This results in unusable output, and once an infected image has been ingested, the poison is very difficult to remove. This means that tech companies will need to invest quite heavily in finding the infected samples in order to remove them.

Nightshade AI Art Tool

Photo: stockasso/Depositphotos

Currently, AI image generators like Midjourney, DALL-E, and Stable Diffusion do not compensate artists for their work, and most do not offer an opt-out option. Recently, OpenAI began allowing artists to opt out of training sets for DALL-E, but some artists have found the process quite difficult. The website haveibeentrained.com, which is run by Spawning, also allows artists to see if their work appears in training sets and to request its removal. According to a recent tweet, their efforts have led to 78 million artworks being opted out.

Still, while some large companies like Shutterstock have said that they'll abide by opt-out requests, it's unclear if everyone will jump on board. This is what makes Zhao's efforts so intriguing. In addition to Nightshade, Zhao's team has developed Glaze, which is designed to prevent AI from mimicking an artist's style. It works through similar invisible, pixel-level changes to the artwork. Eventually, the team wants to fold Nightshade into Glaze and allow artists to decide which tool they wish to use.

Zhao hopes that, together, these tools will help tip the balance back in favor of artists in the race to keep up with AI. Of course, Nightshade won't help artists whose work has already been used to train existing models; however, it can help artists feel more comfortable sharing their work going forward.

