A new tool used by creators to “poison” artificial intelligence models and stop their artwork from being used without their consent has been downloaded more than 250,000 times in just five days.
Nightshade, a free tool created by computer science researchers at the University of Chicago, is the latest “weapon” to prevent artists’ work from becoming “machine learning fodder,” Mashable said.
Ben Zhao, the computer science professor leading the project, said he had expected “very high enthusiasm” for the tool but had still “underestimated” the extent of it. In an email to VentureBeat after Nightshade surpassed 250,000 downloads earlier this month, he said the response was “absolutely beyond our wildest expectations”.
Why is it necessary?
The Evening Standard reported that the use of generative artificial intelligence (AI) models, which can create text and images that mimic the work of others, is causing “an uproar among creators”.
The sudden emergence and popularity of tools like DALL-E, Midjourney and Stable Diffusion has become an existential threat for artists, who are now fighting an “uphill battle” to protect their work from being used to train AI models without their consent, TechCrunch said. “While opt-out requirements and anti-scraping codes rely on good-faith responses from AI companies, such measures can easily be bypassed by companies motivated by profit over privacy,” the tech site added.
Artificial intelligence was one of the main factors behind last year’s prolonged Hollywood writers’ strike, while others have taken legal action against AI giants such as Meta and OpenAI, alleging copyright infringement. Actors have protested against the rise of digital replicas, and celebrities have expressed concern about their voices being duplicated without their consent.
But for the majority of artists, who have no job security and rely on social media exposure for commissions, going on strike or taking their work offline is not an option.
How does Nightshade work?
Nightshade, named after the deadly nightshade plant said to have been used to poison a Roman emperor, “exploits inherent security vulnerabilities in AI models” that allow for tampering, the Standard said. It does this by making changes to the pixels of a digital image that are invisible to the human eye but that distort both the image itself and the text or captions associated with it, skewing how an AI identifies what is in the photo.
This is known as a prompt-specific poisoning attack, and it can cause AI tools to malfunction: mistaking a photo of a cat for a dog, for example, or a handbag for a toaster.
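To make the mechanism concrete, here is a minimal, hypothetical sketch in Python of the general shape of a poison sample: an image nudged within an imperceptible per-pixel budget, paired with a caption for a different concept. This is not Nightshade’s actual algorithm, which optimises the perturbation against a model’s feature extractor rather than using random noise; the file name, the EPSILON budget and the make_poison_sample helper are all illustrative assumptions.

```python
# Toy illustration only -- NOT Nightshade's algorithm. It shows the shape of
# a prompt-specific poison sample: a subtly perturbed image paired with a
# caption for a *different* concept.
import numpy as np
from PIL import Image

EPSILON = 4  # max per-pixel change (out of 255); small enough to be invisible

def make_poison_sample(image_path: str, target_caption: str):
    """Return a subtly perturbed image paired with a mismatched caption."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.int16)

    # A real attack computes this perturbation by optimisation, so the image's
    # *features* resemble the target concept; random noise stands in here.
    delta = np.random.randint(-EPSILON, EPSILON + 1, size=img.shape)

    poisoned = np.clip(img + delta, 0, 255).astype(np.uint8)
    return Image.fromarray(poisoned), target_caption

# e.g. a cat photo captioned as a dog: a model trained on enough such pairs
# starts to associate "dog" prompts with cat-like features.
poisoned_img, caption = make_poison_sample("cat.jpg", "a photo of a dog")
```

In the real attack the perturbation is chosen so that the image still looks like a cat to a person, while its learned features pull the model towards the target concept.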
“This kind of unpredictable response greatly reduces the usefulness of text-to-image models,” The Register said, “meaning model makers have an incentive to train only on data that has been offered freely.”
Zhao likened this to “putting hot sauce in your lunch so it won’t be stolen from the refrigerator at work.”
If applied at scale, Mashable said, the poisoning would be “incredibly difficult” for AI companies to fix, because “each poisoned image would need to be individually identified and removed from the training pool”. This could create a strong incentive for such companies to think twice before trawling the internet and using artists’ work without their explicit consent.
According to The Register, the team is considering releasing Nightshade in conjunction with Glaze, its previous tool, which prevents AI models from learning an artist’s signature “style” by subtly changing pixels.
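Both tools depend on those pixel changes staying below the threshold of human perception. As a rough illustration (not code from either tool), one common way to sanity-check that is a per-pixel comparison of the original and the cloaked image; the file names here are placeholders.

```python
# Illustrative check only: report the largest single-channel pixel change
# between an original image and its cloaked version. Changes of only a few
# intensity levels out of 255 are effectively invisible to the eye, yet can
# still steer what a model learns from the image.
import numpy as np
from PIL import Image

def max_pixel_change(original_path: str, cloaked_path: str) -> int:
    # Assumes both files exist and have identical dimensions.
    a = np.asarray(Image.open(original_path).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(cloaked_path).convert("RGB"), dtype=np.int16)
    return int(np.abs(a - b).max())

print(max_pixel_change("artwork.png", "artwork_cloaked.png"))
```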
Will it work?
In a research paper published last October, the team said Nightshade could provide “a powerful tool for content owners to protect their intellectual property against model trainers that disregard or ignore copyright notices, do-not-scrape/crawl directives, and opt-out lists”.
According to TechCrunch, the goal is not to take down Big AI, but to force tech giants to pay for licensed work rather than training AI models on scraped images.
“There’s a right way to do this,” Zhao agreed. “The real issue here is around consent and compensation. We’re just giving content creators a way to stop unauthorized training.”
Illustrator Eva Toorenent told MIT Technology Review that she hopes Nightshade will change the status quo.
“It’s going to make [AI companies] think twice, because they could take our work without our consent and destroy their entire model,” she said. Another artist, Autumn Beverly, said the tool “helps artists take back power over their own work”.