Artists may soon have a new weapon to help protect their work from becoming machine learning fodder. Called Nightshade, the tool makes tiny changes to the pixels in a digital artwork to effectively "poison" it, rendering the image useless for the purposes of training AI.
MIT Technology Review reports that a team led by University of Chicago professor Ben Zhao submitted Nightshade for peer review at the USENIX computer security conference. The software works by making small edits to an image that, while invisible to the human eye, cause AI algorithms to completely misidentify its contents.
For example, an artist may paint a picture of a cat that can clearly be identified as a feline by any human or AI that examines it. However, upon applying Nightshade, humans will still see the same image while AI will incorrectly believe it's a dog.
Flood the AI with enough bad training material like this, and soon a request for an image of a cat will cause it to generate a dog instead.
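The mechanism described above can be sketched in a few lines. This is a conceptual illustration only, not Nightshade's actual algorithm: here the perturbation is random noise, whereas Nightshade computes changes specifically optimized to shift an image's machine-perceived features toward a different concept. The function name and epsilon value are assumptions made for the example.

```python
import numpy as np

def poison_image(image, epsilon=0.02, seed=0):
    """Add a small, visually imperceptible perturbation to an image.

    Stand-in for Nightshade's optimized perturbations: the noise here
    is random, while the real tool crafts pixel changes that push the
    image's features toward another concept (e.g. cat -> dog).
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    # Keep pixel values in the valid [0, 1] range.
    return np.clip(image + noise, 0.0, 1.0)

# A stand-in "cat" image: 64x64 RGB, pixel values in [0, 1].
cat = np.full((64, 64, 3), 0.5)
poisoned = poison_image(cat)

# To a human viewer the two images are nearly identical:
# every pixel differs by at most epsilon.
print(np.abs(poisoned - cat).max())

# But the poisoned copy enters a scraped training set paired with the
# wrong concept, so a model trained on enough such pairs learns a
# corrupted association between "cat" images and the label "dog".
training_pair = (poisoned, "dog")
```

The key point the sketch captures is that the attack lives in the training data, not the model: each poisoned image looks normal on its own, which is why filtering them out afterward is so difficult.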
Of course, just one poisoned image is unlikely to have a significant effect on an AI image generator's algorithm. Its training data would need to be tainted by thousands of altered images before a real impact is seen.
However, AI image generators are known to indiscriminately scoop up thousands of new samples from the internet in order to refine their algorithm. If enough artists upload their images with Nightshade applied, it could eventually make such AI tools unusable.
It would also be incredibly difficult for AI companies to fix the issue, as each poisoned image must be individually identified and removed from their training pool. This could create a powerful incentive for such companies to think twice before dragging a trawl net through the internet and using artists' work without their explicit consent.
This isn't the first AI-disrupting tool Zhao's team has created. The group previously released Glaze, a tool that disguises an artist's personal style in a similar manner. Nightshade will eventually be integrated into Glaze, as well as made open source so that others can build on the team's work protecting artists.
Topics: Artificial Intelligence