This Free Tool ‘Poisons’ AI Models to Prevent Them From Stealing Your Work — But Some Say It’s Akin to ‘Illegal’ Hacking

Artists have long sought ways to protect their creative works from unsanctioned use, particularly by AI models that train on vast swathes of internet data, often without permission.

Enter Nightshade v1.0: a cutting-edge tool released by computer scientists at the University of Chicago that provides artists with a digital shield to guard their creations against unwanted AI consumption, VentureBeat reported.

Related: What Will It Take to Build a Truly Ethical AI? These 3 Tips Can Help.

Nightshade is the “offensive” counterpart to its predecessor, Glaze — a tool designed to obfuscate an artist’s style from AIs.

Glaze’s changes to works of art are “like UV light” — not detectable by the naked eye. “The models, they have mathematical functions looking at images very, very differently from just how the human eye looks,” Shawn Shan, a graduate researcher at the University of Chicago, told IT Brew.

Similarly, Nightshade embeds pixel-level alterations within an artwork that are imperceptible to the human eye. But its tweaks act as a hallucinogenic “poison” for AI models, causing them to misinterpret the content entirely, according to VentureBeat. A pastoral scene might suddenly be recognized as a fashionable accessory: a cow, for example, becomes a leather purse.
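Nightshade’s actual optimization is far more sophisticated (it crafts perturbations targeted at specific model features), but the basic premise of a bounded, invisible pixel change can be sketched in a few lines. The function and parameter names below are illustrative, not Nightshade’s real code:

```python
import numpy as np

def add_imperceptible_noise(image, epsilon=2, seed=0):
    """Add bounded per-pixel noise (at most +/- epsilon on a 0-255 scale).

    At epsilon=2 the change is invisible to the human eye, yet a model
    reading raw pixel values receives a measurably different input.
    This is only a sketch of the general idea, not Nightshade's method.
    """
    rng = np.random.default_rng(seed)
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    poisoned = np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)
    return poisoned

# Example: a dummy 64x64 RGB "artwork" of uniform gray
art = np.full((64, 64, 3), 128, dtype=np.uint8)
poisoned = add_imperceptible_noise(art)

# No pixel moved by more than epsilon...
assert np.abs(poisoned.astype(int) - art.astype(int)).max() <= 2
# ...yet the image is no longer identical, so a model sees a changed input.
assert not np.array_equal(art, poisoned)
```

The key property is the tight bound on each pixel: the human eye cannot distinguish the two images, while a model — which consumes exact numeric values — can be steered by perturbations designed (unlike this random noise) to push its features toward a wrong label.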

The tool runs on Macs equipped with Apple's M1, M2, or M3 chips and on PCs running Windows 10 or 11.

Related: Google Sues Hackers For Making Fake Advertisements to Download Bard AI Technology

Many artists, including Kelly McKernan — a plaintiff in the highly publicized copyright infringement class-action lawsuit against AI art firms, including Midjourney and DeviantArt — have welcomed Nightshade with open arms, per the outlet. However, critics denounce the tool as a veiled attack on AI models and companies, with one going so far as to call it “illegal” hacking.

The development team behind Nightshade stands by its creation, arguing that the intention is not to wreak havoc on AI models but to tip the economic scales: making it less financially viable to ignore artists’ copyrights and more attractive to pursue lawful licensing agreements.
