

Artists can use a data poisoning tool to confuse DALL-E and corrupt AI scraping

Image: OpenAI

The fight over the data used to train AI models has become more poisonous.

A new tool called Nightshade lets artists apply it to their creative work, corrupting, or poisoning, any training data that uses that art. Over time, it can damage future models behind AI art platforms like DALL-E, Stable Diffusion, and Midjourney, undermining their ability to create images.

Nightshade adds invisible changes to the pixels in a piece of digital art. When the work is ingested by a model for training, the "poison" exploits a security vulnerability that confuses the model, so that it no longer reads an image of a car as a car and instead comes up with a cow.
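To make "invisible changes to pixels" concrete, here is a minimal Python sketch of a small pixel-level perturbation. This is not Nightshade's actual algorithm, which reportedly crafts targeted changes so a model learns the wrong concept; the sketch only shows that pixel values can be nudged by an amount too small to notice while still changing the numbers a training pipeline ingests. The file names and the epsilon value are illustrative assumptions.

# Conceptual sketch only; not Nightshade's method.
# Adds a tiny pseudo-random offset (at most +/- epsilon per channel) to every pixel.
import numpy as np
from PIL import Image

def add_invisible_perturbation(in_path: str, out_path: str, epsilon: int = 2) -> None:
    # Load the image as signed integers so small negative offsets don't wrap around.
    img = np.asarray(Image.open(in_path).convert("RGB"), dtype=np.int16)
    rng = np.random.default_rng(seed=0)  # fixed seed so the perturbation is reproducible
    noise = rng.integers(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    # Clamp back to the valid 0-255 range and save the perturbed copy.
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(out_path)

# Example usage (file names are placeholders):
# add_invisible_perturbation("artwork.png", "artwork_shaded.png")

A real attack would optimize the perturbation toward a specific wrong label rather than using random noise, but the visual effect on the image is similarly imperceptible.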

The MIT Technology Review reported that Ben Zhao, a professor at the University of Chicago and one of the…

