Nightshade is a “data poisoning tool” developed at the University of Chicago to confuse AI programs that generate images. Here’s how it works.
{Categories}Platforms,*ALL*{/Categories}
{URL}https://www.weforum.org/agenda/2023/11/nightshade-generative-ai-poison/{/URL}
{Author}Victoria Masterson{/Author}
{Image}https://assets.weforum.org/editor/XOi84RepF5wFJ3A01JQzIRdnpCqm9qVc9u20oGF965A.jpeg{/Image}
{Keywords}{/Keywords}
{Source}Platforms{/Source}
{Thumb}{/Thumb}
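The lede describes data poisoning: subtly altering an image's pixels and pairing it with a misleading label so that models trained on it learn a wrong association. The sketch below is a toy illustration of that idea only, not Nightshade's actual algorithm (the real tool computes carefully optimized perturbations); the function name, the epsilon bound, and the caption pairing are all illustrative assumptions.

```python
import numpy as np

def poison_image(image, epsilon=0.03, seed=0):
    """Toy data-poisoning sketch: add a small, near-imperceptible random
    perturbation to an image's pixels. Nightshade itself uses targeted,
    optimized perturbations rather than random noise; this only shows
    the 'tiny pixel change, big training effect' idea in miniature."""
    rng = np.random.default_rng(seed)
    perturbation = rng.uniform(-epsilon, epsilon, size=image.shape)
    # Keep pixel values in the valid [0, 1] range after perturbing.
    return np.clip(image + perturbation, 0.0, 1.0)

# A clean image (pixel values in [0, 1]) ...
clean = np.full((64, 64, 3), 0.5)
poisoned = poison_image(clean)

# ... paired with a deliberately wrong caption. A model trained on many
# such pairs can learn to associate the image's content with the wrong
# concept, which is the poisoning effect the article describes.
training_pair = (poisoned, "a photo of a cat")  # hypothetical mislabeled caption
```

To a human viewer the poisoned image looks essentially identical to the original, since each pixel moves by at most `epsilon`; the damage appears only statistically, across many poisoned training examples.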