University of Chicago researchers release Nightshade to the public, a tool intended to "poison" images in order to ruin generative models trained on them.
zepplenzap @lemmy.sdf.org · Posts 0 · Comments 6 · Joined 2 yr. ago
If this technology is so great, why doesn't the site show any before / after examples, let alone demonstrate that it does what the researchers claim?