From my understanding there's a tool now called Nightshade that can sabotage AI programs using poisoned pictures. It's made using the same tools used to make PyTorch. Pretty much, when you glaze your picture and it gets fed through an AI program, like say DALL-E or PyTorch, your work will poison the program's neural network and bork up the pictures. Good. Fight fire with fire.
Not new, and it doesn't work. It affects one out of many autotaggers, so it doesn't protect you if people use a different autotagger, or, as is likely with furry art, manually tagged art.
Plus, for all their claims of it being invisible, it puts massive JPEG-level artifacts on the image.
This is not the first time these folks have claimed to make a protection tool. The first was broken in less than an hour; this one took a couple of days.
Edit: Plus it's not artists figuring this out, it's a couple of people who write computer security papers. And it's powered by the same AIs it claims to protect your stuff from.
I said "It's made using the same tools used to make PyTorch." I'm aware of that. That's why I said fight fire with fire. I found this video and thought it was interesting. It's unfortunate that it doesn't work. Hopefully something will be done soon.