this post was submitted on 26 Oct 2023
72 points (100.0% liked)

technology

23218 readers

On the road to fully automated luxury gay space communism.

Spreading Linux propaganda since 2020

founded 5 years ago
:sicko-blur:

[–] sempersigh@hexbear.net 13 points 2 years ago (1 children)

Honestly this doesn’t sound like something they can’t just train the model to recognize and account for; at best it’s a short-term roadblock.

[–] mayo_cider@hexbear.net 6 points 2 years ago

It's not that easy: even if the neural network is trained to recognize poisoned images, you'd still need to strip the poisoned data out of the image before you could categorize it properly. Without the original, non-poisoned image or human intervention, that's going to be exceedingly hard.
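A toy sketch of the detect-vs-remove distinction (everything here is illustrative; this is not any real poisoning tool or detector): even if you can flag a perturbed image, a generic cleaning pass neither recovers the original pixels nor leaves genuinely clean images untouched.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 8x8 "image": a smooth gradient, plus a small additive
# perturbation standing in for the poison.
base = np.linspace(0.0, 1.0, 8)
original = np.outer(base, base)
perturbation = rng.normal(0.0, 0.05, original.shape)
poisoned = np.clip(original + perturbation, 0.0, 1.0)

def mean_filter(img):
    """Naive 3x3 mean filter (edges handled by padding), standing in
    for a generic 'cleaning' step applied to suspect images."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 9 + dy, 1 + dx : 9 + dx]
    return out / 9.0

cleaned = mean_filter(poisoned)

# Filtering suppresses the noise but cannot reconstruct the exact
# original, and the same filter also distorts clean images.
residual = np.abs(cleaned - original).mean()
distortion = np.abs(mean_filter(original) - original).mean()
print(residual, distortion)
```

Both numbers stay above zero: without the original image (or a model of the exact perturbation), "cleaning" is lossy guesswork, which is why detection alone doesn't solve the problem.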

This is going to be an arms race, but luckily the AI has to pull a few correct answers out of a large pool of possibilities, while the poison only has to keep it from finding them. That, combined with the effort of retraining the models every time a new version of the poison pops up, should keep the balance on the artists' side, at least for a while.