this post was submitted on 05 Dec 2023
6 points (75.0% liked)
Stable Diffusion
I'd agree, with the caveat that a model that may have been trained on actual CSAM is a problem. Anything that is an actual product of, or cannot exist without, child abuse should be absolutely prohibited and avoided.
I'm not sure whether such a model is out there, but now that I imagine it, I assume it's inevitable one will exist. Apart from being disturbing to think about, that introduces another problem: once it happens, how will anyone know the model was trained on such images?
Oh, I can totally agree with the training part, but that can't be fought at the AI level; it needs to be stopped at the CSAM level.