this post was submitted on 05 Dec 2023
6 points (75.0% liked)
Stable Diffusion
Discuss matters related to our favourite AI Art generation technology
Also see
- Stable Diffusion Art (See its sidebar for more GenAI Art comms)
- !aihorde@lemmy.dbzer0.com
you are viewing a single comment's thread
So far I'm the only commenter who is fine with this. The problem with CSAM, to me, is children being molested. If it's art, or stories, or basically made up and not reality, then I'm pretty much fine with it. I may not want to consume it myself, but I don't see a problem with it.
I agree with you
With fictional content there is no child involved, and there certainly isn't anything living involved.
I just wish governments in countries that have made this class of fictional images illegal would go after real child molesters instead of the makers and consumers of fictional images where no living being is involved.
People who consume and make fictional content won't harm anyone.
I'd agree, with the caveat that a model that may have been trained on actual CSAM is a problem. Anything that is an actual product of, or cannot exist without, child abuse should be absolutely prohibited and avoided.
I'm not sure whether such a model is out there, but now that I imagine it, I assume it's inevitable one will appear. Apart from being disturbing to think about, that introduces another problem: once it happens, how will anyone know the model was trained on such images?
Oh, I can totally agree on the training part, but that can't be fought at the AI level; it needs to be stopped at the CSAM level.