this post was submitted on 05 Dec 2023
6 points (75.0% liked)

Stable Diffusion: Discuss matters related to our favourite AI Art generation technology
While I think this is a bit sensationalised, any company that allows user-driven generative AI, especially one as open as permitting LoRAs and arbitrary checkpoints, needs very good protection against synthetic CSAM like this. To the best of my knowledge, only the AI Horde has taken this sufficiently seriously until now.

[–] x4740N@lemmy.world 2 points 2 years ago

I agree with you

With fictional content there is no child involved, and there certainly isn't any living being involved

I just wish governments in countries that have made this class of fictional images illegal would go after real child molesters instead of the makers and consumers of fictional images in which no living being is involved

People who make and consume fictional content won't harm anyone