FaceDeer

[–] FaceDeer@fedia.io 12 points 1 year ago (5 children)

It's possible to legally photograph young people. Completely ordinary legal photographs of young people exist, from which an AI can learn the concept of what a young person looks like.

[–] FaceDeer@fedia.io 9 points 1 year ago (4 children)

Well, your philosophy runs counter to the fundamentals of Western justice systems, then.

[–] FaceDeer@fedia.io 7 points 1 year ago (3 children)

Yes. You're saying that the AI trainers must have had CSAM in their training data in order to produce an AI that is able to generate CSAM. That's simply not the case.

You also implied earlier on that these AIs "act or respond on their own", which is also not true. They only generate images when prompted to by a user.

The fact that an AI is able to generate inappropriate material just means it's a versatile tool.

[–] FaceDeer@fedia.io 6 points 1 year ago

This comment thread started with you implying that the AI was trained on illegal material; I'm really not sure how we got from that to this point.

[–] FaceDeer@fedia.io 12 points 1 year ago (15 children)

> You obviously don't understand squat about AI.

Ha.

> AI only knows what has gone through its training data, both from the developers and the end users.

Yes, and as I've said repeatedly, it's able to synthesize novel images from the things it has learned.

If you train an AI with pictures of green cars and pictures of red apples, it'll be able to figure out how to generate images of red cars and green apples for you.
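A minimal sketch of that claim, assuming the Hugging Face diffusers library and an illustrative Stable Diffusion checkpoint (neither is specified anywhere in this thread): prompt a general-purpose text-to-image model for attribute/object pairings it may never have seen together, and it composes them from concepts it learned separately.

```python
# Minimal sketch, not a claim about any specific model's training set:
# a general text-to-image pipeline composes separately learned concepts
# ("red", "green", "car", "apple") into combinations it wasn't shown paired.
# Assumes the Hugging Face diffusers library; the checkpoint id is illustrative.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # illustrative checkpoint id
    torch_dtype=torch.float16,
).to("cuda")

for prompt in ["a red apple", "a green car", "a green apple", "a red car"]:
    image = pipe(prompt).images[0]  # novel combinations are handled the same way
    image.save(prompt.replace(" ", "_") + ".png")
```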

[–] FaceDeer@fedia.io 9 points 1 year ago (5 children)

You realize that there are perfectly legal photographs of female genitals out there? I've heard it's actually a rather popular photography subject on the Internet.

> Do you see where I'm going with this? AI only knows what people allow it to learn...

Yes, but the point here is that the AI doesn't need to learn from any actually illegal images. You can train it on perfectly legal images of adults in pornographic situations, and also perfectly legal images of children in non-pornographic situations, and then when you ask it to generate child porn it has all the concepts it needs to generate novel images of child porn for you. The fact that it's capable of that does not in any way imply that the trainers fed it child porn in the training set, or had any intention of it being used in that specific way.

As others have analogized in this thread, if you murder someone with a hammer, that doesn't make the people who manufactured the hammer guilty of anything. Hammers are perfectly legal. It's how you used it that is illegal.

[–] FaceDeer@fedia.io 10 points 1 year ago (2 children)

You suggested a situation where "many people would get off charges of real CSAM because the prosecutor can't prove that it wasn't AI generated." That implies that in that situation AI-generated CSAM is legal. If it's not legal then what does it matter whether it's AI-generated or not?

[–] FaceDeer@fedia.io 13 points 1 year ago (17 children)

First, you need to figure out exactly what it is that the "blame" is for.

If the problem is the abuse of children, well, none of that actually happened in this case so there's no blame to begin with.

If the problem is possession of CSAM, then that's on the guy who generated them, since they didn't exist at any point before then. The trainers wouldn't have needed any of that in the training set, so if you want to blame them you're going to need a completely separate investigation into that; the ability of the AI to generate images like that doesn't prove anything on its own.

If the problem is the creation of CSAM, then again, it's the guy who generated them.

If it's the provision of general-purpose art tools that were later used to create CSAM, then sure, the AI trainers are in trouble. As are the camera makers and the pencil makers, as I mentioned sarcastically in my first comment.

[–] FaceDeer@fedia.io 22 points 1 year ago (14 children)

Better a dozen innocent men go to prison than one guilty man go free?
