this post was submitted on 11 Dec 2025
548 points (96.6% liked)
Technology
you are viewing a single comment's thread
They obviously did if they banned him for it; and if they're training on CSAM and refuse to do anything about it, then yeah, they have a connection to it.
Also, the data set wasn't hosted, created, or explicitly used by Google in any way.
It was a common data set used in various academic papers on training nudity detectors.
Did you seriously just read the headline, guess what happened, and are now arguing, based on that guess, that I, who actually read the article, am wrong about its content? Because that's sure what it feels like reading your comments...
Google doesn't ban for hate or feels; they ban by algorithm. The algorithms address legal responsibilities and concerns. Are the algorithms perfect? No. Are they good? Debatable. Is it possible to replace those algorithms with "thinking human beings" who do a better job? Also debatable; from a legal standpoint, they're probably much better off arguing from a position of algorithmic rather than human screening.
So you didn't read my comment then, did you?
He got banned because Google's automated monitoring system, entirely correctly, detected that the content he unzipped contained CSAM. It wasn't even a manual decision to ban him.
His ban had literally nothing whatsoever to do with the fact that the CSAM was part of an AI training data set.
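For anyone wondering what "automated monitoring" usually means in practice: a minimal sketch of exact-hash matching against a known-content hash list, the kind of check cloud providers are commonly described as running when files are uploaded or unpacked. Everything below (the `dataset.zip` name, the placeholder hash set, the choice of SHA-256) is illustrative only; real systems reportedly also use perceptual hashing and classifiers, which this does not attempt to implement.

```python
import hashlib
import zipfile
from pathlib import Path

# Placeholder stand-in for an industry hash database of known illegal content.
# Real deployments use curated lists (and perceptual hashes), not literal SHA-256
# digests hard-coded like this.
KNOWN_BAD_SHA256 = {
    "0" * 64,  # placeholder entry only
}


def sha256_of(data: bytes) -> str:
    """Exact cryptographic digest of a file's bytes."""
    return hashlib.sha256(data).hexdigest()


def scan_archive(path: Path) -> list[str]:
    """Return names of archive members whose digest matches the known-bad list."""
    flagged = []
    with zipfile.ZipFile(path) as zf:
        for member in zf.namelist():
            if sha256_of(zf.read(member)) in KNOWN_BAD_SHA256:
                flagged.append(member)
    return flagged


if __name__ == "__main__":
    hits = scan_archive(Path("dataset.zip"))  # hypothetical archive name
    if hits:
        # At this point a real provider would file a report and suspend the
        # account automatically, with no human decision in the loop.
        print(f"Flagged {len(hits)} member(s):", hits)
```

The point of the sketch is that a match fires on the file contents alone; the system has no notion of *why* the file was on the account, which is consistent with the ban having nothing to do with it being part of a training data set.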