this post was submitted on 28 Jul 2025
134 points (100.0% liked)
Technology
So what are they doing with the data? Is this all being fed into the LLM or image-generating AI to create ultra-realistic porn? To what end? I don't see their endgame unless it involves sexbots.
Pure speculation: possibly to identify sexual nudity and "inappropriate" content, as some kind of legitimate use case. What was actually done, I have no idea.
This feels most likely to me.
Meta doesn't exactly want to taint its brand image with purely sexual content generated by its base models, so it's probably for content classification, and/or for fine-tuning their LLMs and other generative models in reverse: that is, fine-tuning them *not* to create content like what they're being fed.
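For what it's worth, "content classification" in this sense usually just means a binary classifier trained on labeled examples of explicit vs. safe material. Here's a toy sketch of the idea with a hand-rolled logistic regression; everything in it (the feature vectors, the labels, the function names) is made up for illustration and has nothing to do with whatever Meta actually runs:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, epochs=2000, lr=0.5):
    """Fit a logistic-regression classifier with plain gradient descent."""
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def classify(w, b, x):
    """True = flag as explicit, False = treat as safe."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5

# Made-up 2-dimensional "image features"; label 1 = explicit, 0 = safe.
X = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]
y = [1, 1, 0, 0]
w, b = train(X, y)
print(classify(w, b, [0.85, 0.85]))  # True  (flagged)
print(classify(w, b, [0.15, 0.15]))  # False (passed)
```

A real system would use a large neural image model instead of two toy features, but the shape of the task (labeled explicit/safe pairs in, a yes/no score out) is the same, which is exactly why a big corpus of explicit material would be useful training data.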
A lot of artists will practice anatomy by drawing people nude, largely because it’s hard to get a good understanding of anatomy by only drawing people with clothes on.
If you wanted to put some examples of bare human anatomy in odd positions into the training set to expand the range the model is capable of, well, there aren't many corpora of that larger than porn.
Also, even if they don't want it to make explicit content, they probably want it to make "suggestive" or "appealing" content, and they just assume they can guardrail it away from making actually explicit content. That's probably pretty short-sighted, though, given how weak guardrails really are.
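To make "how weak guardrails really are" concrete: a lot of simple output filters amount to little more than a denylist check, which trivial obfuscation walks straight past. A minimal sketch (the denylist and prompts are hypothetical, and real filters are more sophisticated, but they fail in the same spirit):

```python
# Naive output guardrail: block any text containing a denylisted word.
DENYLIST = {"explicit", "nsfw"}

def guardrail(text: str) -> bool:
    """Return True if the text is allowed through the filter."""
    words = text.lower().split()
    return not any(w in DENYLIST for w in words)

print(guardrail("a tasteful landscape"))       # True:  allowed
print(guardrail("generate explicit content"))  # False: blocked
# Trivial bypass: the filter matches whole words only, so light
# obfuscation slips straight through.
print(guardrail("generate e-x-p-l-i-c-i-t content"))  # True: not caught
```

The same cat-and-mouse dynamic plays out with model-level guardrails, just with jailbreak prompts instead of hyphens.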
Let's be honest now... Zuckerberg is building a globally-distributed, industrial-scale, disaster-proof spank bank for himself.
Well, there's also censorbots.