Daxtron2

joined 2 years ago
[–] Daxtron2@startrek.website 1 points 2 years ago (1 children)

I've seen no evidence of that. There are cases tried under obscenity laws, but CSAM has a pretty clear definition of being visual.

[–] Daxtron2@startrek.website 2 points 2 years ago* (last edited 2 years ago) (4 children)

Yes, I know, it definitely should be curbed. However, better privacy laws could still be written to disallow that type of manipulative content.

[–] Daxtron2@startrek.website 5 points 2 years ago (1 children)

What exactly do you think erotic roleplay means?

[–] Daxtron2@startrek.website 17 points 2 years ago (6 children)

Would've preferred better privacy laws over a direct ban imo

[–] Daxtron2@startrek.website 12 points 2 years ago

Thank you, I've been trying to get this point across for months

[–] Daxtron2@startrek.website 4 points 2 years ago (6 children)

How can text ever possibly be CSAM when there's no child or sexual abuse involved?

[–] Daxtron2@startrek.website 3 points 2 years ago

People really need to work on their ability to spot fakes. My mom recently showed me these and I immediately said, "That's AI-generated."

[–] Daxtron2@startrek.website 6 points 2 years ago

Digestion itself is a rotting process using our gut bacteria

[–] Daxtron2@startrek.website 2 points 2 years ago

You got it backwards, it's a media-run state.

[–] Daxtron2@startrek.website 2 points 2 years ago

Certainly makes it easy when they expose themselves

[–] Daxtron2@startrek.website 7 points 2 years ago

I think it was poor wording, but he's saying: Trump = current genocide worsens + new genocides in the US. Biden = current genocide stays the same, no new genocide in the US.

[–] Daxtron2@startrek.website 7 points 2 years ago

Certainly could be. Biden should absolutely be criticized for his shortcomings, as any president should. But there's no better option right now, so it's really about damage control more than anything.
