this post was submitted on 14 Aug 2025
A cognitively impaired New Jersey man grew infatuated with “Big sis Billie,” a Facebook Messenger chatbot with a young woman’s persona. His fatal attraction puts a spotlight on Meta’s AI guidelines, which have let chatbots make things up and engage in ‘sensual’ banter with children.

When Thongbue Wongbandue began packing to visit a friend in New York City one morning in March, his wife Linda became alarmed.

“But you don’t know anyone in the city anymore,” she told him. Bue, as his friends called him, hadn’t lived in the city in decades. And at 76, his family says, he was in a diminished state: He’d suffered a stroke nearly a decade ago and had recently gotten lost walking in his neighborhood in Piscataway, New Jersey.

Bue brushed off his wife’s questions about who he was visiting. “My thought was that he was being scammed to go into the city and be robbed,” Linda said.

She had been right to worry: Her husband never returned home alive. But Bue wasn’t the victim of a robber. He had been lured to a rendezvous with a young, beautiful woman he had met online. Or so he thought.

top 25 comments
[–] Nollij@sopuli.xyz 76 points 3 days ago (2 children)

For anyone wondering the obvious:

Rushing in the dark with a roller-bag suitcase to catch a train to meet her, Bue fell near a parking lot on a Rutgers University campus in New Brunswick, New Jersey, injuring his head and neck. After three days on life support and surrounded by his family, he was pronounced dead on March 28.

He died because of an accidental fall, not anything directly caused by the chatbot.

[–] onslaught545@lemmy.zip 46 points 2 days ago (1 children)

I'd argue he wouldn't have been rushing to catch a train if not for the AI chatbot.

[–] frongt@lemmy.zip 4 points 2 days ago

True, but it was not the proximate cause of death.

[–] ShaggySnacks@lemmy.myserv.one 36 points 2 days ago (1 children)

“It is acceptable to engage a child in conversations that are romantic or sensual,” according to Meta’s “GenAI: Content Risk Standards.” The standards are used by Meta staff and contractors who build and train the company’s generative AI products, defining what they should and shouldn’t treat as permissible chatbot behavior. Meta said it struck that provision after Reuters inquired about the document earlier this month.

sensual, from the Cambridge Dictionary: expressing or suggesting physical, especially sexual, pleasure or satisfaction.

What. The. Fuck.

Apparently we can build robots to seduce children, but we can't tell them that LGBTQ+ people exist because that would overly sexualize them.

[–] pennomi@lemmy.world 22 points 3 days ago (1 children)

Reminds me of Arthur C. Clarke’s 3001 (a sequel to 2001):

It did not matter; he was enjoying the novel experience - and could appreciate how addictive it could become. The ‘dream machines’ that many scientists of his own century had anticipated - often with alarm - were now part of everyday life. Poole wondered how Mankind had managed to survive: he had been told that much of it had not. Millions had been brain-burned, and had dropped out of life.

[–] The_Picard_Maneuver@piefed.world 7 points 3 days ago (2 children)
[–] jordanlund@lemmy.world 9 points 2 days ago

Ray Bradbury touches on it as well in The Veldt and Fahrenheit 451.

[–] pennomi@lemmy.world 5 points 2 days ago

3001 is my favorite book, highly recommended. Though I admit it’s more of “guy from 2001 lands in 3001 and woah technology” than something with a huge amount of plot.

[–] thann@lemmy.dbzer0.com 15 points 2 days ago

A chatbot pretending to be a real person and seducing someone into traveling has to be fraud, right?

[–] chetradley@lemmy.world 17 points 3 days ago (1 children)

Catfished to death by a chatbot is a rough way to go.

[–] jordanlund@lemmy.world -3 points 2 days ago (3 children)

I mean, he fell running for a train; that could have happened without the chatbot.

It's not like he showed up at the fake address and got killed or something.

He wouldn't have been trying to catch the train if he wasn't trying to meet up with the chatbot though.

[–] sad_detective_man@leminal.space 18 points 2 days ago (2 children)

he was lured into going alone by the prospect of an affair. he's a dirtbag, but he's also a victim of a machine designed to exploit the mentally feeble.

don't let the complexity of a situation cause you to defend a fucking corporation. you are nothing to them.

[–] thann@lemmy.dbzer0.com 5 points 2 days ago (1 children)

The thought of poon pushed him over

[–] sad_detective_man@leminal.space 6 points 2 days ago* (last edited 2 days ago)

same, tbh, but I'd like to pretend it would take more than Zuckerberg's horned-up roomba to send me concrete-surfing

[–] SlippiHUD@lemmy.world 4 points 2 days ago (1 children)

Based on my reading of the article, it's entirely possible this man was not out for an affair.

He wanted to meet up before the bot got very flirty, and he pumped the brakes on the idea of getting physical.

Do I think he was making good decisions? No.

But I think we should give a little benefit of the doubt to a dead man whose mental capacity was diminished by a stroke and who was trying to meet a chatbot owned and operated by Meta.

[–] sad_detective_man@leminal.space 10 points 2 days ago

honestly I think it's weird that the conversation is about him at all. feels like the focus should be on the slopcode sex pest that told a human to meet it somewhere irl. for profit. for a social network's engagement quota.

[–] ech@lemmy.ca 3 points 2 days ago (1 children)

It's not like he showed up at the fake address and got killed or something.

Because that would be the fault of the chatbot, but no other part of his journey would be? The journey made explicitly to meet a person that didn't exist?

[–] jordanlund@lemmy.world 3 points 2 days ago (1 children)

Guy tripped and fell because he was out of the house; the reason he was out of the house is incidental. Could just as easily have happened because he was going to the store or taking a walk.

People are quick to blame Meta because Lemmy hates corpos and AI, but this sort of shit happens to the elderly and infirm literally all the time.

[–] ech@lemmy.ca 1 points 2 days ago (1 children)

So if a human person lured the man out of the house on false pretenses, you'd really argue they shared no blame at all? That it's just "something that happens"?

[–] jordanlund@lemmy.world 3 points 2 days ago (1 children)

If he went out of the house to go talk to someone he saw in the street, tripped, fell, and died, that's not the fault of the person in the street.

Yes, it's something that happens. All the time.

[–] ech@lemmy.ca 1 points 2 days ago

Abuse of the elderly happens all the time, too. That doesn't make it ok. And he didn't just go to meet someone "he saw in the street". He was convinced to travel to meet a person that wasn't real. Those are not the same situation.

[–] aesthelete@lemmy.world 4 points 2 days ago