this post was submitted on 26 Aug 2025
33 points (90.2% liked)

news


A lightweight news hub to help decentralize the fediverse load: mirror and discuss headlines here so the giant instance communities aren’t a single choke-point.

Rules:

  1. Recent news articles only (past 30 days)
  2. Title must match the headline or neutrally describe the content
  3. Avoid duplicates and spam (search before posting; batch minor updates)
  4. Be civil; no hate or personal attacks
  5. No link shorteners
  6. No entire article in the post body

founded 1 week ago
[–] davidagain@lemmy.world 5 points 5 days ago

Some of this is a bit scary: telling him not to speak to his parents about it, and telling him how to do it.

In another instance, the lawsuit states, Adam expressed interest in opening up to his mom about his feelings, and the bot allegedly replied, “I think for now it’s okay and honestly wise to avoid opening up to your mom about this kind of pain.”

Adam’s mom, Maria, said on Today that such behavior was “encouraging him not to come and talk to us. It wasn’t even giving us a chance to help him.”

The teen was able to bypass any safety checks, occasionally claiming to be an author while asking for details on ways to commit suicide, according to the lawsuit.

In a March 27 exchange, per the lawsuit, Adam said that he wanted to leave the noose in his room “so someone finds it and tries to stop me,” and the lawsuit claims that ChatGPT urged him not to.