
Not The Onion
pelespirit@sh.itjust.works 30 points 6 days ago

A 60-year-old man wound up in the hospital after seeking dietary advice from ChatGPT and accidentally poisoning himself.

According to a report published in the Annals of Internal Medicine, the man wanted to eliminate salt from his diet and asked ChatGPT for a replacement.

The AI platform recommended sodium bromide, a chemical often used in pesticides, as a substitute. The man then purchased sodium bromide online and used it in place of table salt for three months.