this post was submitted on 12 Aug 2025
416 points (95.8% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 1 year ago

Ran into this, it's just unbelievably sad.

"I never properly grieved until this point" - yeah buddy, it seems like you never started. Everybody grieves in their own way, but this doesn't seem healthy.

[–] prole@lemmy.blahaj.zone 94 points 3 days ago (3 children)

We are witnessing the emergence of a new mental illness in real time.

[–] Denjin@feddit.uk 42 points 2 days ago (2 children)

Sadly this phenomenon isn't even new. It's been here for as long as chatbots have.

The first "AI" chatbot was ELIZA, written by Joseph Weizenbaum at MIT in the mid-1960s. It did little more than reflect your own statements back at you as questions.

"I feel depressed"

"Why do you feel depressed?"

He saw it as a fun distraction, but was shocked when his secretary, whom he had encouraged to try it, asked him to leave the room while she talked to it, because she was treating it like a psychotherapist.
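The reflection trick described above is easy to demonstrate. Here is a minimal sketch of an ELIZA-style responder, assuming a couple of invented pattern-to-template rules for illustration; it is not Weizenbaum's actual DOCTOR script, just the same pattern-match-and-reflect idea:

```python
# Minimal ELIZA-style reflection sketch (illustrative; not Weizenbaum's code).
import re

# Hypothetical pattern -> response templates in the spirit of the DOCTOR script.
RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.+)", re.I), "Tell me more about your {0}."),
]

def respond(text: str) -> str:
    """Return the first matching reflection, or a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            # Reflect the user's own words back as a question.
            return template.format(match.group(1).rstrip(".!?"))
    return "Please go on."  # fallback when nothing matches

if __name__ == "__main__":
    print(respond("I feel depressed"))  # -> Why do you feel depressed?
```

The whole program is a handful of regex rules and string substitutions, which is what makes the anthropomorphizing reaction it provoked so striking.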

[–] JackbyDev@programming.dev 32 points 2 days ago (2 children)

Turns out the Turing test never mattered when we've been willing to suspend our disbelief all along.

[–] ZDL@lazysoci.al 7 points 2 days ago

The question has never been "will computers pass the Turing test?" It has always been "when will humans stop failing the Turing test?"

[–] UltraMagnus@startrek.website 6 points 2 days ago (1 children)

Part of me wonders if the way our brains humanize chat bots is similar to how our brains humanize characters in a story. Though I suppose the difference there would be that characters in a story could be seen as the author's form of communicating with people, so in many stories there is genuine emotion behind them.

[–] bobbyguy@lemmy.world 3 points 1 day ago

i feel like there must be some instinctual reaction where your brain goes: oh look! i can communicate with it, it must be a person!

and with this guy specifically it was: if it acts like my wife and i can't see my wife, it must be my wife

it's not a bad thing that this guy found a way to cope; the bad part is that he went to a product made by a corporation. but if this genuinely helped him i don't think we can judge

Now you got me remembering Dr. Sbaitso. Tell me about your problems.

[–] net00@lemmy.today 22 points 2 days ago (2 children)

Yeah, the chatgpt subreddit has been full of stories like this since GPT-5 went live. This isn't some weird isolated case. I had no idea people were unironically creating friends, family, and more with it.

Is it actually that hard to talk to another human?

[–] Lumisal@lemmy.world 10 points 2 days ago (1 children)

I think it's more that many countries don't have affordable mental healthcare.

It costs a lot more to pay for a therapist than to use an LLM.

And a lot of people need therapy.

[–] S0ck@lemmy.world 5 points 2 days ago

The robots don't judge, either. And you can be as cruel, as stupid, as mindless as you want. And they will tell you how amazing and special you are.

Advertising was already the science of psychological warfare, and AI is trained on all its tools and methods for manipulating humans. We're devastatingly fucked.

[–] ArrrborDAY@lemmy.dbzer0.com 11 points 2 days ago

Delusional conditions aren't new. Just how they are facilitated changes.