this post was submitted on 12 Aug 2025
416 points (95.8% liked)

Fuck AI

3735 readers
483 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 1 year ago

Ran into this, it's just unbelievably sad.

"I never properly grieved until this point" - yeah buddy, it seems like you never started. Everybody grieves in their own way, but this doesn't seem healthy.

[–] NoodlePoint@lemmy.world 8 points 2 days ago

Six years ago I was expecting this to be a mere novelty. Now it's a fucking threat.

[–] Rumo161@feddit.org 20 points 3 days ago

Big Brother is now also this guy's dead wife.

[–] Honytawk@feddit.nl -3 points 1 day ago (1 children)

It literally says the wife was killed in a car accident.

What kind of dumb clickbaity title is this crap? Was it generated by AI or something?

[–] tazeycrazy@feddit.uk 27 points 3 days ago (1 children)

You have two deaths: the death when you are no longer alive physically, and a second death when the last person to speak your name, or the last person who knew you, also dies. This may, in a hacky way, create a third death: the last message that your post-mortem avatar speaks like you. What have we released into the world? I hope this guy can handle the psychological experiment that this is bringing us.

[–] webghost0101@sopuli.xyz 23 points 3 days ago (1 children)

There are already more than 3; this won't really change them.

  1. Physical death

  2. When the last person that knew you dies/forgets

  3. When the last record of your life disappears (photo/certificate)

  4. When such a vast amount of time has washed over that any and all effects you had on the universe are unmeasurable, even to an all-knowing entity. (Post-heat-death vs. butterfly effect)

[–] ZDL@lazysoci.al 23 points 3 days ago

Maybe it's time to start properly grieving instead of latching onto a simulacrum of your dead wife? Just putting that out there for the original poster (not the OP here, to be clear)?

[–] ideonek@piefed.social 20 points 3 days ago (1 children)

I guarantee you that, if it isn't already a thing, a "capability" like this will be used as a marketing selling point sooner or later. It only remains to be seen whether it will be openly marketed or only "whispered", disguised as cautionary tales.

[–] magic_lobster_party@fedia.io 19 points 3 days ago

This is definitely going to become a thing. Upload chat conversations, images and videos, and you’ll get your loved one back.

Massive privacy concern.

[–] mavu@discuss.tchncs.de 10 points 3 days ago (1 children)

Looks like OpenAI crawled this blog: https://medium.com/@saujanyapdl22/what-my-heart-h-d3c9845badfb among the myriad other things. I wonder if the author approves.

[–] NikkiDimes@lemmy.world 6 points 2 days ago* (last edited 2 days ago)

...that's a lot of em dashes. I don't know if the similar language is because this influenced ChatGPT.


Edit:

Indeed, I found what I believe to be her LinkedIn. She's a self-proclaimed "tech explorer" going to school for "Computer Applications in an Information Technology Course." Unlike her article, her bio has multiple typos and not a single em dash. Perhaps the LinkedIn profile is someone else's, but if it's hers I'd assume the article is definitely AI-written. Additionally, even her Medium "about me" bio has some issues that do not occur in her writing elsewhere.

Sad as hell, if true. I'd rather see a poorly written, typo-laden mess from the heart of a human being filled with emotion than edited AI slop.

[–] peoplebeproblems@midwest.social 14 points 3 days ago (5 children)

Hmmmm. This gives me an idea of an actual possible use of LLMs. This is sort of crazy, maybe, and should definitely be backed up by research.

The responses would need to be vetted by a therapist, but what if you could have the LLM act as you, and have it challenge your thoughts in your own internal monologue?

[–] JoeBigelow@lemmy.ca 17 points 3 days ago

Shit, that sounds so terrible and SO effective. My therapist already does a version of this and it's like being slapped, I can only imagine how brutal I would be to me!

[–] Jayjader@jlai.lu 8 points 3 days ago

This would be great but how do you train an LLM to act as you? You'd need to be recording your thoughts and actions, not only every bit of speech you utter and every character you type on a device.

And as far as I'm aware, we don't know how to rapidly or efficiently train transformer-based architectures anywhere near the size needed to act like ChatGPT-3.5, let alone 4o etc., so you'd also need to be training this thing for a while before you could start using it to introspect, by which point you may already no longer behave the same.

[–] july@leminal.space 17 points 3 days ago

It’s far easier for one’s emotions to gather a distraction rather than go through the whole process of grief.

[–] tmyakal 7 points 3 days ago (1 children)

This reminds me of a Joey Comeau story I saw published fifteen or twenty years ago. The narrator's wife had died, so he programmed a computer to read her old journal entries aloud in her voice, at random throughout the day.

It was profoundly depressing and meant to be fiction, but I guess every day we're making manifest the worst parts of our collective imagination.
