this post was submitted on 12 Aug 2025
417 points (95.8% liked)
Fuck AI
3749 readers
"We did it, Patrick! We made a technological breakthrough!"
A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.
founded 1 year ago
Hmmmm. This gives me an idea for an actual possible use of LLMs. It's sort of crazy, maybe, and would definitely need to be backed up by research.
The responses would need to be vetted by a therapist, but what if you could have the LLM act as you, and have it challenge your thoughts in your own internal monologue?
That would require an AI to be able to correctly judge maladaptive thinking from healthy thinking.
No, LLMs can't judge anything; that's half the reason this mess exists. The key here is to give the LLM enough information about how you talk to yourself in your mind for it to generate responses that sound the way you do in your own head.
That's also why you have a therapist vet the responses. I can't stress that enough. It's not something you would let anyone just have and run with.
Seems like you could just ditch the LLM and keep the therapist at that point.
Shit, that sounds so terrible and SO effective. My therapist already does a version of this and it's like being slapped, I can only imagine how brutal I would be to me!
This would be great, but how do you train an LLM to act as you? You'd need to record not just every bit of speech you utter and every character you type on a device, but your thoughts and actions too.
And as far as I'm aware, we don't know how to rapidly or efficiently train transformer-based architectures anywhere near the size needed to act like ChatGPT-3.5, let alone 4o etc., so you'd also need to train this thing for a while before you could start using it to introspect, by which point you may no longer behave the same way.