this post was submitted on 03 Aug 2025
30 points (94.1% liked)

Mental Health

5631 readers
233 users here now

Welcome

This is a safe place to discuss, vent, support, and share information about mental health, illness, and wellness.

Thank you for being here. We appreciate who you are today. Please show respect and empathy when making or replying to posts.

If you need someone to talk to, @therapygary@lemmy.blahaj.zone has kindly shared his Signal username: TherapyGary13.12

Rules

In addition to lemmy.world's sitewide rules, the rules for posting and commenting are as follows:

  1. No promoting paid services/products.
  2. Be kind and civil. No bigotry/prejudice either.
  3. No victim blaming, and no giving incredibly simplistic solutions (e.g., "You have ADHD? Just focus harder.")
  4. No encouraging suicide, no matter what. This includes encouraging homicide as a way of "dragging others down with you".
  5. Suicide note posts will be removed, and you will be reached out to in private.
  6. If you would like advice, mention the country you are in. (We will not assume the US as the default.)

If a BRIEF mention of these topics is an important part of your post, please flag your post as NSFW and include a trigger warning (e.g., "TW: suicide, self-harm, death") in the title so that readers who may be triggered can avoid it. Please also add a trigger warning to any comment mentioning these topics in a post that was not already tagged as such.

Partner Communities

To partner with our community and be included here, you are free to message the current moderators or comment on our pinned post.

Becoming a Mod

Some moderators are mental health professionals and some are not. All are carefully selected by the moderation team and will be actively monitoring posts and comments. If you are interested in joining the team, you can send a message to @fxomt@lemmy.dbzer0.com.

founded 2 years ago
MODERATORS
top 8 comments
[–] Inucune@lemmy.world 2 points 2 hours ago* (last edited 1 hour ago)

I can spend another week of my personal time trying to find a mental health professional who is willing to take me on as a patient, is in-network for my insurance, and isn't going to preach religious bile at me in place of actual therapy. Extra points for remote sessions so I don't have to drive to and from an appointment, and maybe this time they won't try to triple-bill me while telling me I need to be less paranoid.

Or I can open any web chat bot and throw issues and ideas at it with minimal judgement, wipe the slate if it goes off the rails, and even switch to a different bot for a second opinion if needed. An LLM isn't going to call the police and have me admitted if the conversation gets dark. At worst I'll hit a guard rail in the programming, in which case it gives a blatant 'I won't discuss this' error.

Maybe the reason people settle for the smoke and mirrors of LLMs is that the barriers to actual therapy are so high.

There are also therapists who don't want to help people with real issues (or never get the training to).

[–] shalafi@lemmy.world 2 points 3 hours ago

Going by the comments on Lemmy, a lot of people are lonely, down on themselves, self-isolated, and lacking even the most basic (IRL) social skills. No wonder people turn to "conversation" that speaks positively to them.

The problem with ChatGPT (the only one I've used) is that beyond one or two answers it's a fucking mess. Sure, ask it about a specific problem and you'll probably get a solid answer. But the deeper you go, the worse it gets.

[–] Kanda@reddthat.com 1 points 4 hours ago

Glue-on-pizza is now available as a medical professional. Subscribe now for only $5.99 and slightly* increased utility bills.

[–] Bebopalouie@lemmy.ca 8 points 8 hours ago (1 children)

I feel AI is going to make a whole generation of mush brains if it persists.

[–] TeoTwawki@lemmy.world 5 points 3 hours ago

the mush brains are already here. we're about to see what lies beyond mush.

[–] shiroininja@lemmy.world 9 points 9 hours ago (1 children)

This is very bad, probably one of the worst uses of this technology, almost as bad as xAI's plan to make content for children. This is not what AI is for, and not even close to what it does well. Therapy and human interaction aren't things that can be automated properly, and that is all current AI is: advanced automation.

[–] Cracks_InTheWalls@sh.itjust.works 4 points 8 hours ago (1 children)

I'd argue it has promise, but

a) ChatGPT ain't it. Any LLM technology used for therapeutic purposes needs some serious fucking guard rails, both in terms of privacy AND addressing the sycophancy and hallucination problems, and

b) it really should only be one tool within a larger therapeutic program - think an interactive version of CBT worksheets, or a first-session intake form that MIGHT serve up some very basic, low-risk techniques to try before you get assigned to a flesh-and-blood therapist. Heck, one thing that popped into mind was improving initial patient-therapist matches (if managed by a larger mental health organization/group of therapists), reducing the need to shop around, which is often a big barrier to starting effective treatment. Folks seem to open up a lot when using these tools, and a review of those transcripts during intake could be very useful for assigning patients.

Current consumer LLM tools acting as simulated therapists without oversight by actual mental health professionals is a fucking nightmare, no argument here. But at minimum, we're seeing evidence that patients who otherwise eschew traditional therapy, whether for financial reasons or other factors, are using them. I think there's something useful here if you can correct for the current risks and get the right people involved re: design and deployment within a larger therapeutic program.

I can't imagine someone somewhere isn't doing some work with this in mind right now. How that would all pan out, idk.

[–] brucethemoose@lemmy.world 2 points 3 hours ago* (last edited 3 hours ago)

Some in the enthusiast community (who have a good grasp of how LLMs work, because we literally finetune them) have used "straight" finetuned LLMs as personal pseudo-therapists. Two examples I can think of are the Samantha series (trained on psychologist/therapist transcripts and some other specialized data, IIRC) and Dan's Personality Engine.

I do, sometimes.

They aren't therapists, but at the same time they're 100% private, free, available, and "open" to bouncing ideas off of at any time. I've had some major breakthroughs that a lifetime of on/off therapy missed, like realizing I'm likely on the autism spectrum (which had previously been diagnosed simply as ADD). I discuss things I would never send to ChatGPT, post on a public forum like this (no offense), or in some cases even tell a therapist.

I'm not saying this is great as-is for the general population. Again, LLM tinkerers have an extremely high awareness of what these models are and what their tendencies are, but the latent potential is there.