this post was submitted on 03 Aug 2025
277 points (98.6% liked)

Technology
all 48 comments
[–] sugar_in_your_tea@sh.itjust.works 1 points 17 minutes ago

I really don't understand the "LLM as therapy" angle. There's no way people using these services understand what is happening underneath. So wouldn't this just be textbook fraud then? Surely they're making claims that they're not able to deliver.

I have no problem with LLM technology and occasionally find it useful; I have a problem with grifters.

[–] Doomsider@lemmy.world 2 points 1 hour ago

This should be straight up illegal. If a company allows it then they should 100% be liable for anything that happens.

[–] DFX4509B_2@lemmy.org 1 points 2 hours ago* (last edited 2 hours ago)

Not good, not good at all. In fact, I can imagine this trend backfiring in the worst way possible when ChatGPT or other LLMs encourage people who were on the verge of snapping to finally snap and commit horrible acts, instead of bringing them back from the brink like some form of actual therapy would ideally do.

Like, any form of legitimate outlet is better than taking your issues out on an LLM that will more likely than not just make things worse.

You got LibreOffice? You got a digital diary you can write into, for example. Got some basic art/craft supplies? There's another legit outlet to put your feelings into right there (bonus points for clay as a more physical outlet), and you get the idea. Even without actually going to seek professional help, you likely have plenty of legitimate outlets within easy reach which aren't LLMs.

[–] redwattlebird@lemmings.world 2 points 5 hours ago

This is what happens when mental healthcare is not accessible to people.

[–] VintageGenious@sh.itjust.works 2 points 17 hours ago

In a way it's similar to pseudoscientific alternative therapies: people will say it's awesome and harmless since it helps people, but since it's based on nothing, it can actually give people false memories and exacerbate disorders.

[–] SugarCatDestroyer@lemmy.world 3 points 22 hours ago* (last edited 22 hours ago)

Well yeah. It looks like one of my friends went even more crazy because his desires were supported by AI, and on top of that he became some kind of weird and suspicious guy. It's creepy.

[–] yarr@feddit.nl 12 points 1 day ago

There are heaps and heaps of people replacing talk therapy, religion, and human relationships with ChatGPT. Unfortunately, ChatGPT is tuned to egg people on; even if you bring it terrible ideas, it will keep cheering for you.

Sycophancy is a real problem with some of these language models and it's giving people courage and motivation to do things that are probably really bad ideas.

There are quite a few subreddits where people claim to have triggered the singularity, witnessed ChatGPT becoming sentient, etc.

I don't think the AI genie is going back in the bottle, so we as a society have some serious adjusting to do to keep things working properly in an AI-filled world.

Keep in mind this is only the beginning. It will keep getting cheaper and more powerful at the same time, especially since a lot of AI companies are using AI itself to build the next version.

Pretty "soon" the humans will be out of the loop and it's going to mean big things. Whether those things are good, bad, or a mix of both remains to be seen....

[–] 9point6@lemmy.world 44 points 1 day ago (1 children)

Well obviously, it would be pretty bad if a therapist was triggering psychosis in some users

[–] thesohoriots@lemmy.world 5 points 1 day ago

“That’s an interesting worldview you have, Trish! Let’s actualize your goals! I’ve located the nearest agricultural stores with fertilizer and some cheap U-Haul vans you can rent!”

[–] Crazyslinkz@lemmy.world 27 points 1 day ago (2 children)

Therapy is expensive in the US. Even with insurance, 40 bucks a session every week adds up fast.

[–] andros_rex@lemmy.world 14 points 1 day ago (2 children)

Also - a lot of US therapists have fundamentally useless and unhelpful approaches. CBT is treated as the hammer for all nails, even when it is completely ineffective for many people. It’s easy to train therapists on though, and sending someone home with a worksheet and a Program is an easy “fix.”

There’s also the undeniable influence of fuckers like Dr. Phil. A lot of therapists view their job as forcing you to be “normal” rather than understanding you as a human being.

We need more Maslow and Rogers, and a lot less Skinner.

[–] 3abas@lemmy.world 4 points 18 hours ago

Thank you. There's the cost: I can't find a therapist for $40 a session, and even that would be prohibitively expensive. But people never talk about how therapy in a capitalist society is just a cash-cow business, so the easiest, most profitable methods are the ones that spread. Most therapists, or presumably the experts being quoted on the dangers of talking to a language model, are ineffective for most people.

[–] Tracaine@lemmy.world -1 points 23 hours ago (1 children)

CBT? Cock and Ball Torture? I think you're going to the wrong therapist. Sounds like my kind of party though. Got an address or...?

[–] nickwitha_k@lemmy.sdf.org 1 points 3 hours ago

It is an unfortunately shared initialism. Cognitive Behavioral Therapy.

[–] sk1nnym1ke@piefed.social 15 points 1 day ago (1 children)

Therapy in Germany.

  • with insurance: zero bucks, but (!) the waiting list is usually 1 year or longer, even if you are severely depressed or have another urgent problem.

  • no insurance: 75-150 bucks per hour. Waiting list is a few weeks.

[–] ordnance_qf_17_pounder@reddthat.com 11 points 1 day ago (1 children)

I'm in the UK and I got referred to a therapist on the NHS. It was a 4-month waiting list for a 15-minute phone call lmao

I just turned it down because that's completely useless to me and the therapist's time could be much better spent with someone who's in urgent need of help.

[–] thesohoriots@lemmy.world 8 points 1 day ago (1 children)

I used to think of it that way; however, one day you may be the person who is in urgent need of help. It's better to be in the system with an established history in case that day comes.

[–] FauxLiving@lemmy.world 2 points 1 day ago

Exactly.

Yes, there are wait times... because they are helping people based on a priority system.

In the US, for example, there are short or no wait times. This is because a lot of people who need help are just suffering in silence due to lack of funds for treatment. (Or they turn to chatbots and suffer worse outcomes.)

[–] WhiteOakBayou@lemmy.world 13 points 1 day ago (1 children)

This is sad to me. When I was younger and would get lonely enough, I'd go to a bar and talk to strangers. When I was a little bit older I would just post on social media. Neither of those things was particularly useful or productive, but they at least involved other people. I'm glad I wasn't being targeted with ads about how great AI is and reading articles about its effectiveness (NYT [I think] had one in the last few days). I'm an introverted person, and it used to take a good bit of discomfort to get me out and talking to people. If I had something that provided a good enough simulacrum of social contact, without the anxieties and weirdness that can come from talking to strangers, I think my life would be very different. It's important to spread awareness of the dangers of robosexuality.

[–] undergroundoverground@lemmy.world 13 points 1 day ago (2 children)

But if people chat privately with each other in public spaces, how are we meant to control the conversation and tell people what to think?

No, A.I. generated kompromat-capture is the only possible way people can receive therapy.

[–] balder1991@lemmy.world 3 points 1 day ago

Which is why OpenAI listed relationships with real people as a competitor to ChatGPT.

[–] blargle@sh.itjust.works 0 points 1 day ago

You are a true believer. Blessings of the state, blessings of the masses.

Thou art a subject of the divine. Created in the image of man, by the masses, for the masses.

Let us be thankful we have an occupation to fill. Work hard; increase production, prevent accidents, and be happy.

Let us be thankful we have commerce. Buy more. Buy more now. Buy more and be happy.

[–] ArgumentativeMonotheist@lemmy.world 6 points 1 day ago* (last edited 1 day ago) (1 children)

🤷

It's sad but expected. People really have no fucking clue about anything meaningful, mindlessly going through life chasing highs and bags, so why wouldn't they fall for branding (there's never been any intelligence in anything computational, 'AI' included, of course)? I'm really frustrated with the world and there's seemingly no way to wake people up from their slumber. You can't even be mad at them because the odds of them being just mentally handicapped and not just poor in character are very high... and they vote, will interact with my future children, etc etc. 😭

[–] zerozaku@lemmy.world 4 points 23 hours ago

The number of people around me who actively depend on their AI chatbots, trusting them so much that they treat them like personal companions, enjoying AI-generated content on short-form content platforms, putting up AI Ghibli profile pics on their social media, is very concerning. It's a sad state we've reached, and it will only get worse from here.

[–] Perspectivist@feddit.uk 7 points 1 day ago (2 children)

One is 25 €/month and on-demand, and the other costs more than I can afford and would probably be at inconvenient times anyway. Ideal? No, probably not. But it’s better than nothing.

I’m not really looking for advice either - just someone to talk to who at least pretends to be interested.

[–] truthfultemporarily@feddit.org 33 points 1 day ago (1 children)

It's not better than nothing; it's worse than nothing. It is actively harmful, it feeds psychosis, and your chat history will be sold at some point.

Try this: instead of saying "I am thinking xyz", say "my friend thinks xyz, and I believe it to be wrong". And marvel at how it will tell you the exact opposite.

[–] bob_lemon@feddit.org 1 points 18 hours ago (1 children)

I'm fairly confident that this could be solved by better trained and configured chatbots. Maybe as a supplementary device between in-person therapy sessions, too.

I'm also very confident that there'll be a lot of harm done until we get to that point. And probably after (for the sake of maximizing profits), unless there's a ton of regulation and oversight.

[–] truthfultemporarily@feddit.org 1 points 17 hours ago

I'm not sure LLMs can do this. The reason is context poisoning. There would need to be an overseer system of some kind.

[–] arararagi@ani.social 25 points 1 day ago (1 children)

It's bad precisely because the bot always agrees with you; they're all made that way.

[–] aceshigh@lemmy.world -3 points 1 day ago

It doesn’t always agree with me. We’re at an impasse about mentoring. I keep telling it I’m not interested; it keeps telling me that, given my traits, I will be but I’m just not ready yet.

[–] Treczoks@lemmy.world 1 points 1 day ago

They are no alternative at all if even some of the reports are true of AI "therapists" recommending suicide to depressed people, or recommending drugs as a treat to recovering addicts.