this post was submitted on 27 Jul 2025
90 points (97.9% liked)

Technology

[–] Ava@lemmy.blahaj.zone 23 points 1 week ago (4 children)

I mean... Yeah, no shit. It's not therapy, and your chatbot shouldn't be pretending to offer professional services that require a license, Sam.

Let's be truthful. You don't want to have to explain or justify anything your chatbot says, and you don't want the courts to be able to analyze whether you've violated any rights or laws either.

[–] Perspectivist@feddit.uk 3 points 1 week ago* (last edited 1 week ago) (3 children)

your chatbot shouldn’t be pretending to offer professional services that require a license, Sam.

It generates natural-sounding language. That's all it's designed to do. The rest is up to the user - if a therapy session is what they ask for, then a therapy session is what they get. I don't think it should refuse that request either.

[–] hendrik@palaver.p3x.de 5 points 1 week ago

I mean, there is some valid discussion going on about whether some vulnerable people need protection. Generally I agree. But due to its nature as a yes-man, ChatGPT will feel nice and give a lot of reaffirmation, which has the potential to mess up some people real bad and send them spiralling down. So they might indeed need some form of protection. But that shouldn't take anything away from everyone else.
