this post was submitted on 28 Mar 2026
375 points (97.2% liked)

Technology

83295 readers
4581 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] atrielienz@lemmy.world 3 points 3 days ago (1 children)

Think about the people you willingly surround yourself with. Then think about how often they agree with the things you think and say.

As the saying goes "I'm sure there's someone out there who believes the exact opposite of everything I believe, and while I'm sure they aren't a complete idiot..."

Everyone is susceptible to the feedback loop. Everyone can fall victim to the seduction of an echo chamber. While not everyone would ignore the red flag that this thing is a machine/digital algorithm rather than a person or a sentient/sapient being, it's not hard to see how we got here. Echo chambers exist all over the internet. The difference is that most of them have some voices of dissent. An LLM doesn't offer that. They keep trying to add it in, but it's basically antithetical to the design.

When you add the fact that making it addictive benefits their bottom line, it's pretty obvious that they're trying to walk the line between being regulated by the government and making their product as popular as possible.

I don't think they really knew it would have this exact effect. But I do think they plan to take advantage of it now that they know, and I don't think we humans are all going to be able to resist the temptation of an automated propaganda machine.

This is especially true because mental health care and healthcare in this country have been failing for decades, and even people who "don't have mental health problems" aren't magically mentally healthy; they just don't know the status of their mental health. A whole lot of people, in the US especially, are mentally ill or facing neurological medical problems that they don't know about.

[–] Kuma@lemmy.world 2 points 2 days ago

Sounds to me like it's mostly luck whether you fall into that hole or not, or that a lot of people would rather believe in something even though they know it isn't true or the chance is extremely low, like trying to win the lottery.

I've never met ppl irl who see LLMs as more than a digital tool that can be wrong (at least not to my knowledge), so it's hard for me to understand, because I haven't been able to ask. I understand it can be nice to be heard, but to me an LLM is very hollow: there is no experience behind its answers, and you can tell it doesn't care or try to understand (which is also why I don't understand the attachment). I actually get more frustrated than happy when it says empty stuff like "you've got good instincts!", doesn't challenge my decisions or statements at all (even when I ask it to), or gives me uninspired results when I ask for inspiration (its creativity is extremely lacking). I feel the same about ppl who aren't trying to understand and just give me empty replies, like a salesperson reading from a script.

So that's mostly why it's hard for me to understand, even though I know mental health and loneliness are a big part of it. I still don't understand why people can feel attached to LLMs and go so far for/with them. Echo chambers with actual ppl are a lot more understandable; that makes sense to me. LLMs do not.