this post was submitted on 19 Feb 2026
382 points (94.6% liked)

Showerthoughts


To go deeper: some animals react with curiosity, others with fear, but only a few understand what the mirror does and use it to inspect themselves.

[–] CovfefeKills@lemmy.world 2 points 10 hours ago

Deep thoughts

[–] Horsecook@sh.itjust.works 13 points 16 hours ago (3 children)

There’s been an extensive marketing campaign to convince people that LLMs are intelligent. I wouldn’t call someone a subhuman for assuming there is some truth to that.

Of those who understand what an LLM is, I think you can divide them into two groups: the honest and the dishonest. Honest people see no use in a bullshit generator, a lying machine; they see it as a perversion of technology. Dishonest people have no such objection. They might even truly see intelligence in the machine, since its outputs don't differ substantially from their own. If you view language as a means to get what you want, rather than a means to convey factual information, then lying is acceptable, desirable, intelligent. It would be difficult for such a person to differentiate between coherent but meaningless bullshit and a machine with agency making false statements to pursue its own goals.

[–] certified_expert@lemmy.world 4 points 9 hours ago (1 children)

I disagree with the dichotomy. I think you can (1) understand what LLMs actually are and (2) see the value of such technology, in both cases being factual (not being deceived) and not malicious (not attempting to deceive others).

I think a reasonable use of these tools is as a "sidekick" (you being the main character). Some tasks can be assigned to it so you save some time, but the thinking and the actual mental model of what is being done shall always be your responsibility.

For example, LLMs are good as an interface for quickly looking things up in manuals and books, clarifying specific concepts, or finding the proper terms for a vague idea (so that you can research the topic using the appropriate terms).

Of course, this is just an opinion. 100% open to discussion.

[–] BanMe@lemmy.world 1 points 44 minutes ago

I think of it like a nonhuman character, like a character in a book I'm reading. Is it real? No. Is it compelling? Yes. Do I know exactly what it'll do next? No. Is it serving a purpose in my life? Yes.

It effectively attends to my requests and even feelings but I do not reciprocate that. I've got decades of sci-fi leading me up to this point, the idea of interacting with humanoid robots or AI has been around since my childhood, but it's never involved attending to the machine's feelings or needs.

We need to sort out the boundaries on this, like the delusional people who are having "relationships" with AI and getting a social or other emotional fix from it. But that doesn't mean we have to categorize everyone who uses it as moronic. It's a tool.

[–] naught101@lemmy.world 1 points 11 hours ago

Marketing is a valid use for AI (because bullshit was always the word anyway)

[–] Etterra@discuss.online 3 points 15 hours ago

Wait, let's hear OP out.

[–] minnow@lemmy.world 61 points 22 hours ago (1 children)

The mirror test is frequently cited as a means of testing self-awareness.

OP I think you hit the nail on the head.

[–] Aerosol3215@piefed.ca 7 points 17 hours ago (1 children)

Based on the fact that most people don't see their interaction with the LLM as gazing into the mirror, am I being led to believe that most people are not sentient???

[–] Zorque@lemmy.world 12 points 17 hours ago

Based entirely on the opinions of people on niche social media platforms, yes.

[–] schnurrito@discuss.tchncs.de 46 points 21 hours ago (2 children)

Except it's not my reflection, it's a reflection of millions if not billions of humans.

[–] Carnelian@lemmy.world 24 points 20 hours ago

Except it’s not their reflection, it’s a string of phrases presented to you based partly on the commonality of similar phrases appearing next to one another in the training data, and partly on mysterious black box modifications! Fun!
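The "phrases appearing next to one another" idea can be sketched, very loosely, as a toy bigram model. This is my own illustration, not how real LLMs work (they use learned parameters and attention over dense representations, not raw co-occurrence tables), but it shows what "continuing text based on what followed what in the training data" means in the simplest possible form:

```python
import random
from collections import defaultdict

# Tiny "training corpus": the model only ever learns which word
# followed which word here.
corpus = "the mirror shows the face and the face shows the mirror".split()

# Count bigram co-occurrences: counts[prev][nxt] = how often
# `nxt` immediately followed `prev`.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word(word):
    # Sample the next word in proportion to how often it
    # followed `word` in the corpus.
    followers = counts[word]
    words = list(followers)
    weights = [followers[w] for w in words]
    return random.choices(words, weights=weights)[0]

def generate(start, n=5):
    # Chain next-word samples to produce a "continuation".
    out = [start]
    for _ in range(n):
        out.append(next_word(out[-1]))
    return " ".join(out)

print(generate("the"))  # prints a sampled continuation of "the"
```

Every output is locally plausible (each adjacent pair really did occur in the training text) while carrying no intent or meaning, which is the point being made above, just at a vastly smaller scale and without the "mysterious black box modifications".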

[–] ameancow@lemmy.world 0 points 18 hours ago* (last edited 16 hours ago)

I like to describe it as a "force multiplier" along the lines of a powered suit.

You are putting in small inputs, and they're echoing out into a vast, vast virtual space, being compared and connected with countless billions of possible associations. What you get back is a kind of amplification of what you put in. If you make even remotely leading suggestions in your question or prompt, that tiny suggestion is also going to get massively boosted in the background; this is part of why some LLMs can go off the rails with some users. If you don't take care with what exactly you're putting in, you will get wildly unexpected results.

also, it's devil tech so there's that.

[–] callyral@pawb.social 17 points 18 hours ago (1 children)

Related: is there a name for "question bias"?

Like asking ChatGPT "Is x good?" and it replies "Yes, x is good," but if you ask "Is x bad?" it replies "Yes, x is bad, you're right."

[–] TheOctonaut@mander.xyz 16 points 18 hours ago (1 children)

It's just a leading question.

[–] yeahiknow3@lemmy.dbzer0.com 6 points 17 hours ago* (last edited 17 hours ago)

It is not a leading question. The answer just happens to be meaningless.

Asking whether something is good is the vast majority of human concern. Most of our rational activity is fundamentally evaluative.

[–] Lost_My_Mind@lemmy.world 7 points 17 hours ago (1 children)

Huh.....so what you're saying is that mirrors are actually AI.

THAT MAKES A LOT OF SENSE!!! EVERYBODY COVER YOUR MIRRORS!!!

Laughs in vampire

[–] truthfultemporarily@feddit.org 19 points 21 hours ago

Just think about the fact that LLMs are basically trying to simulate Reddit posts, and then think again about using them.

[–] ameancow@lemmy.world 8 points 18 hours ago

Not nearly enough people understand this about our current models of AI. Even people who think they understand AI don't understand this, usually because they have been talking to themselves a lot without realizing it.

[–] Sunschein@piefed.social 6 points 18 hours ago (1 children)
[–] SchwertImStein@lemmy.dbzer0.com 3 points 17 hours ago (1 children)
[–] Sunschein@piefed.social 1 points 58 minutes ago

Yeah. Figured it was a good visual representation of seeing an AI version of ourselves in a mirror.

[–] GuyIncognito@lemmy.ca 11 points 21 hours ago (1 children)

I checked with that other gorilla who lives in the bathroom and he says you're wrong

[–] certified_expert@lemmy.world 4 points 19 hours ago (1 children)

lol, is that the same gorilla you see in other bathrooms? Or (like me) do you meet a new gorilla every time you wash your hands?

[–] GuyIncognito@lemmy.ca 5 points 19 hours ago

I think he's the same guy. I used to try to bust him up but he just kept multiplying into more pieces and then coming back whole every time I saw a new mirror, so I eventually gave up

[–] lowspeedchase@lemmy.dbzer0.com 11 points 22 hours ago (2 children)

This is a great one - although I never see animals worshipping the mirror.

[–] Rippin_Farts_And_Or_Breaking_Hearts@lemmy.org 7 points 21 hours ago (3 children)

I've got a duck that prefers to dance in front of a chrome bumper or glass door where he can see his reflection rather than go after any potential mates. Possibly he's worshipping the mirror. Possibly he's just really vain.

[–] gravitas_deficiency@sh.itjust.works 4 points 19 hours ago (1 children)

Sounds like he’s ducking handsome

[–] Rippin_Farts_And_Or_Breaking_Hearts@lemmy.org 3 points 17 hours ago* (last edited 17 hours ago)

He is actually. When he washes himself he's blinding white. And when he dances he gets a little feather pompadour on the top of his head.

[–] lowspeedchase@lemmy.dbzer0.com 5 points 21 hours ago

Nothing wrong with a handsome duck taking a little self affirmation time - he knows his value, he can't look away.

[–] gravitas_deficiency@sh.itjust.works 2 points 19 hours ago (3 children)

Or forming romantic attachments to the mirror

[–] Wilco@lemmy.zip 3 points 17 hours ago

Uhmm ... you never had a pet bird, I'm guessing?

Seeing a bird masturbate up against a mirror is just par for the course when you have pet birds. It's gonna be either a mirror, a favorite toy ... or you.

[–] ameancow@lemmy.world 3 points 18 hours ago* (last edited 18 hours ago)

Animals aren't cursed with the human ability to think our way into harmful and unproductive behavior through conscious re-interpretation of the information around us. Except for the occasional zoo animal in captivity that falls in love with an inanimate object.

Something something about our species basically being in captivity.

[–] lowspeedchase@lemmy.dbzer0.com 1 points 18 hours ago

Ooofff... Good call

[–] Abyssian@lemmy.world -1 points 10 hours ago (2 children)

Except when you leave several LLMs able to communicate with one another, they will, on their own and with no instructions, develop shared behaviors, including creating their own unique social norms.

https://neurosciencenews.com/ai-llm-social-norms-28928/

[–] certified_expert@lemmy.world 7 points 8 hours ago

This is nothing other than the reflection I am talking about. It is not a reflection of you, the person chatting with the bot, but an "average" reflection of what humanity has expressed in the data LLMs were trained on.

If a mirror is placed in front of another mirror, the "infinite tunnel" only exists in the mind of the observer.

[–] lost_faith@lemmy.ca 2 points 19 hours ago

And here I am practising my smile in the mirror (like that golden retriever)

[–] lemmie689@lemmy.sdf.org 2 points 21 hours ago

My dog doesn't pay any attention to mirrors, or LLMs.

[–] Supervisor194@lemmy.world 1 points 18 hours ago (1 children)

False. My reflection can't tell me that pressing the Steam button and X will bring up the keyboard on Steam Deck's desktop mode.

[–] lennee@lemmy.world 5 points 18 hours ago

pressing and holding the steam button tells u every steam shortcut

[–] woop_woop@lemmy.world 2 points 21 hours ago (1 children)

If I understand your statement correctly, only the most intelligent creatures would understand that LLMs are themselves?

[–] certified_expert@lemmy.world 1 points 19 hours ago

But they are a reflection of ourselves. If you look at the algebra and stats underneath, you'll realize that they spit out whatever is in us.

[–] flandish@lemmy.world 1 points 21 hours ago (1 children)

The "mirror stage" by Lacan is worth looking into here, but no, I don't think humans automatically think the LLM is a sort of reified other. As we get past the uncanny valley, though, and into generations growing up with entire personal histories in an LLM, I can absolutely see that happening.

[–] Rhaedas@fedia.io 4 points 21 hours ago (1 children)

I think most people using them know that there's not something there, and yet when using it they'll still act as if there is, even giving its outputs the benefit of the doubt, as if they came from another person, maybe even one who knows more than they do. So it's a gray area of "not believing".

[–] SuDmit@lemmy.blahaj.zone 1 points 12 hours ago

Continuing on "there's not something there": what about a theater analogy for LLMs? People usually know it's all an act, but if the actors play it well enough, you get immersed. And its focus is not the truth; it tells a story.

[–] CIA_chatbot@lemmy.world 1 points 21 hours ago (1 children)

I find this kind of Anti AI Sentience bigotry horrible!

[–] certified_expert@lemmy.world 1 points 19 hours ago (1 children)

Interesting take. Could you elaborate?

My post comes from studying the algebra and stats that enable LLMs (well, part of it; I'm not done with "attention" yet).

[–] CIA_chatbot@lemmy.world 1 points 19 hours ago

I was making a joke based on my username