~~In the dead of night, knocking on the door~~
zogwarg
My feeling has become that I prefer the business-executive empty to the LLM empty; at least the first one usually expresses a personality. It's never entirely empty.
Screaming into the void at Chuunibyou (wiki) Eliezer: YOU ARE NOT A NOVEL CHARACTER, THINKING OF WHAT BENEFITS THE NOVELIST vs THE CHARACTER HAS NO BEARING ON REAL LIFE.
Sorry for yelling.
Minor notes:
But thinks I should say it, so I will say it. [...] asked me to speak them anyways, so I will.
It's quite petty of Yud to be so passive-aggressive towards his employee who insisted he at least try to discuss coping, name-dropping him not once but twice (although that is also likely just poor editing).
"How are you coping with the end of the world?" [...Blah...Blah...Spiel about going mad tropes...]
Yud, when journalists ask you "How are you coping?", they don't expect you to be "going mad facing the apocalypse"; that is YOUR poor imagination as a writer/empathetic person. They expect you to answer how you are managing your emotions and your stress, or barring that, to give a message of hope or of some desperation. They are trying to engage with you as a real human being, not as a novel character.
Alternatively, it's also a question to gauge how full of shit you may be (by gauging how emotionally invested you are).
The trope of somebody going insane as the world ends, does not appeal to me as an author, including in my role as the author of my own life. It seems obvious, cliche, predictable, and contrary to the ideals of writing intelligent characters. Nothing about it seems fresh or interesting. It doesn't tempt me to write, and it doesn't tempt me to be.
Emotional turmoil and how characters cope, or fail to cope, makes excellent literature! That all you can think of is "going mad" reflects only your poor imagination as both a writer and a reader.
I predict, because to them I am the subject of the story and it has not occurred to them that there's a whole planet out there too to be the story-subject.
This is only true if they actually accept the premise of what you are trying to sell them.
[...] I was rolling my eyes about how they'd now found a new way of being the story's subject.
That is deeply ironic, coming from someone who makes choices based on being the main character of a novel.
Besides being a thing I can just decide, my decision to stay sane is also something that I implement by not writing an expectation of future insanity into my internal script / pseudo-predictive sort-of-world-model that instead connects to motor output.
If you are truly doing this, I would say that means you are expecting insanity wayyyyy too much. (Also: psychobabble.)
[...Too painful to actually quote psychobabble about getting out of bed in the morning...]
In which Yud goes into in-depth, self-aggrandizing, nonsensical detail about a very mundane trick for getting out of bed in the morning.
A fairly good and nuanced guide. No magic silver-bullet shibboleths for us.
I particularly like this section:
Consequently, the LLM tends to omit specific, unusual, nuanced facts (which are statistically rare) and replace them with more generic, positive descriptions (which are statistically common). Thus the highly specific "inventor of the first train-coupling device" might become "a revolutionary titan of industry." It is like shouting louder and louder that a portrait shows a uniquely important person, while the portrait itself is fading from a sharp photograph into a blurry, generic sketch. The subject becomes simultaneously less specific and more exaggerated.
I think it's an excellent summary, and it connects with the "Barnum effect" of LLMs, which makes them appear smarter than they are. And it's not the presence of certain words, but the absence of certain others (and, well, of content) that is a good indicator of LLM-extruded garbage.
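To make that "absence of specifics" point concrete, here is a toy sketch of my own (not from the linked guide): a crude scorer that counts generic hype words against rough proxies for concrete detail (digits, mid-sentence capitalised words). The word list and the proxies are made up purely for illustration, not a real detector.

```python
# Toy "specificity meter": hype words count against the text, concrete cues
# (numbers, mid-sentence capitalised words) count for it. Illustration only.
import re

GENERIC_HYPE = {
    "revolutionary", "visionary", "titan", "groundbreaking",
    "transformative", "legendary", "iconic", "pioneering",
}

def specificity_score(text: str) -> float:
    """Return (concrete cues - hype cues) per 100 words; lower = more slop-like."""
    words = re.findall(r"[A-Za-z0-9'-]+", text)
    if not words:
        return 0.0
    hype = sum(w.lower() in GENERIC_HYPE for w in words)
    numbers = sum(any(c.isdigit() for c in w) for w in words)
    proper = sum(w[0].isupper() for w in words[1:])  # skip sentence-initial word
    return 100.0 * (numbers + proper - hype) / len(words)

print(specificity_score("He patented the first train-coupling device in 1873."))
print(specificity_score("He was a revolutionary, visionary titan of industry."))
```

The first sentence scores positive, the second negative: the "fading portrait" in the quoted section, rendered as arithmetic.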
I guess my P(Doom|Bathroom) should have been higher.
Pessimistically, I think this scourge will be with us for as long as there are people willing to put code "that mostly works" into production. It won't be making decisions, but we'll get a new faucet of poor-code sludge to enjoy and repair.
In French, ChatGPT sounds like « Chatte, j'ai pété » meaning "Pussy, I farted".
The power, of words:
Is all but naught, if not heard.
And a bot, cannot.
Of course! It's to know less and less, until truly, the only thing they know is that they know nothing.
It's clearly meant to mean /HalleluJah
To be fair though, it's not just their brains turning to mush; Google has genuinely been getting worse too.
A get-smart-quick scheme, with an absurd amount of occult cargo-culting of scientific discourse.