The only thing that is allowed to tell good art from slop is the AI, which needs to consume good art and not slop.
diz
This is what peak altruism looks like: being a lazy fuck with a cult, and incidentally happening to help hype up investments into the very unfriendly AI you're supposed to save the world from. All while being too lazy to learn anything about any actual AI technologies.
In all seriousness, all of his stuff is just extreme narcissism. Altruism is good, therefore he's the most altruistic person in the world. Smart is good, therefore he's the mostest smartest person. The whole cult can be derived entirely from such self-serving axioms.
It's spelled “masterdebating”.
Ironically, in a videogame someone like Musk would always be at most an NPC, and possibly not even that (just a set of old newspaper clippings / terminal entries in Fallout / etc). Yudkowsky would be just a background story explaining some fucked up cult.
This is because they are, ultimately, uninteresting to simulate - their lives are well documented and devoid of any genuine challenge (they get things by selection bias rather than any effort - simulating them is like simulating a lottery winner rather than the lottery). They exist to set up the scene for something interesting.
I think the question of "general intelligence" is kind of a red herring. Evolution for example creates extremely complex organisms and behaviors, all without any "general intelligence" working towards some overarching goal.
The other issue with Yudkowsky is that he's an unimaginative fool whose only source of insights on the topic is science fiction, which he doesn't even understand. There is no fun in having Skynet start a nuclear war and then itself perish in the aftermath, as the power plants it depends on cease working.
Humanity itself doesn't possess the kind of intelligence envisioned for "AGI". When it comes to science and technology, we are an all-powerful hivemind. When it comes to deciding what to do with said science and technology, we are no more intelligent than an amoeba crawling along a gradient.
I don't think the quantum hype has much to do with quantum mechanics. It is a people phenomenon.
In times past, people who understood that stuff would be comfortably living the American dream, not pursuing grifts. There would be a relatively sharp distinction between grifters and non-grifters.
With increased social stratification that time is long gone; unless you're part of the 0.01%, however qualified you are, you won't feel financially secure enough not to go along with the flow set by the money guys. And they are a lot less interested in listening; they are important people, and the pay gap between them and a physicist is larger than the gap between a CEO and a part-time cleaner used to be.
The money guys, on the other hand, believe they can just make things happen whenever they want by pouring money at them, and do not believe details are important. In a sense they are right, because a lot of them do profit off pouring money at things that can't ultimately pan out but can still be bought by a large corporation using other people's money (then the CEO of said large corporation goes on to run their own startup).
Then also, the time of rapid growth for the software and electronics industry was obviously coming to a close, but nobody with money has any other ideas, so they will push it as far as they can. That's what drives the hype bubbles.
To argue by analogy, it’s not like getting an artificial feather exactly right was ever a bottleneck to developing air travel once we got the basics of aerodynamics down.
I suspect that "artificial intelligence" may be a bit more like making an artificial bird that self replicates, with computers and AI as it exists now being somewhere in-between thrown rocks and gliders.
We only ever "beat" biology by cheating: removing the core requirement of self-replication. An airplane factory that had to scavenge for all the rare elements involved in making a turbine would never fly. We have never actually beaten biology. A supersonic aircraft may be closer to a rock thrown off a cliff than to surpassing biology.
That "cheat code" shouldn't be expected to apply to Skynet or ASI or whatever, because Skynet is presumably capable of self-replication. It would be pretty odd if "ASI" were the first thing on which we actually beat biology.
The thing about the "synapses" argument is that the hype crowd argues the AI could wind up doing something much more effective than whatever-it-is-that-real-brains-do.
If you look at capabilities, however, it is inarguable that "artificial neurons" seem intrinsically a lot less effective than real ones, if we consider small animals (like e.g. a jumping spider or a bee, or even a roundworm).
It is a rather unusual situation. When it comes to things like e.g. converting chemical energy to mechanical energy, we did not have to fully understand and copy muscles to be able to build a steam engine that has higher mechanical power output than you could get out of an elephant. That was the case for arithmetic, too, and hence there was this expectation of imminent AI in the 1960s.
I think it boils down to intelligence being a very specific thing evolved for a specific purpose, less like "moving underwater from point A to point B" (which a submarine does pretty well) and more like "fish doing what fish do". The submarine represents very little progress towards fishiness.
Hyping up AI is bad, so it's alright to call someone a promptfondler for fondling prompts.
I mostly see "clanker" in reference to products of particularly asinine promptfondling: spambot "agents" that post and even respond to comments, LLM-based scam calls, call center replacement, etc.
These bots don't derive their wrongness from the wrongness of promptfondling; they are part of why promptfondling is wrong.
Doesn’t “clanker” come from some Star Wars thing where it's used like a racial slur against robots, who are basically sapient beings with feelings within that fiction? I assume the writers wanted to portray the robots as unfairly oppressed, while simultaneously not trivializing actual oppression of actual people (the way "wireback" would have, or I dunno, "cogger" or something). Being based on “cracker” would be alright, but the way I see it used is mostly white people LARPing a time and place when they could say the N-word with impunity.
Well yeah that would indeed be racist.
I’m seeing a lot of people basically going “I hate naggers, these naggers are ruining the neighborhood, go to the back of the bus nagger, let’s go lynch that nagger” and thinking that’s funny because haha it’s not the bad word technically.
That just seems like an instance of good ol' anti-person racism / people trying to offend other people while not particularly giving a shit about the bots one way or the other.
we should recognize the difference
The what now? You don't think there's a lot of homophobia that follows the "castigating someone for what they do" format, or you think it's a lot less bad according to some siskinded definition of what makes slurs bad, one that somehow manages to completely ignore anything that actually makes slurs bad?
I think that’s the difference between “promptfondler” and “clanker”. The latter is clearly inspired by bigoted slurs.
Such as... "cracker"? Given how the law protects but doesn't bind AI, that seems oddly spot on.
Note also that genuine labor-saving stuff, like say the Unity engine with the Unity asset store, did result in an absolute flood of shovelware on Steam back in the mid-2010s (although that probably had as much to do with Steam FOMO-ing about the possibility of not letting the next Minecraft onto Steam).
As a thought experiment, imagine an unreliable labor-saving tool that speeds up half* of the work 20x and slows down the other half 3x. You would end up 1.525 times slower: 0.5/20 + 0.5×3 = 1.525 of the original time.
The fraction of work (not by lines but by hours) that AI helps with is probably less than 50%, and the speed-up is probably worse than 20x.
Slowdown could be due to some combination of
- Trying to do it with AI until you sink too much time into that and then doing it yourself (>2x slowdown here).
- Being slower at working with the code you didn't write.
- It being much harder to debug code you didn't write.
- Plagiarism being inferior to using open source libraries.
footnote: "half" as measured by the pre-tool hours.
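The arithmetic above can be sketched out; the function name and the specific fractions/multipliers are just the hypothetical numbers from the thought experiment, not measurements:

```python
def relative_time(helped_fraction, speedup, slowdown):
    """Total time with the tool, as a multiple of the original (pre-tool) time.

    helped_fraction: share of the work (by pre-tool hours) the tool speeds up.
    speedup: factor applied to the helped share; slowdown: factor applied to the rest.
    """
    hindered_fraction = 1.0 - helped_fraction
    return helped_fraction / speedup + hindered_fraction * slowdown

# The numbers from the thought experiment: half the work 20x faster,
# the other half 3x slower.
print(relative_time(0.5, 20, 3))  # 1.525, i.e. 1.525x slower overall
```

Note that even a generous 20x speed-up on half the work is wiped out by a mere 3x slowdown on the rest, because the slow half dominates the total.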
I don't think getting rid of elections would work. Dictatorships do not rely on election rigging alone. That's just interventionist propaganda (barge in, set up elections, presto, democracy).
Competent dictators don't act anything like Trump. Once in power, they try to obtain the support of as wide a section of the population as possible. There's no freedom of speech in a dictatorship; the dictator gives prepared speeches designed to bolster his support, to unify the nation, etc, not just having fun gloating at half the nation's expense.
If he actually tries to maintain power despite his relative unpopularity, the consequences will be utterly disastrous.