Even for people who do get email notifications of Zitron's excellent content (like me), I appreciate having a place here to discuss it.
scruiser
j/k he’s doubling down on being a dick.
The comparisons of him to sneerclub had kind of gotten my hopes up that maybe he'd be funny or incisively cutting or something, but it looks mostly like typical lesswrong pedantry, just with less of the awkward straining to be charitable (to the in-group).
Apparently Eliezer is actually against throwing around P(doom) numbers? https://www.lesswrong.com/posts/4mBaixwf4k8jk7fG4/yudkowsky-on-don-t-use-p-doom
The objections to using P(doom) are relatively reasonable by lesswrong standards... but this is in fact once again all Eliezer's fault. He started a community centered around 1) putting overconfident probability "estimates" on subjective, uncertain things and 2) the need to make a friendly AI-God; he really shouldn't be surprised that people combine the two. Also, he has regularly expressed his certainty that we are all going to die to Skynet in terms of ridiculously overconfident probabilities, so he shouldn't be surprised that other people followed suit.
Guns don't kill people, people kill people.
Lesswrong and SSC: capable of extreme steelmanning of... *checks notes*... occult mysticism (including divinatory magic), Zen-Buddhism-based cults, people who think we should end democracy and have kings instead, Richard Lynn, Charles Murray, Chris Langan, techbros creating AI they think is literally going to cause mankind's extinction...
Not capable of even a cursory glance into their statements, much less steelmanning: sneerclub, Occupy Wall Street
we can't do basic things
That's giving them too much credit! They've generated the raw material for all the marketing copy and jargon pumped out by the LLM companies producing the very thing they think will doom us all! They've served a small but crucial role in the influence farming of the likes of Peter Thiel and Elon Musk. They've served as an entry point to the alt-right pipeline!
dath ilan?
As a self-certified Eliezer understander, I can tell you dath ilan would open up a micro-prediction market on various counterfactual ban durations. Somehow this prediction market would work excellently despite a lack of liquidity and multiple layers of skewed incentives that should outweigh any money going into it. Also, Said would have been sent to a ~~reeducation camp~~ quiet city much earlier, and ~~sterilized~~ denied UBI if he reproduces, for not conforming to dath ilan's norms.
That too.
And judging by how all the elegant, charitably written blog posts on the EA forums did jack shit to stop the second Manifest conference from having even more racists, debate really doesn't help.
Yes, thanks. I always forget how many enters I need to hit.
I'm feeling an effort sneer...
For roughly equally long have I spent around one hundred hours almost every year trying to get Said Achmiz to understand and learn how to become a good LessWrong commenter by my lights.
Every time I read about a case like this my conviction grows that sneerclub's vibe based moderation is the far superior method!
The key component of making good sneer club criticism is to never actually say out loud what your problem is.
We've said it multiple times; it's just a long list that's inconvenient to say all at once. The major things that keep coming up: the cult shit (including the promise of infinite AGI-God heaven and infinite Roko's Basilisk hell, and the high-demand groups formed around said heaven/hell); the racist shit (including the eugenics shit); the pretentious shit (which I could actually tolerate if it didn't come with the other parts); and, lately, serving as crit-hype marketing for really damaging technology!
They don't need to develop protocols of communication that produce functional outcomes
Ahem... you just admitted to taking a hundred hours to ban someone, whereas dgerard and co kick out multiple troublemakers in our community within a few hours tops each. I think we are winning on this one.
For LessWrong to become a place that can't do much but to tear things down.
I've seen some outright blatant crank shit (as opposed to the crank shit that works hard to masquerade as more legitimate science) pretty highly upvoted and positively received on lesswrong (GeneSmith's wild genetic engineering fantasies come to mind).
I missed that it’s also explicitly meant as rationalist esoterica.
It turns in that direction about 20ish pages in... and then spends hundreds of pages on it, greatly inflating what could otherwise be a much more readable length. It does get back to the actual plot events after that.
I hadn't heard of MAPLE before; is it tied to lesswrong? From the focus on AI it's at least adjacent, so I'll add it to the list of cults lesswrong is responsible for. So all in all, we've got the Zizians, Leverage Research, and now MAPLE for proper cults, and stuff like Dragon Army and Michael Vassar's groupies for "high demand" groups. It really is a cult incubator.
It's a nice master post that gets all his responses and many useful articles linked into one place. It's all familiar if you've kept up with techtakes and Zitron's other posts and pivot-to-ai, but I found a few articles I had previously missed reading.
A related trend to all the "but ackshually"s AI boosters like to throw out: has everyone else noticed how someone claims to have heard a rumor about an LLM making a genuine discovery in some field of science, except it's always repeated secondhand so you can't really evaluate it, and in the rare cases they do link the source, it's always much less impressive than they made it sound at first...