this post was submitted on 07 Sep 2025
22 points (95.8% liked)

TechTakes

2163 readers
59 users here now

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago

Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many "esoteric" right-wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality-challenged "culture critics" who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[–] TinyTimmyTokyo@awful.systems 9 points 4 days ago (16 children)

Now that his new book is out, Big Yud is on the interview circuit. I hope everyone is prepared for a lot of annoying articles in the next few weeks.

Today he was on the Hard Fork podcast with Kevin Roose and Casey Newton (I haven't listened to it yet). There's also a milquetoast profile in the NYT written by Roose, where he admits his P(doom) is between 5 and 10 percent.

[–] Architeuthis@awful.systems 10 points 4 days ago* (last edited 4 days ago) (14 children)

Siskind did a review too; basically he gives it the 'their hearts are in the right place, but... [read AI2027 instead]' treatment. Then they go at it a bit with Yud in the comments, where Yud comes off as a bitter dick, but their actual disagreements are just filioque shit. Also, they both seem to agree that a worldwide moratorium on AI research, one that would give us time to breed/genetically engineer superior-brained humans to fix our shit, is the way to go.

https://www.astralcodexten.com/p/book-review-if-anyone-builds-it-everyone/comment/154920454

https://www.astralcodexten.com/p/book-review-if-anyone-builds-it-everyone/comment/154927504

Also notable that apparently Siskind thinks nuclear non-proliferation sorta worked because people talked it out and decided to be mature about it, rather than because they were scared shitless of MAD, so AI non-proliferation, presumably by appointing a rationalist Grand Inquisitor in charge of all human scientific progress, is an obvious solution.

[–] TinyTimmyTokyo@awful.systems 7 points 3 days ago (2 children)

Yud: "That's not going to asymptote to a great final answer if you just run them for longer."

Asymptote is a noun, you git. I know in the grand scheme of things this is a trivial thing to be annoyed by, but what is it with Yud's weird tendency to verbify nouns? Most rationalists seem to emulate him on this. It's like a cult signifier.

[–] zogwarg@awful.systems 6 points 3 days ago

It's also inherently begging-the-question silly: it assumes that the Ideal of Alignment™ can never be reached, only approached. (I verb nouns quite often, so I have to be pickier about what I get annoyed at.)

[–] saucerwizard@awful.systems 3 points 3 days ago (3 children)

They think Yud is a world-historical intellect (I've seen claims on twitter that he has an IQ of 190 - yeah, really) and that by emulating him a little of the old smartness can rub off on them.

[–] Soyweiser@awful.systems 3 points 3 days ago (1 children)

The normal max of an IQ test is ~160, and from what I can tell nobody tests above that, basically because it is not relevant. (And I assume testing problems and variance become too big a statistical problem at that level.) Not even sure how rare a 190 IQ would be statistically; probably laughably rare.

[–] saucerwizard@awful.systems 3 points 2 days ago (1 children)

I don’t think these people have a good handle on how stuff actually works.

[–] Soyweiser@awful.systems 1 points 2 days ago

For a snicker I looked it up: https://iqcomparisonsite.com/iqtable.aspx

One in 100 million. So he would be among the ~80 smartest people alive right now. Which includes the third world, children, the elderly, etc.

[–] Architeuthis@awful.systems 2 points 2 days ago* (last edited 2 days ago)

190IQ is when you verb asymptote to avoid saying 'almost'.

[–] fullsquare@awful.systems 1 points 2 days ago* (last edited 2 days ago) (1 children)

that's meaningless in practical terms, but just looking at the statistics, an IQ on the order of 190 would mean 1 in a billion (1e9), per the ever-reliable RationalWiki https://rationalwiki.org/wiki/High_IQ_society
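(For anyone checking the arithmetic: the two figures in this thread come from which standard deviation the IQ scale assumes — the SD 16 scale that iqcomparisonsite.com uses gives roughly 1 in 100 million, while the SD 15 Wechsler-style scale pushes 190 out to 6 sigma, roughly 1 in a billion. A quick stdlib-Python sketch, my own and not from either linked site:)

```python
from math import erfc, sqrt

def iq_rarity(iq, mean=100.0, sd=15.0):
    """Upper-tail probability of scoring `iq` or higher,
    assuming scores are normally distributed."""
    z = (iq - mean) / sd
    return 0.5 * erfc(z / sqrt(2))  # P(X >= iq) for a normal distribution

# SD 15 (Wechsler-style scale): IQ 190 is 6 sigma above the mean
p15 = iq_rarity(190, sd=15)
# SD 16 (the older scale iqcomparisonsite.com tabulates)
p16 = iq_rarity(190, sd=16)

print(f"SD 15: about 1 in {1 / p15:,.0f}")  # ~1 in a billion
print(f"SD 16: about 1 in {1 / p16:,.0f}")  # ~1 in 100 million
```

So both numbers are "right" for their respective scales; either way, far rarer than any test actually norms for.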

[–] Architeuthis@awful.systems 2 points 2 days ago

It's possible someone specifically picked the highest IQ that wouldn't need a second planet earth to make the statistics work.
