
Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many "esoteric" right-wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality-challenged "culture critics" who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[–] Soyweiser@awful.systems 4 points 2 hours ago (4 children)

Was reading some science fiction from the 90s and the AI/AGI said 'im an analog computer, just like you, im actually really bad at math.' And I wonder how much damage one of these ideas did (the other being that there are computer types that can do more/different things; not sure if analog Turing machines provide any capabilities that digital TMs don't, but I'll leave that question to the people smarter than me in theoretical computer science).

The idea that a smart computer will be worse at math (which makes sense from a storytelling perspective, because a smart AI that can also do math super well is gonna be hard to write) now leads people who've read enough science fiction to look at a machine that can't count or run Doom and go 'this is what they predicted!'.

Not a sneer just a random thought.

[–] froztbyte@awful.systems 2 points 1 hour ago* (last edited 1 hour ago)

this is one of those things that's, in a narrative sense, a great way to tell a story, while being completely untethered from fact/reality. and that's fine! stories have no obligation to be based in fact!

to put forward a very mild armchair analysis: it's playing on the definition of the conceptual "smart" computer, as it relates to human experience. there are a couple of other things in recent history I can think of that hit similar or related notes (M3GAN, the whole "omg the AI tricked us (and then the different species with a different neurotype and capability noticed it!)" arc in ST:DIS, the last few Mission Impossible films, etc). it's one of those ways in which art and stories tend to express "grappling with $x to make sense of it"

The idea that a smart computer will be worse at math (which makes sense from a storytelling perspective, because a smart AI that can also do math super well is gonna be hard to write)

personally speaking, the thing I find most jarring is when the fantastical vastly outweighs everything else purely for narrative reasons - so much so that it breaks the fourth wall for me in terms of what the story means to convey. I reflect on this somewhat regularly, as it's a rather cursed rabbithole I fall into repeatedly: "is it my knowledge of this domain that's spoiling my enjoyment of this thing, or is the story simply badly written?" is the question that comes up, and answering it is surprisingly varied and complicated

on the whole I think it's best to keep in mind that scifi is often both an exploration and a pressure valve, but it's also worth keeping an eye on how much it's a pressure valve. too much of the latter, and something(tm) is up

[–] swlabr@awful.systems 4 points 1 hour ago* (last edited 1 hour ago) (1 children)

AGI: I'm not a superintelligence, I'm you.

[–] Soyweiser@awful.systems 3 points 1 hour ago

My 'I'm not a witch' shirt ...

[–] BlueMonday1984@awful.systems 2 points 1 hour ago (1 children)

This isn't an idea I'd heard of until you mentioned it, so it likely hasn't got much purchase in the public consciousness. (Intuitively speaking, a computer which sucks at maths isn't a good computer, let alone AGI material.)

[–] Soyweiser@awful.systems 1 points 1 hour ago* (last edited 1 hour ago)

Yeah, I was also just wondering, as obv what I read is not really typical of what the general public reads. Can't think of any place where this idea spread in non-written science fiction, for example, with one exception being C-3PO's predictions, which always seem to be wrong. But he's intended as a comedic sidekick. (Him being wrong can also be read as just the lack of value in calculating odds like that, esp in a universe with The Force.)

But yes, not likely to be a big thing indeed.