Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
The post-Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Credit and/or blame to David Gerard for starting this.)
Was reading some science fiction from the 90s, and the AI/AGI said 'I'm an analog computer, just like you, I'm actually really bad at math.' And I wonder how much damage one of these ideas did (the other being that there are types of computers that can do more/different things. Not sure if analog Turing machines provide any capabilities that digital TMs don't, but I leave that question to the people smarter than me in theoretical computer science).
The idea that a smart computer will be worse at math (which makes sense from a storytelling perspective as a writer, because a smart AI who can also do math super well is going to be hard to write) now leads people who have read enough science fiction to see the machine that can't count or run Doom and go 'this is what they predicted!'.
Not a sneer just a random thought.
this is one of those things that's, in a narrative sense, a great way to tell a story, while being completely untethered from fact/reality. and that's fine! stories have no obligation to be based in fact!
to put a very mild armchair analysis about it forward: it's playing on the definition of the conceptual "smart" computer, as it relates to human experience. there's been a couple of other things in recent history that I can think of that hit similar or related notes (M3GAN, the whole "omg the AI tricked us (and then the different species with a different neurotype and capability noticed it!)" arc in ST:DIS, the last few Mission Impossible films, etc). it's one of those ways in which art and stories tend to express "grappling with $x to make sense of it"
personally speaking, one of the things I find most jarring is when the fantastical vastly outweighs anything else purely for narrative reasons - so much so that it's a 4th-wall break for me in terms of what the story means to convey. I reflect on this somewhat regularly, as it's a rather cursed rabbithole that comes up repeatedly: "is it my knowledge of this domain that's spoiling my enjoyment of this thing, or is the story simply badly written?" is the question that arises, and it's surprisingly varied and complicated in its answering
on the whole I think it's often good/best to keep in mind that scifi is often an exploration and a pressure valve, but that it's also worth keeping an eye on how much it's a pressure valve. too much of the latter, and something(tm) is up
AGI: I'm not a superintelligence, I'm you.
My im not a witch shirt ...
This isn't an idea I'd heard of until you mentioned it, so it likely hasn't got much purchase in the public consciousness. (Intuitively speaking, a computer which sucks at maths isn't a good computer, let alone AGI material.)
Yeah, I was also just wondering, as obv what I read is not really typical of the average public. Can't think of any place where this idea spread in non-written science fiction, for example, with one exception being the predictions of C-3PO, who always seems to be wrong. But he is intended as a comedic sidekick. (Him being wrong can also be read as just the lack of value in calculating odds like that, esp in a universe with The Force.)
But yes, not likely to be a big thing indeed.
@Soyweiser @BlueMonday1984 seems plausible tbh