istewart

joined 1 year ago
[–] istewart@awful.systems 7 points 1 month ago (1 children)

There's maybe still a concise social history to be written of how all this crap congealed together. I'm particularly interested in the overlap between the AI doomers, the ancap libertarian weirdos who wanted to nail down their economics as capital-S Science™, and even the online poker grinders of the 2000s who aspired to become statistical-thinking robots. I hesitate to say any of this is undocumented, because the reams of posts are still out there, but a Michael Lewis-style pop history of it all would be a hoot. I understand Elizabeth Sandifer has it all well-covered from the ideological angle, and Adam Becker's new book looks good too, but having something covering it from the forum/feed-poster angle might end up being the epitaph the movement deserves.

[–] istewart@awful.systems 13 points 1 month ago

I would say that the in-group jargon is more of a retention tactic than an attraction tactic, although it can become that for people who are desperately looking for an ordered view of the world. Certainly I've seen it a lot in recovering Scientologists, expressing how that edifice of jargon, colloquialisms, and redefined words shaped their worldview and how they related to other people. In this case here, if you've been nodding along for a while and want to continue to be one of the cool guys, how could you not glomarize? Peek coolly out from beneath your fedora and neither confirm nor deny?

I will agree that the ratsphere has softer boundaries and is not particularly competently managed as a cult. As you allude to, too, there isn't a clear induction ritual or psychological turning point, just a mass of material that you're supposed to absorb and internalize over a necessarily lengthy stretch of time. Hence the most clearly identifiable cults are splinter groups.

[–] istewart@awful.systems 12 points 1 month ago (8 children)

I understand where he probably got the neologism "glomarize" from (https://en.wikipedia.org/wiki/Glomar_Explorer) but his willingness to beat you in the face with it until you accept it is a big part of what makes his writing style so off-putting. And, uh, this level of enthusiasm for specialized jargon continues to fail to overcome the cult allegations.

[–] istewart@awful.systems 5 points 1 month ago (1 children)

hmm, was groverhaus actually just an instance of convergent evolution among posters

[–] istewart@awful.systems 10 points 1 month ago (1 children)

Two cases that are a bit less clear, but still show some structural similarities: Peter Thiel, Sam Altman.

A bit less clear because these two have a bit more money to throw around, and damn would it be good if they started throwing some in our direction again

[–] istewart@awful.systems 3 points 1 month ago

Probably ought to apply real bleach should you discover one languishing nonfunctionally in the back of a Goodwill a couple years from now - the form factor invites some unsanitary possibilities (as the below comment has already pointed out)

[–] istewart@awful.systems 10 points 1 month ago (1 children)

"Agentic" is meant to seem sci-fi, but I can't help but think it's terminal business-speak. It's the clearest statement yet of the attempted redesign of the computer from a personal device to a distinct entity separate from oneself. One is no longer a user or administrator, one is instead passively waiting for "agents" to complete a task on one's behalf. This model is imposed from the top down, to be the strongest reinforcement yet of the all-important moat around the big vendors' cloud businesses. Once you're in deep with "agents," your workflows will probably be so hopelessly tangled, vendor-specific, and non-debuggable/non-reimplementable that migrating them to another vendor would be a nightmare task orders of magnitude beyond any database or CRM migration. If your workflows even get any work done anymore at all.

[–] istewart@awful.systems 3 points 1 month ago (1 children)

Oh joy, I can perform a threat display by twirling it around my head like a bolo. I think I will get the pink or bright yellow one

[–] istewart@awful.systems 6 points 1 month ago

ok, cool. when does he start selling off all the super limited-edition anime waifu merch? asking for a friend

[–] istewart@awful.systems 7 points 1 month ago

‘Genetic engineering to merge with machines’ is a stream of words with negative meaning

Iron-compatible osteoblasts that build bio-steel! Synapses with silicon, no, make that graphene neurotransmitter filters in the gap! C'mon, Sam, hire me and we can technobabble so much harder than this!

I must insist on cash payment, though. No stock options. And I prefer to be paid weekly.

[–] istewart@awful.systems 6 points 1 month ago

Occasionally I feel that Altman may be plugged into something that’s even dumber and more under the radar than vanilla rationalism.

I think he exists in the tension between rationalism/transhumanism and what he can get away with selling to the public, and that necessarily means his schtick appears dumber and more incoherent. He's essentially got two major groups he's trying to manipulate simultaneously: true believers and those who have yet to be persuaded. As he runs out of hype on the public-facing side, it's suddenly a desperate scramble to keep the true believers that make up the bulk of his workforce on board. Hence his pivot to marketing his latest and by far most important product: publicly traded shares in OpenAI.

Apropos of nothing, L. Ron Hubbard died in a dingy trailer in Creston. Ever been to Creston? It's a long ways from Hollywood.

[–] istewart@awful.systems 4 points 1 month ago

you know, if those ASML folks in dutchland weren't quite so busy what with their EUV lasers and all that, we might not be in quite this same pickle right now
