scruiser

joined 2 years ago
[–] scruiser@awful.systems 5 points 1 month ago

It is pretty good as a source for science fiction ideas. I mean, lots of their ideas originate from science fiction, but their original ideas would make fun fantasy sci-fi concepts. Like looking off their current front page... https://www.lesswrong.com/posts/WLFRkm3PhJ3Ty27QH/the-cats-are-on-to-something cats deliberately latching on to humans as the laziest way of advancing their own values across the future seems like a solid point of fantasy worldbuilding...

[–] scruiser@awful.systems 5 points 1 month ago

To add to blakestacey's answer, his fictional worldbuilding concept, dath ilan (which he treats like rigorous academic work to the point of citing it in tweets), uses prediction markets in basically everything, from setting government policy to healthcare plans to deciding what restaurant to eat at.

[–] scruiser@awful.systems 4 points 1 month ago (2 children)

Every tweet in that thread is sneerable: either from failing to understand the current scientific process, vastly overestimating how easily cutting-edge research can be turned into cleanly resolvable predictions, or assuming prediction markets are magic.

[–] scruiser@awful.systems 9 points 1 month ago (1 children)

He's the one that used the phrase "silent gentle rape"? Yeah, he's at least as bad as the worst evo-psych pseudoscience misogyny posted on lesswrong, with the added twist he has a position in academia to lend him more legitimacy.

[–] scruiser@awful.systems 7 points 1 month ago* (last edited 1 month ago) (14 children)

He had me in the first half; I thought he was calling out rationalists' problems (even if dishonestly disassociating himself from them). But then his recommended solution was prediction markets (a concept which rationalists have in fact been trying to play around with, albeit at a toy-model level with fake money).

[–] scruiser@awful.systems 8 points 1 month ago* (last edited 1 month ago)

The author occasionally posts to slatestarcodex, we kind of tried to explain what was wrong with Scott Alexander and I think she halfway got it... I also see her around the comments in sneerclub occasionally, so at least she is staying aware of things...

[–] scruiser@awful.systems 11 points 1 month ago (1 children)

~~Poor historical accuracy in favor of meme potential is why our reality is so comically absurd.~~ You can basically use the simulation hypothesis to justify anything you want by proposing some weird motive or goals of the simulators. It almost makes God-of-the-gaps religious arguments seem sane and well-founded by comparison!

[–] scruiser@awful.systems 9 points 1 month ago

Within the world-building of the story, the way the logic is structured makes sense in a ruthless utilitarian way (although Scott's narration and framing is way too sympathetic to the murderously autistic angel that did it), but taken in the context outside the story of the sort of racism Scott likes to promote, yeah it is really bad.

We had a previous discussion of Unsong on the old site. (Kind of cringing about the fact that I liked the story at one point and only gradually noticed all the problematic stuff and the poor writing quality.)

[–] scruiser@awful.systems 15 points 1 month ago* (last edited 1 month ago) (6 children)

I've seen this concept mixed with the simulation "hypothesis". The logic goes that if future simulators are running a "rescue simulation" but only cared (or at least cared more) about the interesting or more agentic people (i.e. rich/white/westerner/lesswronger), they might only fully simulate those people and leave simpler nonsapient scripts/algorithms piloting the other people (i.e. poor/irrational/foreign people).

So basically literally positing a mechanism by which they are the only real people and other people are literally NPCs.

[–] scruiser@awful.systems 13 points 2 months ago

Chiming in to agree: your prediction write-ups aren't particularly good. Sure, they spark discussion, but the whole forecasting/prediction game is one we've seen the rationalists play many times, and it is very easy to overlook or at least undercount your misses and overhype your successes.

In general... I think your predictions are too specific and too optimistic...

[–] scruiser@awful.systems 11 points 2 months ago (1 children)

Every time I see a rationalist bring up the term "Moloch" I get a little angrier at Scott Alexander.

[–] scruiser@awful.systems 4 points 2 months ago

I use the term "inspiring" loosely.
