Soyweiser

joined 2 years ago
[–] Soyweiser@awful.systems 3 points 21 hours ago* (last edited 21 hours ago)

A hack can also just be a clever way to use a system in a way it wasn't designed for.

Say you put a Ring doorbell on a drone as a perimeter defense thing? A hack. See also the woman who makes bad robots.

It can also be a certain playfulness with tech. Which is why 'hacker' is dead: it cannot survive contact with capitalist forces.

[–] Soyweiser@awful.systems 11 points 1 day ago* (last edited 21 hours ago)

“Why didn’t Iain take my neuroses into account??”

Yes, why didn't he take the neuroses of normal people into account? Normal people who spend 90% of their day worrying about the acausalrobotgod killing everybody.

Strikes me that they have simply never talked to normal people about immortality. Even in a post-scarcity world, a lot of people simply don't feel it would be worth their time to live forever like that.

Edit:

But one of my hobbies is “oppositional reading” – deliberately interpreting novels counter to the obvious / intended reading. And it’s not so clear to me that the Culture is all it is cracked up to be.

This isn't oppositional reading. This is an often-discussed theme in the novels, so much so that the novels have counterarguments for a lot of the regular 'the Culture is bad' arguments.

Anyway the article is so bad I wonder how well this person can read.

Edit part 2, can't let things go shouting electronic booo:

Sociopaths

The novels mention that sociopaths who commit crimes get a 24/7 drone assigned to guard them (more a general criminal thing), while the more megalomaniacal ones get to act out their desires in a virtual world (which I assume runs a lot like the modern game Rust, where a subset of the player base seems to love making 14-year-old boys cry, going by the YouTube videos I saw). If that isn't enough for them, they would need to convince a Mind to help them, because all large machines in the Culture are intelligent. Good luck with that. The Culture is also not something like the Glitter Band of Revelation Space, where somebody can sign a contract to give away their voting rights or the like. So the power of a sociopath is already limited.

not solved alignment

Both things are true: they have solved it, and it wouldn't matter anyway. They have, because any Mind that goes mad gets destroyed (Minds literally need to take care of a large group of humans or go mad; they have a symbiotic relationship with humanity), and any Mind that tries to go foom sublimes instead, because in the Culture universe sublimation is inevitable. (Therein also lies the real dystopian part: given that sublimation is portrayed as so amazing, keeping a whole culture away from a proven heaven seems like an angle to take, and then the deathism would also be an argument, but more in the sense that they let people die without going to heaven. But again, this is not subtext: the Culture not going poof is seen as very weird in-universe, and the only question is why a Yud-equiv Mind doesn't come back to uplift the physical universe.)

A manipulated population

Not subtext, but simply text. Often criticized in various ways. But also a lot of behavior outside of the norm is tolerated, see the lava boat ride (where the only person not tolerated is the one having a 'this is a simulation' break). Or the guy just building a cable system.

Not mentioned:

consent

(This is also why humans are not pets).

[–] Soyweiser@awful.systems 8 points 1 day ago* (last edited 21 hours ago)

It also comes from a mall cop (a very USA sort of concept) who was extremely afraid of getting shot at his job (more so than regular cops at the time), who overreacted massively and wanted all kinds of weird gun attachments, iirc. Sadly this paranoia is something that US cops also suffer from now, causing everybody to suffer.

E: wow I had misremembered how crazy the story was.

[–] Soyweiser@awful.systems 9 points 1 day ago* (last edited 1 day ago)

Hackers is dead. (Apologies to punk)

I'd say that for one reason alone: when Musk claimed Grok was from the Guide, nobody really turned on him.

Unrelated to programmers or hackers: Elon's father (CW: racism) went fully mask off and claims Elon agrees with him. Which, considering Elon's promotion of the UK racists, does not feel off the mark. (And he is spreading the dumb '[Africans] have an [average] IQ of 63' shit, and claims it is all genetic. Sure man, the average African needs help understanding the business end of a hammer. As I said before, guess I met the smartest Africans in the world then, as my university had a few smart exchange students from an African country. If you look at his statements it is even dumber than usual, as he says 'population', which means either non-Black Africans are not included, showing just how much he thinks of himself as the other, or they are included, and the Black African average he implies is even lower.)

[–] Soyweiser@awful.systems 5 points 1 day ago (1 children)

Grok, is this ethical?

[–] Soyweiser@awful.systems 1 points 2 days ago

For a snicker I looked it up: https://iqcomparisonsite.com/iqtable.aspx

One in 100 million, so he would be among the top 80 smartest people alive right now. Which includes the third world, children, the elderly, etc.

[–] Soyweiser@awful.systems 2 points 2 days ago (1 children)

That was in relation to nanotech right? Or am I confusing articles here.

[–] Soyweiser@awful.systems 1 points 2 days ago

Another reason I archived it tbh. Not happy with that. And normally a reason for me to not spread or read it at all.

[–] Soyweiser@awful.systems 3 points 2 days ago (2 children)

The normal max of an IQ test is ~160, and from what I can tell nobody tests above that, basically because it is not relevant. (And I assume measurement error and variance become too big a statistical problem at that level.) Not even sure how rare a 190 IQ would be statistically, probably laughably rare.
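
For what it's worth, a quick back-of-the-envelope sketch, assuming IQ is normally distributed with mean 100 and SD 15 (or 16, which some scales use) and a world population of roughly 8 billion; both assumptions are mine, not from the linked table:

```python
# Rough rarity estimate for IQ 190 under a normal distribution.
# Assumptions (mine): mean 100, SD 15 or 16, ~8 billion people alive.
from scipy.stats import norm

WORLD_POPULATION = 8_000_000_000

for sd in (15, 16):
    p = norm.sf(190, loc=100, scale=sd)  # upper-tail probability, P(IQ >= 190)
    print(f"SD {sd}: about 1 in {1 / p:,.0f}; "
          f"roughly {WORLD_POPULATION * p:.0f} such people worldwide")
```

With SD 16 that comes out to roughly 1 in 100 million, i.e. on the order of 80 people alive, which lines up with the table linked above; with SD 15 it is closer to 1 in a billion. Either way: laughably rare.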

[–] Soyweiser@awful.systems 4 points 3 days ago (1 children)

New trick for detecting bots: ask them for the seahorse emoji. Found via

[–] Soyweiser@awful.systems 6 points 3 days ago

Reality has an anti-robot bias.

 

Via Reddit's sneerclub. Thanks u/aiworldism.

I have called LW a cult incubator for a while now, and while the term has not caught on, it is nice to see more reporting on the problem that LW makes you more likely to join a cult.

https://www.aipanic.news/p/the-rationality-trap is the original link, for the people who don't like archive.is. I used the archive because I don't like Substack and want to discourage its use.

[–] Soyweiser@awful.systems 3 points 3 days ago

As so often happens, this downplays the extent to which the batshit was within them all along.

Yep, I mean the whole thing started because an economics blogger convinced Yud to start his own site, and that blogger is the source of the Great Filter idea (or at least the framing; the idea precedes him). It started with Robin Hanson. This fear of being destroyed before we take our place in the stars has been in the LW community from the start.

 

As found by @gerikson here, more from the anti-anti-TESCREAL crowd: how the antis are actually REPRESENTATIONALism. Ottokar expanded on their idea in a blog post.

Original link.

I have not read the bigger blog post yet btw, just assumed it would be sneerable and posted it here for everyone's amusement. Learn about your own true motives today. (This could be a troll of course, boy does he drop a lot of names and thinks that is enough to link things).

E: alternative title: Ideological Turing Test, a critical failure

 

Original title: 'What we talk about when we talk about risk'. The article explains medical risk and why the polygenic embryo selection people think about it the wrong way. Includes a mention of one of our Scotts (you know the one). Non-archived link: https://theinfinitesimal.substack.com/p/what-we-talk-about-when-we-talk-about

11
submitted 4 months ago* (last edited 4 months ago) by Soyweiser@awful.systems to c/sneerclub@awful.systems
 

Begrudgingly Yeast (@begrudginglyyeast.bsky.social) on bsky informed me that I should read this short story called 'Death and the Gorgon' by Greg Egan, as he has a good handle on the subjects we talk about here. We have talked about Greg before on Reddit.

I was glad I did, so going to suggest that more people here do it. The only complaint you can have is that it gives no real 'steelman' airtime to the subjects it is being negative about. But well, he doesn't have to; he isn't The Guardian. Anyway, not going to spoil it, best to just give it a read.

And if you are wondering, did the lesswrongers also read it? Of course: https://www.lesswrong.com/posts/hx5EkHFH5hGzngZDs/comment-on-death-and-the-gorgon (Warning, spoilers for the story)

(Note: I'm not sure this PDF was intended to be public. I did find it on Google, but it might not be meant to be accessible this way.)

 

The interview itself

Got the interview via Dr. Émile P. Torres on Twitter.

Somebody else sneered: 'Makings of some fantastic sitcom skits here.

"No, I can't wash the skidmarks out of my knickers, love. I'm too busy getting some incredibly high EV worrying done about the Basilisk. Can't you wash them?"

https://mathbabe.org/2024/03/16/an-interview-with-someone-who-left-effective-altruism/

 

Some light sneerclub content in these dark times.

Eliezer compliments Musk on the creation of Community Notes (a project which predates the takeover of Twitter by a couple of years; see the join date: https://twitter.com/CommunityNotes ).

In reaction, Musk admits he never read HPMOR and suggests a watered-down Turing test involving HPMOR.

Eliezer invents HPMOR wireheads in reaction to this.
