diz

joined 2 years ago
[–] diz@awful.systems 6 points 3 weeks ago* (last edited 3 weeks ago) (4 children)

Yeah, a new form of apologism that I started seeing online is “this isn’t a bubble! Nobody expects an AGI, it’s just Sam Altman; it will all pay off nicely from 20 million software developers worldwide spending a few grand a year each”.

Which is next-level idiotic, besides the numbers just not adding up. There’s only so much open source to plagiarize. It is a very niche activity! It’ll plateau, and then a few months later tiny single-GPU models will catch up to this river-boiling shit.
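
Back-of-envelope sketch of the “numbers not adding up” part, taking the apologists’ own figures at face value; the spend-per-developer and capex numbers below are rough illustrative assumptions, not an accounting:

```python
# Back-of-envelope: best-case revenue from the apologists' own scenario vs. the build-out.
developers = 20_000_000          # "20 million software developers worldwide"
spend_per_dev = 2_000            # "a few grand a year each", in USD (assumed)
revenue_ceiling = developers * spend_per_dev

annual_capex = 300_000_000_000   # assumed ballpark for the industry's annual AI build-out

print(f"Best-case revenue: ${revenue_ceiling / 1e9:.0f}B / year")   # ~$40B
print(f"Assumed capex:     ${annual_capex / 1e9:.0f}B / year")
print(f"Gap:               ${(annual_capex - revenue_ceiling) / 1e9:.0f}B / year")
```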

The answer to that has always been the singularity bullshit, where the biggest models just keep staying ahead by such a large factor that nobody uses the small ones.

[–] diz@awful.systems 15 points 3 weeks ago* (last edited 3 weeks ago) (10 children)

Lol, I literally told these folks, something like 15 years ago, that paying to elevate a random nobody like Yudkowsky as the premier “AI risk” researcher, insofar as there is any AI risk, would only increase it.

Boy did I end up more right about that than in my most extreme imagination. All the moron has accomplished in life is helping these guys raise cash with all his hype about how powerful the AI would be.

The billionaires who listened are spending hundreds of billions of dollars - soon to be trillions, if not already - on trying to prove Yudkowsky right by having an AI kill everyone. They literally tout “our product might kill everyone, idk” to raise even more cash. The only saving grace is that it is dumb as fuck and will only make the world a slightly worse place.

[–] diz@awful.systems 3 points 3 weeks ago* (last edited 3 weeks ago)

To be entirely honest, I don’t even like the arguments against EDT (evidential decision theory).

Smoking lesion is hilarious. So there’s a lesion that is making people smoke. It is also giving them cancer in some unrelated way we don’t know about, trust me bro. Please bro, don’t leave this decision to the lesion, you gotta decide to smoke; it would be irrational to decide not to smoke if the lesion’s gonna make you smoke. Correlation is not causation, gotta smoke, bro.

Obviously, in that dumb-ass hypothetical, the conditional probability that matters is conditioned on the decision, not on the lesion, while the smoking in the cancer cases is conditioned on the lesion, not on the decision. If those two really were indistinguishable, then the right decision would be not to smoke. And more generally, adopting causal models without statistical data to back them up is called “being gullible”.

The tobacco companies actually did manufacture the data, too; that’s where “type-A personality” comes from.
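
If you want the distinction spelled out, here’s a toy simulation (all the probabilities are made up purely for illustration): when the lesion drives both the smoking and the cancer, conditioning on “smokes” shows a big cancer gap; condition instead on a decision that is independent of the lesion and the gap vanishes, which is exactly the difference the hypothetical asks you to wave away:

```python
import random

random.seed(0)
N = 200_000

def trial(decision_driven):
    """One person. Cancer depends only on the lesion; smoking depends either on
    the lesion (as in the story) or on a decision independent of the lesion."""
    lesion = random.random() < 0.3
    if decision_driven:
        smokes = random.random() < 0.5          # free decision, independent of the lesion
    else:
        smokes = lesion                         # the lesion "makes" you smoke
    cancer = random.random() < (0.8 if lesion else 0.01)
    return smokes, cancer

for label, decision_driven in [("smoking driven by the lesion", False),
                               ("smoking driven by an independent decision", True)]:
    people = [trial(decision_driven) for _ in range(N)]
    smokers = [c for s, c in people if s]
    nonsmokers = [c for s, c in people if not s]
    print(f"{label}:")
    print(f"  P(cancer | smokes)        = {sum(smokers) / len(smokers):.2f}")
    print(f"  P(cancer | doesn't smoke) = {sum(nonsmokers) / len(nonsmokers):.2f}")
```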

[–] diz@awful.systems 5 points 4 weeks ago* (last edited 4 weeks ago) (3 children)

Tbh, whenever I try to read anything on decision theory (even written by people other than rationalists), I end up wondering how they think a redundant autopilot (with majority vote) would ever work. In an airplane, that is.

Considering just the physical consequences of a decision doesn’t work (unless there’s a fault, the consequences don’t make it through the voting electronics, so the alternative decisions made for the no-fault case never make it through).

Each one simulating the two or more other autopilots is scifi-brained idiocy. Requiring that the autopilots be exact copies is stupid (what if we had two different teams write different implementations? I think Airbus actually sort of did that).

Nothing is going to be simulating anything, and to make matters even worse for philosophers, amateur and academic alike, the whole reason for redundancy is that sometimes there is a glitch that makes them not compute the same values, so any attempt to be clever with “ha, we just treat the copies as one thing” doesn’t cut it either.
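
For reference, the thing that has to be explained is about this simple (a toy sketch, nothing like real avionics; the channel behaviours and numbers are made up): dissimilar implementations each compute a command from the same inputs, and a median voter discards the odd one out, no simulation of anyone by anyone required:

```python
from statistics import median

# Toy triplex autopilot: three dissimilar implementations of the same control law.
def channel_a(altitude_error):
    return 0.5 * altitude_error                 # team A's implementation

def channel_b(altitude_error):
    return 0.5 * altitude_error + 0.001         # team B's, numerically slightly different

def channel_c(altitude_error):
    return 999.0                                # a faulted channel producing garbage

def voter(commands):
    # Median voting: a single glitching channel is simply outvoted.
    return median(commands)

altitude_error = 12.0
commands = [ch(altitude_error) for ch in (channel_a, channel_b, channel_c)]
print(voter(commands))   # ~6.0; the garbage value never reaches the actuators
```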

[–] diz@awful.systems 7 points 4 weeks ago* (last edited 4 weeks ago) (1 children)

Even to the extent that they are "prompting it wrong" it's still on the AI companies for calling this shit "AI". LLMs fundamentally do not even attempt to do cognitive work (the way a chess engine does by iterating over possible moves).
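
For contrast, this is the sort of thing I mean by actually attempting the work: even a toy game-tree search (a minimal sketch over a made-up stand-in “game”, not a real chess engine) enumerates the possible moves and evaluates their consequences before picking one:

```python
def negamax(state, depth, moves, apply_move, evaluate):
    """Toy game-tree search: iterate over the legal moves, recursively score the
    resulting positions, and return the best score for the side to move."""
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state)
    best = float("-inf")
    for move in legal:
        score = -negamax(apply_move(state, move), depth - 1, moves, apply_move, evaluate)
        best = max(best, score)
    return best

# Trivial stand-in "game": the state is a number, each move adds or subtracts 1,
# and evaluate() just returns the number (purely to have something to score).
best_score = negamax(
    state=0,
    depth=4,
    moves=lambda s: [+1, -1],
    apply_move=lambda s, m: s + m,
    evaluate=lambda s: float(s),
)
print(best_score)
```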

Also, LLM tools do not exist. All you can get is a sales demo for the company stock (the actual product being sold), built to show off how close to AGI the company supposedly is. You have to creatively misuse these things to get any value out of them.

The closest they get to tools is “AI coding”, but even then, these things plagiarize code you don’t even want plagiarized (because it’s MIT-licensed and you’d rather keep up with upstream fixes).

[–] diz@awful.systems 5 points 1 month ago* (last edited 1 month ago)

But just hear me out: if you delete your old emails, you won’t be roped into paying for extra space, and Microsoft or Google will have a little less money to buy water with!

Switch to Linux and avoid using any Microsoft products to conserve even more water.

[–] diz@awful.systems 3 points 1 month ago* (last edited 1 month ago)

Well yeah, but the new-age ones overthink everything. Edit: I suspect you could probably find one of them spelling it out.

[–] diz@awful.systems 15 points 1 month ago (6 children)

The problem is that to start breaking encryption you need quantum computing with a bunch of qubits as originally defined, not “our lawyer signed off on the claim that we have 1000 qubits”.
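
Rough sense of scale, using the standard textbook example (Shor’s algorithm against RSA-2048; the 2n+3 logical-qubit figure is Beauregard’s construction, while the surface-code overhead formula and the code distance below are ballpark assumptions, with the exact constants varying from paper to paper):

```python
# Ballpark: what breaking RSA-2048 with Shor's algorithm actually demands,
# versus a press-release qubit count.
n = 2048                        # RSA modulus size, in bits
logical_qubits = 2 * n + 3      # Beauregard-style circuit, order-of-magnitude figure

code_distance = 27              # assumed surface-code distance for realistic error rates
physical_per_logical = 2 * code_distance ** 2   # ~2d^2 physical qubits per logical qubit
physical_qubits = logical_qubits * physical_per_logical

print(f"Logical (error-corrected) qubits: ~{logical_qubits:,}")
print(f"Physical qubits:                  ~{physical_qubits:,}")   # millions, not 1000
```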

[–] diz@awful.systems 4 points 1 month ago

I wonder if the weird tags are even strictly necessary, or if a sufficiently strongly worded and repetitive message would suffice.

[–] diz@awful.systems 4 points 1 month ago* (last edited 1 month ago) (2 children)

Embryo selection may just be the eugenicist's equivalent of greenwashing.

Eugenicists doing IVF is kind of funny, since it is a procedure that circumvents natural selection quite a bit, especially for the guys. It’s what, something like a billion to one for the sperm?

If they’re doing IVF while being into eugenics, they need someone to tell them that they aren’t “worsening the species”, and the embryo selection provides just that.

edit: The worst part would be if people who don’t need IVF start doing IVF with embryo selection, expecting some sort of benefit for the offspring. With the American tendency to sell people unnecessary treatments and procedures, I can totally see that happening.

[–] diz@awful.systems 6 points 1 month ago* (last edited 1 month ago)

I think I have a real example: non-hierarchical (or, at least, less hierarchical) arrangements. Anarchy is equated with chaos.

We ascribe a hierarchy to anything in nature: ants and other hymenoptera, and termites, have supposed “queens”; parent wolves are “alphas”; and so on. Fictional ant-like aliens have brain bugs, or cerebrates, or the like. Even the fucking zombies infected with a variant of the rabies virus get alphas somehow.

Every effort has gone into twisting every view of reality and every fiction to align with the ideology.

[–] diz@awful.systems 9 points 1 month ago* (last edited 1 month ago)

I think it’s a mixture of it being cosplay and these folks being extreme believers in capitalism, in its inevitability and the impossibility of any alternative. They are all successful grifters, and they didn’t get there through scheming and clever deception; they got there through sincere beliefs that aligned with the party line.

They don't believe that anything can actually be done about this progression towards doom, just as much as they don't properly believe in the doom.
