Soyweiser

joined 2 years ago
[–] Soyweiser@awful.systems 7 points 7 months ago

Well, they think the world is anti-racist now. The hierarchy is the wrong way around. (Have not done a search for 'the progressive stack' yet, but likely it has come up.)

[–] Soyweiser@awful.systems 6 points 7 months ago* (last edited 7 months ago) (2 children)

The unicode stuff amazes me, as that is one of the things that could actually be filtered for. Not doing any input validation: it isn't low-hanging fruit, it is already on the floor. The incompetence...
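(For context, a minimal sketch of the kind of filtering meant here, assuming the unicode trick in question uses invisible code points such as the tag block or zero-width characters to smuggle text past review; the function name and approach are illustrative, not from any particular codebase.)

```python
import unicodedata

# Code points in the Unicode "tag" block (U+E0000-U+E007F) render as
# invisible text but can carry hidden instructions in otherwise
# normal-looking input.
TAG_RANGE = range(0xE0000, 0xE0080)

def strip_suspicious(text: str) -> str:
    """Drop tag-block characters and other format-category (Cf) code
    points, e.g. zero-width spaces, from untrusted input."""
    return "".join(
        ch for ch in text
        if ord(ch) not in TAG_RANGE and unicodedata.category(ch) != "Cf"
    )

# "hello<ZWSP><TAG A>world" comes out as plain "helloworld"
clean = strip_suspicious("hello\u200b\U000E0041world")
```

A real system would normalize and allowlist rather than just strip, but even this much is the "already on the floor" fruit the comment is pointing at.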

[–] Soyweiser@awful.systems 9 points 7 months ago

Not a sneer, but there is this yt'er called the Elephant Graveyard (who I know nothing about apart from these vids) who did a three-part series on Joe Rogan, the downfall of comedy, and hyperreality, which is weirdly relevant, esp. part 3, where suddenly there are some surprise visits.

Part 1: https://www.youtube.com/watch?v=7EuKibmlll4

Part 2: https://www.youtube.com/watch?v=_v3KiaAjpY8

Part 3: https://www.youtube.com/watch?v=ewvRS3NwIlQ

[–] Soyweiser@awful.systems 3 points 7 months ago

If I can't see it, it isn't real. I don't even deny quantum, I deny you, and myself if no mirror is near.

[–] Soyweiser@awful.systems 6 points 7 months ago* (last edited 7 months ago)

My copy of "The Singularity Is Near" also does that, btw.

(E: Still looking to confirm that this isn't just my copy, or if it is common, but when I'm in a library I never think to look for the book, and I don't think I have ever seen the book anywhere anyway. It is the 'our sole responsibility...' quote, no idea which page, but it was early on in the book. 'Yudnowsky'.)

Image and transcript

Transcript: Our sole responsibility is to produce something smarter than we are; any problems beyond that are not ours to solve....[T]here are no hard problems, only problems that are hard to a certain level of intelligence. Move the smallest bit upwards [in level of intelligence], and some problems will suddenly move from "impossible" to "obvious." Move a substantial degree upwards and all of them will become obvious.

—ELIEZER S. YUDNOWSKY, STARING INTO THE SINGULARITY, 1996

Transcript end.

How little has changed; he has always believed intelligence is magic. Also lol at the 'smallest bit'. Not totally fair to sneer at this, as he wrote it when he was 17, but oof, being quoted in a book like this will not have been good for Yudkowsky's ego.

[–] Soyweiser@awful.systems 10 points 7 months ago

https://bsky.app/profile/robertdownen.bsky.social/post/3lwwntxygqc2w Thiel doing a neo-nazi thing. For people keeping score.

[–] Soyweiser@awful.systems 6 points 7 months ago* (last edited 7 months ago) (1 children)

Interesting, wondering if they manage to come further in the process than our gov, which seems to restart the process every few years, and then either discovers nobody wants to do it for a reasonable price (it being building bigger reactors, not the smaller ones, which iirc from a post here are not likely to work out), or the gov falls again over their lies about foreigners and we restart the whole voting cycle again. (It is getting really crazy; our fused green/labour party is now being called the dumbest stuff by the big rightwing liberal party (who are not openly far right, just courting it a lot).)

29 Oct are our new elections. Let's see what the ratio between formation and actually ruling is going to be this time. (Last time it took 223 days for a cabinet to form, and by my calculations they ruled for only 336 days.)

[–] Soyweiser@awful.systems 1 points 7 months ago

You are correct; I'm just thinking they are going to push quantum as the next big thing to drive up stock prices/investments and use it to restart the hopes for AGI. (The LLM method didn't work, so let's talk about quantum and hope that will eventually give us something to latch more capabilities, hope, and stock hype onto.) Just to put my own comment into perspective.

[–] Soyweiser@awful.systems 6 points 7 months ago

Proof that we live in the bad place.

[–] Soyweiser@awful.systems 7 points 7 months ago* (last edited 7 months ago) (2 children)

I think they will just start to make up capabilities; also, with the added capabilities of quantum as a computing paradigm, AGI is back on the menu. Now, thanks to quantum, without all the expensive datacenters and problems. We are gonna put quantum in glasses! VR/augmented-reality quantum AI glasses!

[–] Soyweiser@awful.systems 11 points 7 months ago* (last edited 7 months ago) (1 children)

So, as I have been on a cult-comparison kick lately: how did it work for those doomsday cults when the world didn't end and they picked a new date? Did they become more radicalized or less? (I'm not sure myself; I'd assume the disappointed people leave, and the rest get worse.)

E: ah: https://slate.com/technology/2011/05/apocalypse-2011-what-happens-to-a-doomsday-cult-when-the-world-doesn-t-end.html

... prophecies, per se, almost never fail. They are instead component parts of a complex and interwoven belief system which tends to be very resilient to challenge from outsiders. While the rest of us might focus on the accuracy of an isolated claim as a test of a group’s legitimacy, those who are part of that group—and already accept its whole theology—may not be troubled by what seems to them like a minor mismatch. A few people might abandon the group, typically the newest or least-committed adherents, but the vast majority experience little cognitive dissonance and so make only minor adjustments to their beliefs. They carry on, often feeling more spiritually enriched as a result.

[–] Soyweiser@awful.systems 9 points 7 months ago

Surely they have proof for the already increased coding capabilities. Because increased capabilities is quite something to claim. It isn't just productivity, but capabilities. Can they put a line on the graph where capabilities reach the 'can solve the knapsack problem correctly and fast' point?
