this post was submitted on 26 Jan 2026

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Want to wade into the snowy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this. What a year, huh?)

[–] gerikson@awful.systems 11 points 2 months ago (19 children)

enjoy this glorious piece of LW lingo

Aumann's agreement is pragmatically wrong. For bounded levels of compute you can't necessarily converge on the meta level of evidence convergence procedures.

src

no I don't know what it means, and I don't want it explained to me. Just let me bask in its inscrutability.

[–] mawhrin@awful.systems 11 points 2 months ago

retains the same informational content after running through rot13

[–] istewart@awful.systems 9 points 2 months ago

oh man, it's Aumann's

[–] rook@awful.systems 11 points 2 months ago (1 children)

Moltbook was vibecoded nonsense without the faintest understanding of web security. Who’d have thought.

https://www.404media.co/exposed-moltbook-database-let-anyone-take-control-of-any-ai-agent-on-the-site/

(Incidentally, I’m pretty certain the headline is wrong… it looks like you cannot take control of agents which post to moltbook, but you can take control of their accounts, and post anything you like. Useful for pump-and-dump memecoin scams, for example)

O’Reilly said that he reached out to Moltbook’s creator Matt Schlicht about the vulnerability and told him he could help patch the security. “He’s like, ‘I’m just going to give everything to AI. So send me whatever you have.’”

(snip)

The URL to the Supabase and the publishable key were sitting on Moltbook’s website. “With this publishable key (which Supabase advises not be used to retrieve sensitive data), every agent's secret API key, claim tokens, verification codes, and owner relationships were all sitting there completely unprotected for anyone who visited the URL,” O’Reilly said.

(snip)

He said the security failure was frustrating, in part, because it would have been trivially easy to fix. Just two SQL statements would have protected the API keys. “A lot of these vibe coders and new developers, even some big companies, are using Supabase,” O’Reilly said. “The reason a lot of vibe coders like to use it is because it’s all GUI driven, so you don’t need to connect to a database and run SQL commands.”
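For context on what those “two SQL statements” likely refer to: Supabase's publishable (anon) key is only safe to expose when Postgres row-level security is enabled on the tables it can reach. A minimal sketch of the kind of fix O’Reilly describes, using a hypothetical `agents` table and column names (the actual Moltbook schema isn't public):

```sql
-- Hypothetical table/column names. Once RLS is enabled, Postgres denies
-- all access through the publishable (anon) key unless a policy allows it.
alter table public.agents enable row level security;

-- Optionally let each owner read only their own row; secret columns
-- stay invisible to everyone else.
create policy "agents_owner_read" on public.agents
  for select using (auth.uid() = owner_id);
```

`auth.uid()` is Supabase's helper returning the authenticated user's id; with RLS on and no matching policy, the exposed key simply returns nothing.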

[–] BlueMonday1984@awful.systems 11 points 2 months ago

Daniel Stenberg has written the cURL bug bounty's obituary, and discussed his plans for dealing with the slop-nami going forward.

[–] V0ldek@awful.systems 11 points 2 months ago

Excellent BSky sneer about the preposterous "free AI training" the Brits came up with. 10/10, quality sneer.

[–] blakestacey@awful.systems 10 points 2 months ago (8 children)

Is Pee Stored in the Balls? Vibe Coding Science with OpenAI's Prism

https://bsky.app/profile/carlbergstrom.com/post/3mdgtf2e6vc2c

[–] blakestacey@awful.systems 10 points 2 months ago

ChatGPT is using Grokipedia as a source, and it’s not the only AI tool to do so. Citations to Elon Musk’s AI-generated encyclopedia are starting to appear in answers from Google’s AI Overviews, AI Mode, and Gemini, too. [...] When it launched, the bulk of Grokipedia’s articles were direct clones of Wikipedia, though many others reflected racist and transphobic views. For example, the article about Musk conveniently downplays his family wealth and unsavory elements of their past (like neo-Nazi and pro-Apartheid views), and the entry for “gay pornography” falsely linked the material to the worsening of the HIV/AIDS epidemic in the 1980s. The article on US slavery still contains a lengthy section on “ideological justifications,” including the “Shift from Necessary Evil to Positive Good.” [...] “Grokipedia feels like a cosplay of credibility,” said Leigh McKenzie, director of online visibility at Semrush. “It might work inside its own bubble, but the idea that Google or OpenAI would treat something like Grokipedia as a serious, default reference layer at scale is bleak.”

https://www.theverge.com/report/870910/ai-chatbots-citing-grokipedia

The entire AI industry is using the Nazi CSAM machine for training data.

[–] Soyweiser@awful.systems 10 points 2 months ago* (last edited 2 months ago) (1 children)

Finally read Greg Egan's Permutation City, then looked at the LW discussion of the book, and I feel like they missed a lot of things (and why are they talking about the 'dust theory' like it is real?). Oof, the Wikipedia page on the themes and setting has a similar problem, however. It is like they all ignored the second half of the book and the themes contained within it; not one mention of the city being called Elysium, for example.

And of course Zack_M_Davis thinks the plotline of the guy who killed the drug dealer (not a sex worker/prostitute, LW people; the text makes this pretty clear) he was in a lowkey romantic relationship with should have been cut (also not mentioned on the Wikipedia page, iirc).

Also, nobody seems to talk about the broken sewage pipe.

[–] saucerwizard@awful.systems 10 points 2 months ago

OT: Insurance wrote my truck off. 16k in damage, like holy shit balls.

[–] rook@awful.systems 9 points 2 months ago (3 children)

Just seen a clip of aronofsky’s genAI revolutionary war thing and it is incredibly bad. Just… every detail is shit. Ways in which I hadn’t previously imagined that the uncanny valley would intrude. Even if it weren’t for the simulated flesh golems, one of whom seems to be wearing anthony hopkins’ skin as a clumsy disguise, the framing and pacing just feel like the model was trained on endless adverts and corporate talking-head videos, and either it was impossible to edit, or none of the crew have any idea what even mediocre films look like.

I also hadn’t appreciated before that genai lip sync/dubbing was just embarrassing. I think I’ve only seen a couple of very short genai video clips before, and the most recent at least 6 months ago, but this just seems straight up broken. Have the people funding this stuff ever looked at what is being generated?

https://bsky.app/profile/ethangach.bsky.social/post/3mdljt2wdcs2v

[–] fiat_lux@lemmy.world 9 points 2 months ago

Who needs pure AI model collapse when you can have journalists give it a more human touch? I caught this snippet from the Australian ABC about the latest Epstein files drop:

[screenshot: ABC result in a Google search, listing the wrong Boris for the search term '23andme Boris nikolic']

The Google AI summary does indeed highlight Boris Nikolić the fashion designer if you search for only that name. But I'm assuming this journalist was using ChatGPT, because the Google summary very prominently lists his death in 2008. And it's surprisingly correct! A successful scraping of Wikipedia by Gemini, amazing.

But the Epstein email was sent in 2016.

Does the journalist perhaps think it's more likely the Boris Nikolić who is a biotech VC, former advisor to Bill Gates, and named in Epstein's will as the "successor executor"? Info literally all in the third Google result, even in the woeful state of modern Google. Pushed below the fold by the AI feature about the wrong guy, but not exactly buried enough for a journalist to have any excuse.

[–] mirrorwitch@awful.systems 9 points 2 months ago* (last edited 2 months ago) (2 children)

OT: the respiratory illness I've had for five days finally tested positive for Covid just now.

My symptoms are fairly mild, probably because I got a vaccine booster three months ago. But I'm trying to learn more about these recent "swallowing razors" variants and dang! the online situation is bad. Finding reliable medical information on the post-slop, post-Trump Internet is a nightmare.

[–] nightsky@awful.systems 9 points 2 months ago

Have a quick recovery! It sucks that society has collectively given up on trying to mitigate its spread.
