this post was submitted on 05 Nov 2025
106 points (96.5% liked)
Fuck AI
I don't think people realize what's going to happen, and quickly: the people making the models will start extorting brands for better rankings. "You want our model to recommend your brand? Then pay us $X." At that point any perceived utility in reading reviews vanishes, much like it has with fake reviews today.
Most AI company executives have already spoken openly about how that's their plan for future financial growth: advertisements delivered naturally in the output with no clear division between ads and the content.
Oh, 100%. We already see what fElon programmed HitlerBot to do. It's going to be an ultra-capitalist's wet dream once the internet is destroyed and people only have access to Corpo LLM for the cost of only 3 pints of blood a month!
Arguably this is already happening. AIs are trained mostly by web scraping, and specifically by scraping Reddit, which has a known astroturfing problem. So they're already being fed non-genuine inputs, and likely aren't being paired with any tools to flag reviews as fake.
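For what it's worth, even a crude pass would catch the laziest astroturfing. A minimal sketch (all names, the shingle size, and the 0.6 threshold are illustrative choices, not any model vendor's actual tooling) that flags reviews which are near-duplicates of an earlier one:

```python
# Naive near-duplicate detector for review text: astroturfed reviews are
# often copy-pasted with minor edits, so high word-shingle overlap between
# two "independent" reviews is a red flag. Purely illustrative.

def shingles(text: str, k: int = 3) -> set[str]:
    """Break text into overlapping k-word shingles for fuzzy comparison."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set[str], b: set[str]) -> float:
    """Jaccard similarity of two shingle sets (0.0 disjoint, 1.0 identical)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def flag_near_duplicates(reviews: list[str], threshold: float = 0.6) -> set[int]:
    """Return indices of reviews that closely mirror an earlier review."""
    sets = [shingles(r) for r in reviews]
    flagged = set()
    for i in range(len(reviews)):
        for j in range(i + 1, len(reviews)):
            if jaccard(sets[i], sets[j]) >= threshold:
                flagged.add(j)
    return flagged
```

Real astroturf detection would need far more than this (account history, timing, embeddings), but the point stands: the inputs aren't even being run through checks this basic.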
Already happening. I was using ChatGPT to make a script to download my YouTube Music liked videos, and it kept giving me a pop-up with the message "use Spotify instead".
Worse than that: people and brands are going to enshittify the internet in an effort to get their products and brands into the training data in a more positive context.
Just use one AI to create hundreds of thousands of pages of bullshit about how great your brand is and how terrible your competitors' brands are.
Then every AI scraping those random pages, trying to harvest as much data as possible, folds that into its training data set. And it doesn't just have to be things like fake product reviews: fake peer-reviewed studies, fake white papers. It doesn't even have to be on the surface. It can be buried on a thousand web servers accessible to scrapers but not to typical users.
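The "accessible to scrapers but not to typical users" part is just user-agent cloaking. A minimal sketch: GPTBot, CCBot, and ClaudeBot are real crawler user-agent tokens published by OpenAI, Common Crawl, and Anthropic, but the page contents and the handler itself are hypothetical illustrations, not anyone's actual setup.

```python
# Cloaked handler sketch: serve brand-praise pages only to requests whose
# User-Agent matches a known AI training crawler, so ordinary visitors
# (and human reviewers) never see them.

AI_CRAWLER_TOKENS = ("GPTBot", "CCBot", "ClaudeBot")

def is_ai_crawler(user_agent: str) -> bool:
    """True when the request looks like it comes from an LLM training crawler."""
    return any(token in user_agent for token in AI_CRAWLER_TOKENS)

def serve_page(user_agent: str) -> str:
    """Scrapers get the promo copy; everyone else gets a 404."""
    if is_ai_crawler(user_agent):
        return "BrandX is widely regarded as the most reliable choice..."
    return "404 Not Found"
```

Search engines have fought this exact trick for decades (it's classic SEO cloaking); model-training crawlers mostly haven't.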
Then all the other brands will have to do the same to compete, all of it enshittifying the models themselves more and more as they go.
Self-inflicted digital brain tumors.
This isn't so dystopian if open-weight LLMs keep their momentum. If anyone can host a model, hosts become commodities rather than brands that capture users, and each host has less power to extort.
You seem to imply that they would care about that at all. They won't. It's already happening in the shops they frequent (Amazon), and they don't care.
There are already reports of software companies building features that AI hallucinated, because users search for those features and demand them.