this post was submitted on 17 Sep 2025
89 points (96.8% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

[–] ignirtoq@fedia.io 7 points 4 hours ago (1 children)

"Users accustomed to receiving confident answers to virtually any question would likely abandon such systems rapidly," the researcher wrote.

While there are "established methods for quantifying uncertainty," AI models could end up requiring "significantly more computation than today’s approach," he argued, "as they must evaluate multiple possible responses and estimate confidence levels."

"For a system processing millions of queries daily, this translates to dramatically higher operational costs," Xing wrote.

  1. They already require substantially more computation than search engines.
  2. They already cost substantially more than search engines.
  3. Their hallucinations make them unusable for any application beyond novelty.

If removing hallucinations means Joe Shmoe isn't interested in asking it questions a search engine could already answer, but it delivers even 1% of the capability promised by all the hype, they would finally have an actual product. The good long-term business move is absolutely to remove hallucinations and add uncertainty. Let's see if any of them actually do it.

[–] DreamlandLividity@lemmy.world 3 points 3 hours ago

They probably would if they could. But removing hallucinations would mean removing the entire AI: the model isn't capable of anything other than hallucinations that happen to be correct sometimes. It can't report confidence either, because that confidence would be hallucinated too.
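To make that concrete, here is a toy sketch of the standard next-token-sampling picture (the tokens and logit values are invented for illustration): a correct answer and a hallucinated one fall out of the exact same softmax-and-sample step, so nothing inside the mechanism distinguishes them.

```python
import math
import random

# Hypothetical logits a model might assign to continuations of "2+2=".
logits = {"4": 2.0, "5": 1.5, "22": 0.5}

# Softmax turns scores into probabilities; sampling picks one token.
z = sum(math.exp(v) for v in logits.values())
probs = {tok: math.exp(v) / z for tok, v in logits.items()}
token = random.choices(list(probs), weights=list(probs.values()))[0]

# Whether `token` is "4" (correct) or "5" (a hallucination), the steps
# above were identical; no internal flag marks one as made up.
print(token, probs)
```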