this post was submitted on 14 May 2024
318 points (97.6% liked)

Technology
you are viewing a single comment's thread
[–] Warjac@lemmy.world 144 points 1 year ago (45 children)

Headline fix: Google kills the one good thing it has going for it with AI

[–] lemmyvore@feddit.nl 27 points 1 year ago* (last edited 1 year ago) (4 children)

They don't really have a choice. Classic web search will be useless in the near future because of the rapid rise of LLM-generated pages. Already, for some searches, one out of three results is generated crap.

Their only hope is that somehow they'll be able to weed out LLM pages with LLMs. Researchers say that's impossible, because LLMs degrade when trained on LLM output, so they won't be able to reliably tell which content is good.
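To see why detection is so shaky, here's a toy sketch of one heuristic sometimes proposed for flagging LLM text: "burstiness", the idea that human writing varies sentence length more than LLM output does. Everything here (the threshold, the example strings) is an illustrative assumption, not a real detector; the point is how crude and easy to fool such signals are.

```python
import re
import statistics

def burstiness(text: str) -> float:
    """Variance of sentence lengths in words; higher is supposedly 'more human'."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    if len(lengths) < 2:
        return 0.0
    return statistics.pvariance(lengths)

def looks_generated(text: str, threshold: float = 4.0) -> bool:
    # Hypothetical cutoff -- in practice any fixed threshold misfires
    # constantly, which is exactly the reliability problem above.
    return burstiness(text) < threshold

uniform = "The cat sat down. The dog ran off. The bird flew up."
varied = "Stop. The dog, startled by the noise, bolted straight across the yard. Why?"
```

Here `looks_generated(uniform)` is `True` and `looks_generated(varied)` is `False`, but a human writing plainly or an LLM prompted to vary its style flips either result, so a heuristic like this can't clean search results at scale.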

The fact that they're even trying shows how desperate they are.

[–] wagoner 15 points 1 year ago

If they can't direct me to the right website because they can't tell what's LLM junk, then how will they summarize an answer for me based on those same websites? LLM summaries don't seem like a way around that issue at all.
