this post was submitted on 29 Jul 2025
29 points (91.4% liked)

Programming


I thought of this recently (anti-LLM content within).

The reason a lot of companies and people are obsessed with LLMs and the like is that they can (so they think) solve some of their problems. What I've noticed is that a LOT of the things they try to force the LLM to fix could be solved with relatively simple programming.

Things like better search (SEO destroyed this by design, and Kagi is about the only easily accessible, usable search engine), organization (use a database), document management, etc.
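To make the "use a database" point concrete: here's a minimal sketch of ranked full-text document search using SQLite's built-in FTS5 extension, which ships with most modern SQLite builds. The table name, column names, and sample documents are all hypothetical, just for illustration — the point is that decent keyword search is a few lines of ordinary code, no LLM involved.

```python
import sqlite3

# In-memory database; FTS5 is bundled with most modern SQLite builds.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE docs USING fts5(title, body)")

# Hypothetical documents standing in for whatever needs organizing.
conn.executemany(
    "INSERT INTO docs (title, body) VALUES (?, ?)",
    [
        ("Quarterly report", "Revenue grew while costs stayed flat."),
        ("Onboarding guide", "New hires should request database access."),
        ("Meeting notes", "Discussed search quality and indexing."),
    ],
)

# Ranked full-text query: FTS5 tokenizes the text and orders hits by relevance.
rows = conn.execute(
    "SELECT title FROM docs WHERE docs MATCH ? ORDER BY rank",
    ("database",),
).fetchall()
print(rows)
```

Running this prints `[('Onboarding guide',)]` — the one document whose text contains "database". Swap the in-memory connection for a file path and you have persistent, searchable document storage.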

People don't fully understand how it all works, so they try to shoehorn the LLM into doing the work for them (poorly), while learning nothing of value.

[–] Blue_Morpho@lemmy.world 4 points 4 days ago (1 children)

Maybe the detail you were searching for could not be found, because it did not actually exist.

He said he clicked the source it quoted.

Maybe if Google hadn't been enshittifying search for 10 years, AI search wouldn't be useful. But I've seen the same thing: the forced Gemini summary at the top of Google often has source links that aren't anywhere on the first page of Google itself.

[–] Kolanaki@pawb.social 2 points 4 days ago (1 children)

And how do you know the source is accurate? Having a source doesn't automatically make it accurate; bullshit can also have sources.

[–] Blue_Morpho@lemmy.world 1 points 4 days ago* (last edited 4 days ago)

The premise of the OP is that classic programming makes AI unnecessary. Getting a bad source out of the classic Google search index isn't a problem with AI.