this post was submitted on 29 Aug 2025
535 points (98.5% liked)

[–] BootLoop@sh.itjust.works 5 points 10 hours ago (5 children)

LLMs, with a little coaxing, perform well at returning well-formed JSON.
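The "well-formed" part doesn't have to rest on coaxing alone; the caller can verify it deterministically and re-prompt on failure. A minimal sketch (the `items`/`qty` order shape is a hypothetical example, not anything from the thread):

```python
import json

def parse_order(raw: str) -> dict:
    """Parse an LLM reply that is supposed to be a JSON order object.

    Raises ValueError if the reply is not well-formed JSON or lacks the
    expected 'items' list, so the caller can re-prompt the model.
    """
    try:
        order = json.loads(raw)
    except json.JSONDecodeError as e:
        raise ValueError(f"not well-formed JSON: {e}") from e
    if not isinstance(order, dict) or not isinstance(order.get("items"), list):
        raise ValueError("well-formed JSON, but not a valid order object")
    return order

# A conforming reply parses cleanly; anything else raises and triggers a retry.
print(parse_order('{"items": [{"name": "water", "qty": 2}]}'))
```

Note that this only checks shape, not sense — which is exactly the distinction raised below.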

[–] Khanzarate@lemmy.world 10 points 10 hours ago (4 children)

They do, my concern is more about if that JSON is correct, not just well-formed.

Also, 18000 waters might be correct JSON, but makes an AI a bad cashier.

[–] staph@sopuli.xyz 6 points 9 hours ago* (last edited 9 hours ago) (3 children)

There is a lot more to it than just being correct. 18000 waters may have been the actual order, because somebody decided to screw with the machine. A human would reliably interpret that as a joke and simply refuse to punch it in. The LLM will likely do whatever the user tells it to, since it has no contextual awareness: it only has the system prompt and whatever interaction it has had with the user so far.
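That gap — well-formed but absurd — is why deployed systems tend to bolt deterministic guardrails onto the model's output rather than trusting its judgment. A hypothetical sanity check (the per-item cap of 20 and the order shape are my assumptions, not from the thread):

```python
MAX_QTY_PER_ITEM = 20  # hypothetical cap; a real system would tune this per menu item

def sanity_check(order: dict) -> list[str]:
    """Return a list of human-readable problems with an order dict.

    An empty list means the order passes; any entry means the order
    should be escalated to a human instead of punched in.
    """
    problems = []
    for item in order.get("items", []):
        name = item.get("name", "?")
        qty = item.get("qty", 0)
        if not isinstance(qty, int) or qty < 1:
            problems.append(f"{name}: invalid quantity {qty!r}")
        elif qty > MAX_QTY_PER_ITEM:
            problems.append(f"{name}: {qty} exceeds cap of {MAX_QTY_PER_ITEM}")
    return problems

# 18000 waters is perfectly valid JSON but fails the deterministic check.
print(sanity_check({"items": [{"name": "water", "qty": 18000}]}))
```

The point of keeping this outside the model is that no amount of prompt-fiddling by the customer can talk a plain `if` statement into taking the joke order.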

[–] tomiant@programming.dev 1 points 6 hours ago* (last edited 6 hours ago) (1 children)

So they just tune the instructions so it doesn't take joke orders and can make more reasonable decisions, like:

"May I take your order?"

"Two double whoppers with extra mayo and a chocolate cherry banana sundae"

"Oh you've GOTTA be joking!"

[–] staph@sopuli.xyz 2 points 6 hours ago

It's trivial to get LLMs to act against their instructions.
