this post was submitted on 29 Nov 2023
17 points (79.3% liked)
InternetIsBeautiful
These LLMs have no concept of truth or logic. Basically, they automate the generation of statistically likely bullshit. They sound smart only because smart-sounding words and sentences were statistically more likely in the training data. It's like Hinton said: we have automated crap generation.
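To make the "statistically likely" point concrete, here is a toy bigram sketch in Python (my own illustration, not how any real LLM actually works internally): it picks each next word purely by how often it followed the previous word in its training text. Frequency is the only signal; truth appears nowhere.

```python
# Toy bigram generator: next word chosen purely by how often it
# followed the previous word in the training text. No notion of truth.
import random
from collections import defaultdict

corpus = (
    "the model sounds smart because smart words follow smart words "
    "the model has no concept of truth the model generates likely words"
).split()

# Count which words follow which (duplicates encode frequency).
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start, length=8):
    word, out = start, [start]
    for _ in range(length):
        candidates = follows.get(word)
        if not candidates:
            break
        # Sampling from the list is frequency-proportional: the
        # statistically likely continuation wins, true or not.
        word = random.choice(candidates)
        out.append(word)
    return " ".join(out)

print(generate("the"))
```

Real LLMs are vastly more sophisticated, but the objective is the same kind of thing: predict a plausible continuation, not a verified one.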
The fundamental difference between humans and LLMs is that when humans have a question, we look for answers, whether by searching books or the internet or by experimenting, as in research. LLMs, by contrast, simply generate answers without verifying their truthfulness, because truth is not a concept built into them.