this post was submitted on 08 Aug 2025
8 points (72.2% liked)

Thoughtful Discussion

[–] m_f@discuss.online -3 points 2 days ago (1 children)

It's going to be tough to explore this through internet comments, but that just raises the question of "what do you mean by thought and intelligence?", which then turns into "what do you mean by understanding?" and lots of other similar questions, down a deep rabbit hole. I don't think it's really possible to make strong statements either way until we've come up with a more coherent theory underlying basic terms like that. I'd love to see a rigid and objective definition that we can measure LLMs against.

[–] jet@hackertalks.com 7 points 2 days ago (1 children)

LLMs generate tokens based on probabilities - they do not create thoughts that they can perform discrete logic with.

The chatbots are deceptive because you can ask questions with discrete logic requirements and they answer convincingly well, but that's only because their training data contained many such questions; it's still just token generation.
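To make the "token generation" point concrete, here's a toy sketch of the sampling step. The vocabulary and probabilities are entirely made up for illustration; a real model computes a distribution over tens of thousands of tokens from its weights, but the final step is still just drawing from a probability distribution:

```python
import random

# Toy next-token distribution after some prompt — invented numbers,
# just to show the mechanism, not any real model's output.
next_token_probs = {"sat": 0.5, "ran": 0.3, "is": 0.2}

def sample_next_token(probs):
    """Draw one token according to its probability mass.
    At its core, this is all a single generation step does."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return random.choices(tokens, weights=weights, k=1)[0]

print(sample_next_token(next_token_probs))
```

Nothing in that loop represents a proposition the model can then apply discrete logic to; it's a weighted draw, repeated once per token.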

If you've never played with an old-school "chat with ELIZA" bot, it's worth the effort. LLMs are just that, supercharged: there has to be some input to train on to produce the response.

Of course people are trying to glue math and discrete algebraic systems on top of LLM output, but that still does not solve the problem of artificial general intelligence.

[–] m_f@discuss.online 2 points 1 day ago (1 children)

Why don't they "create thoughts"? I mentioned this in another comment, but most discussions around AI are people talking past each other because they use the same words to mean different things.

It might seem absurd, but it's a lot harder to define words like "thought" than you'd think, because the definition often just leads to more questions. Wikipedia, for example, says "In their most common sense, they are understood as conscious processes that can happen independently of sensory stimulation," but then what does "conscious" mean? Until we have a rigid definition for words like that all the way down to first principles, I wouldn't agree with definitive statements.

ELIZA is fundamentally different from an LLM, though; it's much closer to an expert system.

[–] jet@hackertalks.com 2 points 1 day ago

I see what you're doing, but you're asking for too much formalism in a casual context. Defining the entire vocabulary from first principles would be a non-trivial task; it's so daunting I don't even want to attempt it here.