
Pulse of Truth

Cyber security news and links to cyber security stories that could make you go hmmm. The content is exactly as it is consumed through RSS feeds and won't be edited (except for the occasional encoding error).

This community is automagically fed by an instance of Dittybopper.


AI researchers at Andon Labs embedded various LLMs in a vacuum robot to test how ready they were to be embodied. And hilarity ensued.

[–] pennomi@lemmy.world 5 points 3 days ago (1 children)

Well yeah, LLMs, unlike bodies, aren’t punished for doing stupid things.

If it spins around in circles arguing with itself, it has still accomplished its purpose - to generate text. But a real body penalizes you for wasting time when looking for food.

[–] nymnympseudonym@piefed.social 1 points 3 days ago (1 children)

Yes they are punished; have you not heard of backprop?
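
Backprop here refers to a training-time mechanism: a loss is computed on the model's output and its gradient nudges the weights away from the bad behaviour. A minimal sketch of that update, assuming a PyTorch-style setup (the toy model, data, and learning rate are illustrative, not anything from the post):

```python
import torch
import torch.nn as nn

# Toy stand-in for an LLM: one linear layer producing next-token logits.
model = nn.Linear(16, 16)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(4, 16)                # fake hidden states
target = torch.randint(0, 16, (4,))   # fake "correct" next tokens

logits = model(x)
loss = loss_fn(logits, target)        # high loss = "doing something stupid"
loss.backward()                       # backprop: gradients of the loss
optimizer.step()                      # the "punishment": weights shift to reduce it
optimizer.zero_grad()
```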

[–] pennomi@lemmy.world 3 points 3 days ago

Sure, but that's not equivalent to muscular expenditure - every action costs energy, so animals learn to be efficient with their movements and thoughts. Also, an LLM cannot re-adjust its weights in real time the way a brain can.
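
The real-time point can be made concrete: a deployed model typically generates with gradients disabled, so however wasteful an individual action is, the weights stay exactly where training left them. A sketch under the same illustrative PyTorch assumptions as above:

```python
import torch
import torch.nn as nn

model = nn.Linear(16, 16)   # stand-in for a trained, deployed model
model.eval()

with torch.no_grad():       # inference: gradients off, weights frozen
    logits = model(torch.randn(1, 16))
    action = logits.argmax(dim=-1)

# No matter how inefficient that "action" was, nothing feeds back into the
# weights here; learning from it would require a separate training pass,
# unlike an animal adjusting on the fly as it burns energy.
```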