this post was submitted on 23 Sep 2023

Futurology

[–] pennomi@lemmy.world 6 points 2 years ago (1 children)

Weird, because you can already run some LLMs locally, even on a phone. That's not a very impressive claim.

[–] morrowind@lemmy.ml 1 points 2 years ago

Not very well, though. The idea is to make them more efficient.
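
The efficiency point above largely comes down to memory. A rough back-of-the-envelope sketch (the 7B parameter count and bit widths here are illustrative assumptions, not tied to any specific model) of why quantization is what makes phone-sized inference plausible:

```python
def weight_memory_gb(n_params: float, bits_per_weight: int) -> float:
    """Approximate memory needed just to hold the model weights, in GB."""
    return n_params * bits_per_weight / 8 / 1e9

seven_b = 7e9  # a common "small" LLM size (assumption for illustration)

# Full-precision-ish weights vs. an aggressively quantized version:
fp16_gb = weight_memory_gb(seven_b, 16)  # ~14 GB: too big for most phones
q4_gb = weight_memory_gb(seven_b, 4)     # ~3.5 GB: feasible on a high-end phone

print(f"7B @ fp16:  {fp16_gb:.1f} GB")
print(f"7B @ 4-bit: {q4_gb:.1f} GB")
```

This only counts the weights; activations and the KV cache add more on top, which is why running "not very well" on a phone today still leaves real room for efficiency gains.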