this post was submitted on 23 Sep 2023
17 points (94.7% liked)
Futurology
you are viewing a single comment's thread
Weird, because you can already run some LLMs locally, even on a phone. That's not a very impressive claim.
Not very well, though; the idea is to make them more efficient.
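For context, the kind of local inference the first comment describes usually relies on heavily quantized models. Below is a minimal sketch using the llama-cpp-python bindings; the model filename is a placeholder for any small quantized GGUF model downloaded beforehand, not a specific release.

```python
# A minimal sketch of local LLM inference with llama-cpp-python
# (pip install llama-cpp-python). The model path is a placeholder.
from llama_cpp import Llama

# Quantized GGUF models (e.g. 4-bit) are what make phone- or
# laptop-scale inference feasible: they trade some accuracy for
# a much smaller memory footprint.
llm = Llama(model_path="models/tinyllama-q4.gguf", n_ctx=512)

# Run a short completion and print the generated text.
output = llm("Q: Can an LLM run on a phone? A:", max_tokens=64)
print(output["choices"][0]["text"])
```

On a phone, the same idea applies through an app or framework that bundles a quantized model; the trade-off the second comment points at is exactly this accuracy-for-efficiency one.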