this post was submitted on 14 May 2025
Futurology
Run your LLMs locally if you really want a therapist; you don't need any of the extra crap these companies offer in their online versions.
Can I run anything on a 3090, or do I need a beefier GPU?
That's the best GPU for this use case. Maybe use 2x if you want the best of the best.
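As a rough sketch of why a 3090 works here (this is a rule-of-thumb only, and the function name is made up for illustration): a quantized model needs about params × bits-per-weight / 8 of VRAM for its weights, plus a few GB of headroom for the KV cache and activations, and a 3090 has 24 GB.

```python
def fits_in_vram(params_billions, bits_per_weight=4, vram_gb=24, overhead_gb=2):
    """Rough rule of thumb: weight memory in GB is roughly
    params (billions) * bits_per_weight / 8, plus some headroom
    for KV cache and activations."""
    weights_gb = params_billions * bits_per_weight / 8
    return weights_gb + overhead_gb <= vram_gb

# A 13B model at 4-bit quantization (~6.5 GB of weights) fits easily;
# a 70B model at 4-bit (~35 GB) does not fit on a single 24 GB card.
print(fits_in_vram(13))  # True
print(fits_in_vram(70))  # False
```

This is also why the 2x suggestion matters: two 24 GB cards put the larger 70B-class models within reach at 4-bit quantization.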