Run your LLMs locally if you really want a therapist; you don't need any of the extra crap these companies offer in their online versions.
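If you want to try it, here's a minimal sketch using llama-cpp-python (the model path and quant name are placeholders, point it at whatever GGUF file you've downloaded):

```python
# pip install llama-cpp-python (build with CUDA support for GPU offload)
from llama_cpp import Llama

# Model path is a placeholder -- use any GGUF model file you have locally.
llm = Llama(
    model_path="./models/some-13b-model.Q4_K_M.gguf",
    n_ctx=4096,       # context window
    n_gpu_layers=-1,  # offload all layers to the GPU; lower this if you run out of VRAM
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "I've had a rough week. Can we talk?"}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```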
Can I run anything on a 3090, or do I need a beefier GPU?
I'm on a 3060 or so and it works decently depending on the model. I can generally get away with around 13B parameters, or some 20B+ models at Q4 quantization, but they get really slow by that point.
It's a lot of messing around to find something that performs decently while not being so limited that it gets crazily repetitive or says loony things.
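For a rough sense of what fits: a Q4 quant works out to about 4.5 bits per weight for the weights alone, plus a couple of GB for KV cache and overhead. A back-of-the-envelope estimate (approximate numbers, not a guarantee):

```python
def approx_vram_gb(params_b: float, bits_per_weight: float = 4.5, overhead_gb: float = 2.0) -> float:
    """Very rough VRAM estimate: weights plus a flat allowance for KV cache/overhead."""
    return params_b * bits_per_weight / 8 + overhead_gb  # params in billions ~= GB at 8 bits

for name, params in [("13B", 13), ("20B", 20), ("33B", 33)]:
    print(f"{name} @ Q4: ~{approx_vram_gb(params):.1f} GB")
# ~9.3 GB, ~13.2 GB, ~20.6 GB -- so 13B fits a 12 GB 3060,
# and a 24 GB 3090 has headroom for 20B+ and even ~33B Q4 models.
```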