[–] errer@lemmy.world 11 points 2 months ago (10 children)

Run your LLMs locally if you really want a therapist; you don’t need any of the extra crap the companies offer in their online versions.
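For example, here's a minimal local chat loop using the llama-cpp-python bindings (just a sketch; the model path and system prompt are placeholders, and any GGUF chat model will do):

```python
# Minimal local chat loop with llama-cpp-python (pip install llama-cpp-python).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,        # context window size
    n_gpu_layers=-1,   # offload every layer to the GPU if one is available
)

history = [{"role": "system", "content": "You are a supportive listener."}]  # placeholder prompt
while True:
    history.append({"role": "user", "content": input("> ")})
    reply = llm.create_chat_completion(messages=history)
    answer = reply["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": answer})
    print(answer)
```

Everything stays on your own machine: no account, no telemetry, no usage limits.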

[–] bobotron@lemm.ee 1 points 2 months ago (9 children)

Can I run anything on a 3090, or do I need a beefier GPU?

[–] SchizoDenji@lemm.ee 1 points 2 months ago

That's the best GPU for this use case. Maybe use 2x if you want the best of the best.
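A 3090 has 24 GB of VRAM, which comfortably fits roughly 8B-30B models at 4-bit quantization. With two cards you can split a bigger model across both; a sketch using llama-cpp-python's tensor_split option (model path is again a placeholder):

```python
# Hypothetical dual-GPU setup with llama-cpp-python; the model path is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-3-70b-instruct.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=-1,          # offload all layers; a 4-bit 70B won't fit on one 3090
    tensor_split=[0.5, 0.5],  # share the weights evenly across two GPUs
)
```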
