this post was submitted on 28 Jan 2025
277 points (89.2% liked)

[–] Mora@pawb.social 1 points 6 months ago (2 children)

As someone who is rather new to the topic: I have a GPU with 16 GB VRAM and only recently installed Ollama. Which size should I use for Deepseek R1? 🤔

[–] hmmm@sh.itjust.works 3 points 6 months ago

You can try them from smallest to biggest. You can probably run the biggest one too, but it will be slow.
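
A minimal sketch of that "start small, work up" approach using the official `ollama` Python package; the tags are the DeepSeek R1 distills published on Ollama's model registry, and which sizes you actually try is an assumption here, not a recommendation:

```python
# A minimal sketch, assuming the `ollama` Python package (pip install ollama)
# and the DeepSeek R1 distill tags published on Ollama's registry.
import ollama

# Try sizes from smallest to largest; stop when responses get too slow.
for tag in ["deepseek-r1:7b", "deepseek-r1:14b", "deepseek-r1:32b"]:
    ollama.pull(tag)  # download the quantized weights if they are not already local
    reply = ollama.chat(
        model=tag,
        messages=[{"role": "user", "content": "Reply with one short sentence."}],
    )
    print(tag, "->", reply["message"]["content"])
```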

[–] kyoji@lemmy.world 2 points 6 months ago

I also have 16 GB VRAM and the 32B version runs OK. Anything larger would take too long, I think.
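
That matches a rough back-of-the-envelope estimate; the numbers below assume Ollama's default ~4-bit quantization and ignore the KV cache and runtime overhead, so treat them as ballpark figures only:

```python
# Back-of-the-envelope VRAM check, assuming ~4-bit quantization
# (roughly 0.5 bytes per weight, ignoring KV cache and overhead).
for tag, n_params in {"7b": 7e9, "14b": 14e9, "32b": 32e9}.items():
    weights_gb = n_params * 0.5 / 1e9
    print(f"deepseek-r1:{tag}: ~{weights_gb:.0f} GB of weights")

# 7b (~4 GB) and 14b (~7 GB) fit comfortably in 16 GB of VRAM; 32b (~16 GB) is
# borderline, so part of it usually spills over to CPU RAM, which is why it
# still runs, just slowly.
```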