this post was submitted on 06 Aug 2025
15 points (80.0% liked)

top 7 comments
[–] jeena@piefed.jeena.net 13 points 22 hours ago (1 children)

I tried the 20B on my PC with Ollama and Open WebUI, and I have to say that for some of my use cases it performs similarly to the online version. I was impressed.
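For anyone who wants to try the same setup, here's a minimal sketch of querying a locally served model through the Ollama Python client. It assumes `ollama serve` is running and the 20B model has already been pulled; the `gpt-oss:20b` tag is my assumption for the model name.

```python
# Minimal sketch: chat with a locally served model via the Ollama Python client.
# Assumes `pip install ollama`, a running Ollama server, and that the 20B model
# has already been pulled (the "gpt-oss:20b" tag is an assumption).
import ollama

response = ollama.chat(
    model="gpt-oss:20b",
    messages=[
        {"role": "user", "content": "Summarize why local inference can be useful."},
    ],
)
print(response["message"]["content"])
```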

[–] glimse@lemmy.world 2 points 21 hours ago (1 children)

What card are you running it on?

[–] jeena@piefed.jeena.net 3 points 20 hours ago (1 children)

Nvidia RTX 3060, the 12 GB VRAM one.

[–] glimse@lemmy.world 2 points 20 hours ago (1 children)

Not bad. Does it use all the VRAM?

[–] jeena@piefed.jeena.net 2 points 7 hours ago

I need to check with that model, but usually yes.
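An easy way to check how much VRAM is actually in use while the model is loaded is to query NVML. A minimal sketch, assuming the `nvidia-ml-py` package (imported as `pynvml`) and a single GPU at index 0:

```python
# Minimal sketch: report GPU memory usage via NVML while the model is loaded.
# Assumes the nvidia-ml-py package (pynvml) and a single GPU at index 0.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"used: {mem.used / 1024**3:.1f} GiB of {mem.total / 1024**3:.1f} GiB")
pynvml.nvmlShutdown()
```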

[–] nymnympseudonym@lemmy.world 6 points 19 hours ago

It's awesome, but that 128k context window is a throwback to Llama 3 days.

I bet the closed $ource model has like 2MB context

[–] thoon@feddit.nl 10 points 21 hours ago

Finally, their company name is starting to make SOME sense