I tried the 20b on my PC with Ollama and Open WebUI, and I have to say that for some of my use cases it performs similarly to the online version. I was impressed.
this post was submitted on 06 Aug 2025
What card are you running it on?
An Nvidia RTX 3060, the 12 GB VRAM one.
Not bad. Does it use all the VRAM?
I need to check with that model but usually yes.
It's awesome but that 128k context window is a throwback to Llama3 days
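Whether a long context actually fits next to the weights on a 12 GB card is largely a KV-cache question. A rough back-of-the-envelope sketch, using the standard KV-cache formula (all architecture numbers below are illustrative assumptions, not the actual gpt-oss-20b config):

```python
def kv_cache_bytes(layers, kv_heads, head_dim, context_len, bytes_per_elem=2):
    """Estimate KV-cache size: 2x (keys + values) per layer,
    fp16 = 2 bytes per element by default."""
    return 2 * layers * kv_heads * head_dim * context_len * bytes_per_elem

# Hypothetical config: 24 layers, 8 KV heads (GQA), head_dim 64, 128k tokens
gib = kv_cache_bytes(24, 8, 64, 128 * 1024) / 2**30
print(f"{gib:.1f} GiB")  # prints 6.0 GiB
```

Under those assumed numbers a full 128k context would eat about 6 GiB at fp16 on top of the model weights, which is why runtimes like Ollama default to much shorter contexts on consumer cards.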
I bet the closed $ource model has like 2MB context
Finally, their company name is starting to make SOME sense