
[–] horse_battery_staple@lemmy.world 0 points 5 months ago (1 children)

I use the 32B and the 671B side by side. The performance hit is around 20%, and I keep all my data local. I'm not conflating the two; self-hosting just works fine for me. Your use case is certainly your own, but I'd rather take the performance hit for the added data privacy.
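
For a rough idea, running the same prompt through both models via Ollama's local REST API looks something like this (the model tags and prompt are illustrative, not my exact setup):

```python
# Rough sketch: send the same prompt to two locally served Ollama models
# and compare wall-clock time. Model tags and prompt are illustrative.
import time
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def ask(model: str, prompt: str) -> tuple[str, float]:
    start = time.perf_counter()
    r = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=600,
    )
    r.raise_for_status()
    return r.json()["response"], time.perf_counter() - start

prompt = "Explain the difference between a goroutine and an OS thread."
for model in ("deepseek-r1:32b", "deepseek-r1:671b"):
    answer, seconds = ask(model, prompt)
    print(f"{model}: {seconds:.1f}s\n{answer[:200]}...\n")
```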

Also, it's nice to be able to set my own weights and further distill R1.
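
By distillation I mean the usual student/teacher setup: a smaller model is trained to match a larger model's softened output distribution. A toy sketch of that loss with stand-in linear layers (nothing like a full R1 pipeline):

```python
# Minimal knowledge-distillation sketch: a small "student" learns to match
# the softened outputs of a larger "teacher". Toy linear models stand in
# for the real LLMs; this is the core loss, not a full training pipeline.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Linear(128, 1000)   # stand-in for the large teacher model
student = nn.Linear(128, 1000)   # stand-in for the smaller student model
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-4)
T = 2.0                          # softmax temperature for soft targets

for step in range(100):
    x = torch.randn(32, 128)     # toy input batch
    with torch.no_grad():
        teacher_logits = teacher(x)
    student_logits = student(x)
    # KL divergence between softened teacher and student distributions
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```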

I have a local Python expert and a local Go expert; both have access to my local GitLab repository, and I've tied their respective Ollama keys to my VS Code IDE.
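
Outside the editor, those specialized models get queried like any other local Ollama model; a minimal sketch with the ollama Python client, where the model names are placeholders rather than my actual ones:

```python
# Hypothetical example: querying two specialized local models with the
# ollama Python client. "python-expert" and "go-expert" are placeholder
# names for whatever custom models you have built.
import ollama

def ask(model: str, prompt: str) -> str:
    response = ollama.chat(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response["message"]["content"]

print(ask("python-expert", "Refactor this loop into a generator expression."))
print(ask("go-expert", "When should I use a buffered channel?"))
```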

[–] brucethemoose@lemmy.world 2 points 5 months ago* (last edited 5 months ago) (1 children)

Depends for sure. I usually try the 32B first, but give really "hard" queries to some API model.

[–] horse_battery_staple@lemmy.world 2 points 5 months ago

With the distilled models I have, I've been able to build and troubleshoot pretty complicated apps in Golang and Python. However, these distilled models are very specialized and will not do things like write me a story about a duck made out of duct tape or properly summarize articles. There are absolutely limits to my workflow and setup. But I'm pretty happy with it.