this post was submitted on 15 Nov 2025
657 points (95.8% liked)

Microblog Memes

9898 readers
2251 users here now

A place to share screenshots of microblog posts, whether from Mastodon, Tumblr, ~~Twitter~~ X, KBin, Threads, or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion, or guerrilla marketing.
  4. Posters are encouraged to link to the toot, tweet, etc. in the description of posts.

founded 2 years ago
you are viewing a single comment's thread
[–] Lazycog@sopuli.xyz 72 points 1 month ago (4 children)

Where can I find open source friends? I'd like to compile them myself

[–] new_guy@lemmy.world 26 points 1 month ago

Calm down, Dr. Frankenstein.

[–] lena@gregtech.eu 22 points 1 month ago (2 children)
[–] SkunkWorkz@lemmy.world 7 points 1 month ago

And pull them too.

[–] boonhet@sopuli.xyz 4 points 1 month ago

I've forked a person. The fork is certainly much more agreeable than the original, which I've since abandoned completely, but being the solo maintainer on this project is pretty rough.

[–] wreckedcarzz@lemmy.world 7 points 1 month ago (1 children)

I'm already available as an executable. There are a few known bugs, but they won't accept pull requests for patches on oneself; something about eternal life or something.

I mean uhhhhhh beep boop.

[–] Dojan@pawb.social 7 points 1 month ago (1 children)

Ah neat, so I can just curl you?

[–] wreckedcarzz@lemmy.world 7 points 1 month ago
[–] sp3ctr4l@lemmy.dbzer0.com 4 points 1 month ago* (last edited 1 month ago)

~~OpenLlama~~. Alpaca.

Run a local friend model, today!

I... actually futzed around with this just to see if it would work, and... yeah, there are models that will run on a Steam Deck with Bazzite.

EDIT: Got my angry, spitting, long-necked quadrupeds mixed up.

Alpaca is a Flatpak; it literally could not be easier to set up a local LLM.
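
For anyone who wants to try the same thing, here's a minimal sketch of the setup from a terminal, assuming Alpaca's Flathub app ID is com.jeffser.Alpaca (the rest is stock flatpak usage, nothing Bazzite-specific):

```sh
# Add the Flathub remote if it isn't configured already
# (Bazzite generally ships with Flathub enabled out of the box)
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo

# Install Alpaca (assumed Flathub app ID: com.jeffser.Alpaca)
flatpak install -y flathub com.jeffser.Alpaca

# Launch it; models are downloaded from inside the GUI
flatpak run com.jeffser.Alpaca
```

On the Deck this would be run from Desktop Mode; with 16 GB of shared memory, smaller quantized models are the realistic option.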