This post was submitted on 15 Nov 2025
650 points (95.9% liked)

Microblog Memes

A place to share screenshots of microblog posts, whether from Mastodon, Tumblr, ~~Twitter~~ X, KBin, Threads, or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion, or guerrilla marketing.
  4. Posters are encouraged to link to the original toot, tweet, etc. in the description of their posts.

[–] Lazycog@sopuli.xyz 71 points 4 days ago (4 children)

Where can I find open-source friends? I'd like to compile them myself.

[–] lena@gregtech.eu 22 points 4 days ago (2 children)

You can fork them too 😏

[–] boonhet@sopuli.xyz 4 points 3 days ago

I've forked a person. The fork is certainly much more agreeable than the original, which I've since abandoned completely, but being the solo maintainer on this project is pretty rough.

[–] SkunkWorkz@lemmy.world 7 points 3 days ago

And pull them too.

[–] new_guy@lemmy.world 26 points 4 days ago

Calm down, Dr. Frankenstein.

[–] wreckedcarzz@lemmy.world 7 points 4 days ago (1 children)

I'm already available as an executable, with a few known bugs, but they won't accept pull requests for patches on oneself; something about eternal life or something.

I mean uhhhhhh beep boop.

[–] Dojan@pawb.social 7 points 4 days ago (1 children)

Ah neat, so I can just curl you?

[–] wreckedcarzz@lemmy.world 7 points 4 days ago

πŸ₯ΊπŸ‘‰πŸ‘ˆ

[–] sp3ctr4l@lemmy.dbzer0.com 4 points 4 days ago* (last edited 4 days ago)

~~OpenLlama~~. Alpaca.

Run a local friend model, today!

I... actually futzed around with this just to see if it would work, and... yeah, there are models that will run on a Steam Deck with Bazzite.

EDIT: Got my angry, spitting, long-necked quadrupeds mixed up.

Alpaca is a Flatpak; it literally could not be easier to set up a local LLM.
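
For anyone curious how low the barrier actually is, here's a minimal sketch of the setup. The Flathub app ID `com.jeffser.Alpaca` is my assumption for Jeffser's Alpaca client, so verify it on flathub.org; Bazzite ships with Flatpak and Flathub enabled out of the box.

```sh
# Install Alpaca from Flathub (app ID assumed; check flathub.org if it differs)
flatpak install flathub com.jeffser.Alpaca

# Launch it; models are downloaded and managed from inside the app's UI
flatpak run com.jeffser.Alpaca
```

From there it's all point-and-click: pick a small model in the UI and it downloads and runs locally.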