this post was submitted on 20 May 2025
1761 points (98.2% liked)

Microblog Memes

7667 readers

A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion or guerrilla marketing.
  4. Posters are encouraged to link to the toot or tweet, etc., in the description of posts.

founded 2 years ago
[–] cm0002@lemmy.world 23 points 2 days ago (8 children)

I wouldn't mind a decent LOCAL open source AI helping

[–] thatKamGuy@sh.itjust.works 7 points 2 days ago (2 children)

DeepSeek’s model is open-source and can be run locally, though if I remember correctly, some bits related to its training data have been kept obscured, likely due to the dubious nature of how it was acquired.

[–] brucethemoose@lemmy.world 2 points 1 day ago* (last edited 1 day ago)

> some bits related to its training data

AKA ANY details about its training data, and its training hyperparameters, and literally any other details about its training. An 'open' secret among LLM tinkerers is that the Chinese companies seem to have particularly strong English/Chinese training data (not so much other languages though), and I'll give you one guess on how.

DeepSeek is unusual in that they are open-sourcing the general techniques they used and even some (though not all) of the software frameworks they use.

Don't get me wrong, I think any level of openness should be encouraged (unlike OpenAI being as closed as physically possible), but they are still very closed. Unlike, say, IBM Granite models which should be reproducible.
