this post was submitted on 16 May 2025
175 points (97.3% liked)

LocalLLaMA


from 10b0t0mized: I miss the days when I had to go through a humiliation ritual before getting my questions answered.

Nowadays you can just ask your questions to an infinitely patient entity. AI is really terrible.

[–] pennomi@lemmy.world 41 points 1 month ago (2 children)

The fast drop yes, but really it’s been in decline for around a decade before that.

[–] MrZee@lemm.ee 12 points 1 month ago (2 children)

Interesting! When I first read your comment, I looked at the chart and thought “it looks to me like the drop starts at the end of 2022. Isn’t that before LLMs started being used broadly?”

Nope. Looks like ChatGPT was released in November 2022. It doesn't feel like it's been around that long, but I guess it has.

[–] MudMan@fedia.io 6 points 1 month ago

The drop starts in 2013, but people were certainly ready to all bail at once by the time LLMs came around.

[–] Vince@lemmy.world 11 points 1 month ago (1 children)

That sucks. Is there an alternative people are using? Seems like it would still be a useful knowledge base to have.

[–] HellieSkellie@lemmy.dbzer0.com 3 points 1 month ago (1 children)

The common alternative is to just ask ChatGPT your software questions, get false information from the AI, and then try to push that horrible code to production anyway, if my past two jobs are any indicator.

Stack Overflow is still useful for finding old answers, but fucking sucks for asking new questions. If you aren't getting an AI answer to your question, you're getting your question deleted for some made-up reason.

The real answer that everyone hates is: if you have a question about something, read the documentation and experiment until you figure it out. If the documentation seems wrong, submit an issue report to the devs (usually on GitHub) and see what they say.

The secondary answer is that almost every FOSS project has a Slack channel, or sometimes even a Discord server. Go there and ask the people who use or make whatever tool you need help with.

[–] atzanteol@sh.itjust.works 1 points 1 month ago (1 children)

The common alternative is to just ask ChatGPT your software questions, get false information from the AI, and then try to push that horrible code to production anyway, if my past two jobs are any indicator.

If you have developers pushing bad and broken code to production your problem isn't AI.