this post was submitted on 10 Jan 2026
418 points (98.4% liked)
You can't rely on LLMs for accurate answers to technical questions, but they can help you avoid a huge amount of wasted time and effort: the back-and-forth, the going in circles, the talking around or past the issue that you see in threads all over these kinds of expert niche communities. Besides, maybe my question has already been answered.
Sometimes I don't know the specific terms or framing, I'm missing context, or I'm trying to get from A to C without even knowing that B exists, never mind how (or whom) to ask about it. If I can accelerate the process of clearing that up, I can go to the right human expert or community with a much better handle on what I'm actually looking for and how to ask for it.
Thank you, but I disagree. You can't know whether the LLM's result includes all the required context, and you won't go back to clarify it, because the output already omits what's relevant; in the end you miss the knowledge and waste the time too.
How can you be sure the output includes what's relevant? Will you ever re-submit the question to an algorithm without even knowing a re-submission is needed, when there's no indication that it is? The LLM may simply not have included what you needed, may have left out the important context surrounding it, and won't even point you to the authors you could question further: no attribution, no accountability, no sense, sorry.
I'm not sure we disagree. I agree that LLMs are not a good source for raw knowledge, and it's definitely foolish to use them as if they're some sort of oracle. I already mentioned that they are not good at providing answers, especially in a technical context.
What they are good at is gathering sources and recontextualizing your queries based on those sources, so that you can pose your query to human experts in a way that will make more sense to them.
You're of course well within your rights to avoid the tech entirely, as it comes with many pitfalls. Many of these models are damn good at gathering info from real human sources, though, if you can be concise with your prompts and resist the temptation to swallow their "analysis".
You mention wasted time and effort, going in circles, and talking past and around issues and questions. I think a lot of people underestimate that this is exactly why people go to AI in the first place: asking for help can be genuinely unbearable sometimes.
Exactly. Just look at the drop-off on Stack Overflow recently.