[–] ScrambledEggs@lazysoci.al 15 points 4 days ago (2 children)

I wish I learned how to hack

[–] frongt@lemmy.zip 16 points 4 days ago (1 children)
[–] Brunbrun6766@lemmy.world 16 points 4 days ago (1 children)
[–] Truscape@lemmy.blahaj.zone 15 points 4 days ago

If there's any lesson we should have been taught by this timeline, it's that even the moronic can spark large changes with enough spite.

(Although the more you learn, the less wasted spite you need XD)

[–] vatlark@lemmy.world 6 points 4 days ago* (last edited 4 days ago) (2 children)

The online communities are typically great. If you get really stuck, LLMs can be nice for dealing with your specific confusion.

Edit: ... but it's better to ask the community so others can benefit from the answer.

[–] artwork@lemmy.world 10 points 3 days ago* (last edited 3 days ago) (1 children)

Please no. Absolutely not. LLMs are not "nice for dealing with confusion"; they're the very opposite.
Please consider people's effort, their articles and attributions, and the value of actually learning and organizing your knowledge. Please train your mind and your self-confidence.

[–] some_kind_of_guy@lemmy.world 3 points 2 days ago (2 children)

You can't rely on LLMs for actual answers to technical things, but they can help you avoid a huge amount of the wasted time and effort, back-and-forth, going in circles, and talking around or past the issue that you see in threads all over these kinds of expert niche communities. Besides, maybe my question has already been answered.

Sometimes I don't know the specific terms or framing, I'm missing context, or I'm trying to get from A to C with no idea that B even exists, never mind how (or whom) to ask about it. If I can accelerate the process of clearing that up, I can go to the right human expert or community with a much better handle on what I'm actually looking for and how to ask for it.

[–] artwork@lemmy.world 1 points 22 hours ago* (last edited 22 hours ago) (1 children)

Thank you, but I do disagree. You can't know whether the LLM's output actually includes all the required context, and you won't go back to re-clarify it, because the output already leaves out what's relevant; in the end you miss out on the knowledge and waste the time, too.

How can you be sure the output includes what's relevant? Will you ever re-submit the question to the algorithm, without even knowing that re-submitting is required, since there's no indication of it? The LLM just didn't include what you needed, didn't include the important context around it, and didn't even point you to the authors you could question further: no attribution, no accountability, no sense, sorry.

[–] some_kind_of_guy@lemmy.world 0 points 13 hours ago

I'm not sure we disagree. I agree that LLMs are not a good source for raw knowledge, and it's definitely foolish to use them as if they're some sort of oracle. I already mentioned that they are not good at providing answers, especially in a technical context.

What they are good at is gathering sources and recontextualizing your queries based on those sources, so that you can pose your query to human experts in a way that will make more sense to them.

You're of course entirely within your rights to avoid the tech, as it comes with many pitfalls. Many of these models are damn good at gathering info from real human sources, though, if you can be concise with your prompts and resist the temptation to swallow their "analysis".

[–] sureshot0@discuss.online 2 points 2 days ago (1 children)

You mention wasted time and effort, going in circles, and talking past and around issues/questions. I think a lot of people underestimate that this is exactly why people go to AI in the first place: asking for help can be genuinely unbearable sometimes.

[–] some_kind_of_guy@lemmy.world 0 points 2 days ago

Exactly. Just look at the drop-off on Stack Overflow recently.

[–] GreenKnight23@lemmy.world 2 points 2 days ago (1 children)

that's hilarious. I remember the scene back in the day was more like, "if you don't know, get fucked because nobody is going to be responsible for your incompetent bullshit."

oh how times have changed.

[–] Cruel@programming.dev 2 points 2 days ago

There's still lots of snobbish gatekeeping; it just exists at a higher level. Entry-level knowledge is abundant, but once you seek out a community with more specialized expertise, the IRC channels will be private and password-protected, and you'd better have contributed to a novel exploit or something...