this post was submitted on 08 Mar 2026
651 points (95.4% liked)

Off My Chest


I’ve been working with so many students who turn to AI as a first resort for everything. The second a problem stumps them, it’s AI. The first source for research is AI.

It’s not even about the tech; there’s just something about not wanting to learn that deeply upsets me. It’s not something I can really understand. There is no reason to avoid getting better at writing.

[–] wpb@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

My main point there is that when evaluating the impact of some tool, I look at how it is used rather than how it could be used. Arguments of the form "if people were to use it like this or that..." are not so interesting to me. What I care about is the actual impact of a thing, and for that, the only thing that matters is how people actually use it.

Now, a separate thing is my assessment of how people actually use generative AI, and whether I consider the things they do with it a boon for society. I see:

  • students and juniors, but also experienced workers, deskilling at an alarming rate
  • CEOs using it as a pretext for massive layoffs
  • a dead internet that has become a minefield of disinformation (yes, it already was, but now even more so)
  • a wash of uninspired art and blogs
  • the software crisis deepening. An estimated 80% of software goes unused, which is a huge waste of potential and resources. This worsens now that we can crank out buggy, half-formed ideas that no one asked for at a much higher rate, except now we also burn the equivalent of a rainforest to do it

I don't like these actual things that people are actually using gen AI for. Maybe you see LLMs having different effects and reach a different, more positive assessment. But you cannot separate the assessment of a tool from its users and how they use it, because they are exactly the ones who will be using it, and they will use it the way they actually do.