[–] eugenevdebs@lemmy.dbzer0.com 1 points 4 days ago (8 children)

My hot take on students graduating college using AI is this: if a subject can be passed using ChatGPT, then it's a trash subject. If a whole course can be passed using ChatGPT, then it's a trash course.

It's not that difficult to put together a course that can't be completed using AI. All you need is to give a sh!t about the subject you're teaching. What if, instead of assignments, the teacher had everyone sit down in a room at the end of the semester and write the essay on the spot, based on what they'd learned? No phones, no internet, just paper, pencil, and you. Those relying on ChatGPT would never pass that course.

As damaging as AI can be, I think it also exposes a lot of systemic issues with education. Students who feel the need to complete assignments using AI could be doing so for a number of reasons:

  • students feel like the task is pointless busywork, in which case a) they are correct, or b) the teacher did not properly explain the task's benefit to them.

  • students just aren't interested in learning, either because a) the subject is pointless filler (I've been there before), or b) the course is badly designed, to the point where even a rote algorithm can complete it, or c) said students shouldn't be in college in the first place.

Higher education should be a place of learning for those who want to further their knowledge, profession, and so on. However, right now college is treated as a mandatory rite of passage into the world of work for most people. It doesn't matter how meaningless the course or how little you've actually learned; for many people, having a degree is absolutely necessary to find a job. I think that's bullcrap.

If you don't want students graduating with ChatGPT, then design your courses properly, cut the filler from the curriculum, and make sure that only those who are actually interested in what is being taught are enrolled.

[–] BigPotato@lemmy.world 7 points 4 days ago (1 children)

Your 'design courses properly' loses all steam when you realize there has to be an intro-level course for everything. Show me math that a computer can't do but a human can. Show me a famous poem that doesn't have pages of literary criticism written about it. "Oh, if your course involves Shakespeare, it's obviously trash."

The "AI" is trained on human writing, of course it can find a C average answer to a question about a degree. A fucking degree doesn't need to be based on cutting edge research - you need a standard to grade something on anyway. You don't know things until you learn them and not everyone learns the same things at the same time. Of course an AI trained on all written works within... the Internet is going to be able to pass an intro level course. Or do we just start students with a capstone in theoretical physics?

[–] jmf@lemm.ee 2 points 4 days ago (1 children)

AI is not going to change these courses at all. These intro courses have always had all the answers all over the internet, long before AI showed up; at least at my university they did. If students want to cheat themselves out of those classes, they could before AI and will continue to do so after. There will always be students willing to use those easier intro courses to better themselves.

[–] eugenevdebs@lemmy.dbzer0.com 0 points 4 days ago

> These intro courses have always had all the answers all over the internet, long before AI showed up; at least at my university they did.

I took a political science class in 2018 that had questions the professor wrote in 2010.

And he often assigned the questions to be answered before we'd covered the material in class. So sometimes I'd go, "What the fuck is he referencing? This wasn't covered. It's not in my notes."

And then I'd just look up the question, and someone had already posted the answers back in 2014.
