This post was submitted on 05 Jun 2025
115 points (99.1% liked)

Programming Circlejerk


Community to talk about enlightened programming takes

[–] Darkard@lemmy.world 89 points 1 week ago (4 children)

One of the worst things about coding is having to pick apart someone else's broken code.

So why the fuck would I want to accelerate my work to THAT point?

You know why designers and PMs like AI code? Because they don't know what the fuck they are doing, they don't have to try and stitch that junk into 15 years of legacy code, and they don't have to debug that shit.

"Actually Darkard, I ran this request into GPT and it came back with this? It's only short and most of it has already been done here, so I think your story point estimate is wrong?"

Fuuuuuck oooooooffffffff

[–] okwhateverdude@lemmy.world 30 points 1 week ago (1 children)

It's okay, bro. Let it out.

[–] Darkard@lemmy.world 25 points 1 week ago

I just need Gareth to stop telling me that he knows how to code when he thinks a git push is what you do when you want grandpa's inheritance early.

[–] Zachariah@lemmy.world 9 points 1 week ago

I’d much rather pick apart my own broken code.

[–] FizzyOrange@programming.dev 8 points 1 week ago* (last edited 1 week ago) (2 children)

is having to pick apart someone else's broken code.

I agree, but also I do find that AI's broken code is generally waaay less annoying to pick apart than my colleagues' code. I'm not sure exactly why. Probably partly because it's better at commenting code and naming variables so it's easier to follow?

I think also partly it's because reviewing other people's code is usually done during code review, where you can't just directly edit the code to fix it - you have to start a conversation convincing them to do it differently. That's quite annoying and doesn't happen with AI generated code.

[–] sturger@sh.itjust.works 11 points 1 week ago (2 children)

I still don't understand why we're using humans to review AI code. Shouldn't AIs be reviewing the code?

We're letting AIs do the fun part (coding) and forcing humans to do the worst part: reviewing reams of even jankier code.

[–] greenskye@lemm.ee 7 points 1 week ago

AIs get the fun part of everything right now. AI gets writing, humans get editing. AI gets drawing, humans get fixing hands and details, etc., etc.

[–] FizzyOrange@programming.dev 4 points 1 week ago (1 children)

AIs aren't smart enough yet. But plenty of people are also using them to review code.

[–] Beacon@fedia.io 1 points 1 week ago (2 children)

What are the results if you take code written by one brand of AI and then have another brand of AI review it? Like use ChatGPT to write code, and then ask Copilot if the generated code has any errors and will work as intended?

[–] NoForwardslashS@sopuli.xyz 4 points 1 week ago

I don't know, if I put my hand in a fan and then put that mutilated hand into another brand of fan, do you think that might fix it?

[–] FizzyOrange@programming.dev 2 points 1 week ago

Interesting idea, I've never tried that. I feel like it wouldn't be a silver bullet but you might get slightly better results I guess.

[–] Beacon@fedia.io 1 points 1 week ago (2 children)

I'm not a programmer, so I don't know if this makes sense, but I wonder if it's easier to retool AI code because AI code is janky in a similar-ish way most of the time, while human code is janky in different ways all the time? Whadda ya think?

[–] Windex007@lemmy.world 5 points 1 week ago

I disagree with the premise.

AI is good at making things that LOOK right. Pictures. Words. Whatever. Actually makes errors harder to find IMO.

[–] FizzyOrange@programming.dev 2 points 1 week ago

Yeah definitely could be. I also think when AI gets things wrong it gets it so obviously wrong you have to delete it and do it yourself (and not worry about offending someone). It rarely seems to make the same kinds of trivial mistakes humans do (like copy/paste errors for example). It either does a pretty decent job that's easy to fix up, or it totally fails and you do it yourself.

[–] pinball_wizard@lemmy.zip 4 points 1 week ago* (last edited 1 week ago)

You know why designers and PMs like AI code? Because they don't know what the fuck they are doing,

I just want to highlight this for any designers or PMs reading along.

In the same breath, I want to invite my designer colleagues to try out this amazing designing script I wrote. It'll save them a ton of time, I bet. (This is sarcasm.)

I actually respect the difficulty of designers' jobs.

Even while many of them don't respect the difficulty of mine.

Oh well. I'll get paid either way, in the end, because this shit all breaks when it's done wrong.

[–] rayquetzalcoatl@lemmy.world 37 points 1 week ago (3 children)

My bosses keep trying to get me to do this. They don't understand that the work they give me is complicated and niche. The chatbots do not help at all, ever. It's shit.

I don't understand why these numbskulls pay me if they think my job is so easy that a fucking chatbot can do it.

[–] Jimmycakes@lemmy.world 15 points 1 week ago

They need to see if it can be done before they fire you. But if you're not trying, they can't start seeing. So you need to at least try, so they can see.

[–] ech@lemm.ee 6 points 1 week ago* (last edited 1 week ago)

I don't understand why these numbskulls pay me if they think my job is so easy that a fucking chatbot can do it.

Call me paranoid, but it feels like having professionals consistently brute force the output from these systems into something usable would provide a rich resource to improve them to the point that they can.

[–] mojofrododojo@lemmy.world 6 points 1 week ago

They want you to model your work processes so the AI can duplicate you.

[–] some_guy@lemmy.sdf.org 13 points 1 week ago

Psycho manager.

[–] jet@hackertalks.com 11 points 1 week ago

Because some people think it's all about the prompts. If you get all of your highly talented, highly paid, irreplaceable people to write their jobs out as prompts so that the AI can do it, you can fire them and keep the prompts, then just have an entry-level person use those same prompts to do the same work...

Of course, that's the dream; reality is one hallucination away.

[–] turbowafflz@lemmy.world 10 points 1 week ago

I tried to get ChatGPT to help me write fvwmscript a week or so ago, and literally all of the output it gave was hallucination; none of it was valid or even made sense in context. It vaguely looked like fvwmscript, but that was the only similarity it had to real code.

[–] andybytes@programming.dev 8 points 1 week ago

I used AI today for a short period, then gave it up because it kept giving me stupid ideas and I just did it myself. I have yet to be impressed by literally anything other than the sinister abilities of its uses as a surveillance tool, a tool in the US nuclear program, and the dragnet. Even Palantir, the little monster Peter Thiel created, is inaccurate and costly. His little tool scrapes Twitter and uses OSINT accounts, and it ended up killing a bunch of civilians. The person who owned the OSINT account donated money to charity because they felt guilty. So coming from just a business perspective, from fiscal realities, it is clear that most of this AI bullshit is just a bunch of buzzwords and nonsense: a way for big business to lower the bar, lower standards, crush small business, and hollow out America. AI has yet to actually show me something that I find valuable. The bubble is gonna pop.

[–] vane@lemmy.world 7 points 1 week ago (1 children)

I hope one day he orders a sandwich from a machine-generated app, and the machine puts allergens in his sandwich and he dies, so we fucking understand what the consequences of this fuckery with technology are.

[–] Cort@lemmy.world 1 points 1 week ago

AI: But sir, you said no peanuts and tree nuts. This is just the peanuts. Have a good day!

[–] Lembot_0003@lemmy.zip 6 points 1 week ago (1 children)

What is that "Warp"? I want to know to be sure not to use their software...

[–] b_van_b@programming.dev 5 points 1 week ago (2 children)
[–] Lembot_0003@lemmy.zip 4 points 1 week ago (1 children)

Hm, actually not the worst idea, but no way I would enter some AI-gibberish in the console.

[–] turbowafflz@lemmy.world 9 points 1 week ago

I don't understand why it's closed source. Who would ever use a closed-source terminal when there's a nearly infinite number of open ones?

Neat idea. Idiot CEO.