this post was submitted on 29 Oct 2025
1646 points (99.7% liked)

Programmer Humor


Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code there's also Programming Horror.


Apparently a page from an internal IBM training manual. Some further attempts at sourcing it:

(page 2) 50 comments
[–] borth@sh.itjust.works 5 points 1 week ago (1 children)

But a computer works for "free" so "not being held accountable" is even better!!

[–] criss_cross@lemmy.world 5 points 1 week ago

Sorry can’t hear you as AI brrs over hiring applications and performance reviews

[–] Agent641@lemmy.world 5 points 1 week ago (2 children)

"No networked computers!" Colonial fleet high command standing orders

[–] Ceruleum@lemmy.wtf 2 points 1 week ago

Cylons hate this little trick.

One of many reasons why I love BSG. As a retro-computing enthusiast, the idea that antique systems are naturally impervious to conventional digital attacks just felt so validating.

Sure, our navigation system is based on a Commodore 64, but good luck getting it to divulge mission-critical information over Bluetooth. Or any information, for that matter.

[–] MalReynolds@piefed.social 4 points 1 week ago (1 children)

Again, weapons without a human in the loop need to be against the Geneva Convention, yesterday. Or the articles of war, something. This is a tractable problem that needs attention now. It will not end well otherwise, and it can actually be (mostly, by honorable armies) fixed.

[–] ulterno@programming.dev 3 points 1 week ago* (last edited 1 week ago)

The Geneva Convention can only be applied to nations that are, coincidentally, not going around breaking it willy-nilly.

[–] onnekas@sopuli.xyz 4 points 1 week ago* (last edited 1 week ago) (2 children)

I generally agree.

Imagine, however, that a machine makes objectively better decisions than any person. Should we then still trust the human's decision just to have someone who is accountable?

What is the worth of having someone who is accountable, anyway? Isn't accountability just an incentive for humans not to fuck things up? It's also nice for pointing fingers if things go bad - but is there actually any value in that?

Additionally: there is always a person who either made the machine or deployed the machine. IMO the people who deploy a machine and decide that this machine will now be making decisions should be accountable for those actions.

[–] petrol_sniff_king@lemmy.blahaj.zone 2 points 1 week ago (2 children)

Imagine however, that a machine objectively makes the better decisions than any person.

You can't know if a decision is good or bad without a person to evaluate it. The situation you're describing isn't possible.

the people who deploy a machine [...] should be accountable for those actions.

How is this meaningfully different from just having them make the decisions in the first place? Are they too stupid?

[–] x00z@lemmy.world 3 points 1 week ago

Well, I might get disliked for this opinion, but in some cases it's perfectly fine for a computer to make a management decision. However, this should also mean that the person in charge of said computer, or the one putting the computer's decision into actual action, is the one who gets held responsible.

There's also the question of how responsible it is to even consider the management decisions of a computer in a specific field. What I'm saying is that there's no black and white answer here.

[–] Appoxo@lemmy.dbzer0.com 2 points 1 week ago (4 children)

You are essentially saying
"Management is essential, replace the common work force with AI"

Well...If I get fired, I will hold you accountable!

[–] GreenKnight23@lemmy.world 1 points 1 week ago

your logic is flawed.

employees can be held accountable for their actions.

[–] ThatGuy46475@lemmy.world 2 points 1 week ago

The director's not going to like this

[–] csm10495@sh.itjust.works 2 points 1 week ago

I've thought about this wrt AI and work. Every time I sit in a post mortem it's about human errors and process fixes.

The day a post mortem ends with "well, the AI did it, so nothing we can do" is the day I look towards... with dread.

[–] BlameTheAntifa@lemmy.world 1 points 1 week ago

I feel this way about things like companies, too. It must always be human beings that bear the personal responsibility for an organization’s crimes, not “the company” alone. When money can pay in lieu of personal responsibility, then there is no justice or accountability.

[–] the_q@lemmy.zip 1 points 1 week ago

Until Skynet happens.

[–] Credibly_Human@lemmy.world 1 points 1 week ago (10 children)

I don't think this is wise at all.

It's just people putting into words their wish to punish and apportion blame, placed above their wish to be pragmatic.

If software is better at something, there is no reason to be mad at that software.

More than that, the idea that the software vendor could not be held liable is farcical. Of course they could be, or the company running said software. In fact, they'd probably get more shit than managers, who regularly get away with ridiculous things.

I mean wage theft is the biggest form of theft for a reason, and none of the wage thieves are machines (or at least most aren't).
