this post was submitted on 13 Feb 2026
1166 points (95.3% liked)

Programmer Humor


Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code, there's also Programming Horror.


founded 2 years ago
[–] psud@aussie.zone 7 points 4 days ago

I hope this is tested in court and found to be correct

[–] LeFantome@programming.dev 1 points 6 days ago (1 children)

So, if I wrote an AI preface to somebody else’s book, they lose their copyright?

Seems very unlikely. Can you cite any case law for this?

[–] Hack3900@lemy.lol 2 points 4 days ago

I think it would depend on whether there's a way to differentiate which parts are AI generated or not (the preface could be public domain but not the rest of the book)

[–] HappyFrog@lemmy.blahaj.zone 306 points 1 week ago (44 children)

As much as I wish this was true, I don't really think it is.

[–] lung@lemmy.world 223 points 1 week ago (2 children)

It's just unsettled law, and the link is basically an opinion piece. But guess who wins major legal battles like this - yep, the big corps. There's only one way this is going to go for AI generated code

[–] psud@aussie.zone 2 points 4 days ago (1 children)

One would hope it could be first tested against a small time company that can't afford a good lawyer

[–] joker54@lemmy.dbzer0.com 1 points 3 days ago

Sadly, any lawsuit that opposes AI will face an army of lawyers defending the AI company.

Precedent affects all, and big companies know this.

[–] Droechai@piefed.blahaj.zone 27 points 1 week ago (1 children)

Worst case is that it's the owner of the agent that receives the copyright, so all vibe coded stuff outside local AI will be claimed by the big corpos

[–] Grail@multiverse.soulism.net 20 points 1 week ago (4 children)

I actually think that's the best case because it would kill enterprise adoption of AI overnight. All the corps with in-house AI keep using and pushing it, but every small to medium business that isn't running AI locally will throw it out like yesterday's trash. OpenAI's stock price will soar and then plummet.

[–] Grimy@lemmy.world 27 points 1 week ago (1 children)

The big AI companies would just come out with a business subscription that explicitly gives you copyright.

[–] PlzGivHugs@sh.itjust.works 39 points 1 week ago (2 children)

It is true that AI work (and anything derived from it that isn't significantly transformative) is public domain. That said, the copyright of code that is a mix of AI and human is much more legally grey.

In other work, where the elements can be more easily separated, individual elements may have different copyright. For example, a comic was made using AI-generated images. It was ruled that all the images were thus public domain. Despite that, the text and layout of the comic were human-made, so the copyright to those was owned by the author. Code obviously can't be so easily divided up, and it will be much harder to define what is transformative or not. As such, it's a legal grey area that will probably be decided on a case-by-case basis.

[–] ssfckdt@lemmy.blahaj.zone 19 points 1 week ago

Yeah, it's like products that include FOSS in them: they only have to release the FOSS stuff, not their proprietary code. (It was kind of cute to find the whole GNU license buried in the menus of my old TiVo...)

[–] ZILtoid1991@lemmy.world 82 points 1 week ago (3 children)

I think, to punish Micro$lop for its collaboration with fascists and its monopolistic behavior, the whole Windows codebase should be made public domain.

[–] whyNotSquirrel@sh.itjust.works 40 points 1 week ago (3 children)

does the public really want more garbage than it already has?

[–] Skullgrid@lemmy.world 43 points 1 week ago

Do you not want all the hardware support Linux is missing to suddenly become available?

[–] MonkderVierte@lemmy.zip 18 points 1 week ago* (last edited 1 week ago) (1 children)

The kernel and NTFS seem decent from what I've heard. Or at least the kernel was (no guessing what they've vibecoded into it by now).

About NTFS: it was actually pretty good for its time (the 90s), but the tooling makes no use of some of its better features and abuses some others close to breaking point. Literally pearls before swine.

[–] fubarx@lemmy.world 76 points 1 week ago (2 children)

This whole post has a strong 'Sovereign Citizen' vibe.

[–] brianary@lemmy.zip 19 points 1 week ago (1 children)

The Windows FOSS part, sure, but unenforceable copyright seems quite possible, though probably not court-tested. I mean, AI basically ignored copyright to train in the first place, and there is precedent for animals not getting copyright for taking pictures.

[–] CanadaPlus@lemmy.sdf.org 16 points 1 week ago* (last edited 1 week ago) (1 children)

If it's not court tested, I'm guessing we can assume a legal theory that breaks all software licensing will not hold up.

Like, maybe the code snippets that are themselves AI-made can be stolen, but not the rest of the project.

[–] GalacticSushi@lemmy.blahaj.zone 15 points 1 week ago

I do not give Facebook or any entities associated with Facebook permission to use my pictures, information, messages, or posts, both past and future.

[–] RagingRobot@lemmy.world 72 points 1 week ago (2 children)

That's not even remotely true....

[–] chaogomu@lemmy.world 49 points 1 week ago (5 children)

The law is very clear that non-human generated content cannot hold copyright.

That monkey that took a picture of itself is a famous example.

But yes, the OP is missing some context. If a human was involved, say in editing the code, then that edited code can be subject to copyright. The unedited code likely cannot.

Human written code cannot be stripped of copyright protection regardless of how much AI garbage you shove in.

Still, all of this is meaningless until a few court cases happen.

[–] iglou@programming.dev 48 points 1 week ago* (last edited 1 week ago) (17 children)

That sounds like complete bullshit to me. Even if the logic is sound, which I seriously doubt, if you use someone's code and you claim their license isn't valid because some part of the codebase is AI generated, I'm pretty sure you'll have to prove that. Good luck.

[–] biotin7@sopuli.xyz 40 points 1 week ago (13 children)

Anything built by AI/LLMs should be FOSS by law. Oh I dream of the day.

[–] Honytawk@feddit.nl 19 points 1 week ago

Your wish is granted.

But you can only view the source code through an LLM

a finger on the monkey paw curls

[–] meekah@discuss.tchncs.de 35 points 1 week ago (2 children)

Aren't you all forgetting the core meaning of open source? The source code is not openly accessible, thus it can't be FOSS or even OSS

This just means microslop can't enforce their licenses, making it legal to pirate that shit

[–] Fontasia@feddit.nl 32 points 1 week ago

Stallman: "Oh man, not like this."

[–] Michal@programming.dev 31 points 1 week ago (16 children)

Counterpoint: how do you even prove that any part of the code was AI generated?

Also, I made a script years ago that algorithmically generates Python code from user input. Is it now considered AI-generated too?

[–] mfed1122@discuss.tchncs.de 22 points 1 week ago

This reminds me of that scene in Breaking Bad where the two morons were talking about how, if you ask an undercover cop if they're a cop, they legally have to tell you the truth

[–] ricecake@sh.itjust.works 20 points 1 week ago

That's not what that research document says. Pretty early on it talks about rote mechanical processes with no human input. By the logic they employ there's no difference between LLM code and a photographer using Photoshop.

[–] Evil_Shrubbery@thelemmy.club 20 points 1 week ago (2 children)

By that same logic, LLMs themselves (by now some AI bro has surely vibe coded something in there) & their trained weights (built on stolen data anyway) should be public domain.

What revolutionary force can legislate and enforce this?? Pls!?

[–] akmur@lemmy.world 18 points 1 week ago (3 children)

How can you tell if it's AI generated? You can't.

[–] kokesh@lemmy.world 17 points 1 week ago (2 children)

As it should. All the idiots calling themselves programmers because they tell a crappy chatbot what to write, based on stolen knowledge. What warms my heart a little is the fact that I poisoned everything I ever wrote on StackOverflow just enough to screw with AI slopbots. I hope I contributed my grain of sand toward making this shit a little worse.

[–] phoenixz@lemmy.ca 17 points 1 week ago

So by that reasoning all Microsoft software is open source

Not that we'd want it, it's horrendously bad, but still

[–] Kazumara@discuss.tchncs.de 15 points 1 week ago (2 children)

How the hell did he arrive at the conclusion that there was some sort of one-drop rule for non-protected works?

Just because registration is blocked if you don't specify which part is the result of human creativity doesn't mean the copyright on the part that is the result of human creativity is forfeit.

Copyright exists even before registration; registration just makes it easier to enforce. And nobody says you can't just properly refile for registration of the part that is the result of human creativity.

[–] ChaoticNeutralCzech@feddit.org 14 points 1 week ago (1 children)

Windows is not even source-available. Windows XP is source-unintentionally-available thanks to a leak but there's no AI loophole in that.
