this post was submitted on 02 Apr 2026
30 points (91.7% liked)

I've heard it here at 2:08

https://www.youtube.com/watch?v=mBHRPeg8zPU&t=131

I believe opencode has a more established community and will probably incorporate the improvements from the other projects. What do you think?

[–] Kissaki@programming.dev 42 points 1 day ago (2 children)

A code leak doesn't grant a license to use the code or the product. Any project or product built on the leaked code stands on shakier legal ground than solid projects released under clear terms. OpenCode is not obsolete.

[–] misk@piefed.social 5 points 1 day ago* (last edited 1 day ago) (3 children)

What if license and copyright were washed by using an LLM to translate Claude into another language?

Either way, Claude can’t be copyrighted because it’s a product of an LLM.

[–] GamingChairModel@lemmy.world 3 points 15 hours ago* (last edited 15 hours ago)

What if license and copyright were washed by using an LLM to translate Claude into another language?

The law doesn't allow you to launder copyright like that. The result is just a derivative work, which the holder of the copyright in the original can restrict. As an example from fiction, distinctive characters are protected by copyright, and using an LLM to generate new works featuring those characters would still produce derivative works whose distribution the original copyright owner has the right to deny.

So if you have a copyrighted codebase and you try to implement that codebase using some kind of transformation of that code, that'd still be a derivative work and infringe the original copyright.

Now, if you do a clean-room implementation, where you can show the new code was written without copying the original, only reproducing its functionality from documentation or reverse engineering of how the code behaved, you escape the derivative-work label and can distribute it without the original copyright holder's permission. Compaq did this with the IBM BIOS to make unauthorized/unlicensed PC clones, and Google did it with the Java API to build Android without a license from Sun/Oracle, winning at the Supreme Court.

Claude can’t be copyrighted because it’s a product of an LLM.

No, because Claude's code is still created by humans with the assistance of non-human tools. There's a spectrum from spelling correction and tab completion in IDEs all the way to full vibe coding from a prompt describing the raw functionality (where the prompt is so uncreative that it isn't itself copyrightable). Anthropic has never claimed that there was no human in the loop, or that its prompts are so uncreative and purely functional that the outputs aren't copyrightable.

[–] Kissaki@programming.dev 6 points 22 hours ago (1 children)

Claude can’t be copyrighted because it’s a product of an LLM

You claim Claude itself was coded by an LLM (exclusively)?

[–] misk@piefed.social 8 points 20 hours ago* (last edited 20 hours ago)

Anthropic claims that. Sucks for them now, but boy did it do wonders for their marketing.

[–] litchralee@sh.itjust.works 8 points 1 day ago* (last edited 1 day ago) (1 children)

That is an opinion, but it certainly isn't settled law in any jurisdiction. Indeed, whether some, all, or none of an LLM's output is ever copyrightable, and under what terms, is the billion-dollar question.

A project that incorporates code with a shaky legal foundation will find it tough to convince others to contribute, if their contributions might one day prove to have been in vain. The right answer would be to extricate such code upon discovery, as OpenBSD had to do when the IPFilter license turned out to be incompatible with the project.

[–] misk@piefed.social 2 points 23 hours ago

It is Anthropic’s whole business model though.

[–] GreenBeanMachine@lemmy.world 4 points 1 day ago* (last edited 1 day ago) (3 children)

What if you get Claude to implement a brand new codebase, using the leaked code as inspiration and then fix any slop issues?

[–] TerrorBite@pawb.social 2 points 16 hours ago

This has already been done.

https://github.com/Kuberwastaken/claurst

They got one AI agent to read the code and output documentation (a specification) that contained no actual code.

Then they fed that to a second AI agent and asked it to implement that specification in Rust. Technically, this is a cleanroom implementation.
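The two-stage pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the claurst project's actual tooling; `call_llm` is a hypothetical stand-in for whatever model API the agents use.

```python
# Hypothetical two-agent "cleanroom" pipeline: agent A sees the original
# code but emits only a natural-language specification; agent B sees only
# that specification and writes fresh code from it.

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a real model API call."""
    raise NotImplementedError("wire up a model client here")

def stage_one_spec(source_code: str, call=call_llm) -> str:
    # Agent A: read the original code, output documentation only.
    return call(
        "Describe the behavior of this program as a specification. "
        "Output prose and interface descriptions, no code:\n" + source_code
    )

def stage_two_reimplement(spec: str, call=call_llm) -> str:
    # Agent B: never sees the original code, only the specification.
    return call(
        "Implement the following specification in Rust:\n" + spec
    )
```

Whether routing the original code through a model at stage one actually satisfies the legal requirements of a clean room is exactly the open question the thread is debating; the separation of the two agents is what the repo's authors are relying on.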

This is interesting from a legal perspective because it leverages the same legal loophole that Anthropic relies on for its own operations. They can't take down this repo without creating a precedent that could be used against them.

[–] 30p87@feddit.org 1 points 16 hours ago* (last edited 16 hours ago)

The original code is reportedly 100% slop. Purely AI-generated output cannot be copyrighted in the US, and probably not in any other half-sane country, so double slop is of course also unlicensable.

[–] plateee@piefed.social 3 points 1 day ago (1 children)

Depending on how you're taking "inspiration", I'm fairly certain you can get sued if Anthropic feels like it.

Clean room development exists for a reason (although, can you even use AI in such an effort?)

[–] GreenBeanMachine@lemmy.world 3 points 1 day ago* (last edited 1 day ago)

All slop code is stolen anyway. Besides, I'm not writing it; I just asked their own AI to look at that codebase and implement a brand-new app that does exactly what their app does. I never instructed the slop machine to duplicate or steal the code, and I would word the prompt so that responsibility for any copied code falls entirely on the AI. Would Anthropic sue themselves over their own AI spitting out copyrighted code? Apparently you don't have to be an engineer to develop apps anymore, so I don't need to review the slop code either: it's a black box, and as long as it does exactly what their app does, I'm not responsible. I didn't write it.