I mean it's kind of obvious... they're giving their LLMs simulators, access to tests, etc. (e.g. ChatGPT can run code in a Python environment and detect errors). But it obviously can't know what the intention is, so it's inevitably going to stop when it gets its first "working" result.
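To illustrate why that's a problem, here's a minimal sketch of that kind of feedback loop (purely hypothetical, not any vendor's actual implementation; `revise` just stands in for the model call): the only thing it ever checks is that the code doesn't throw, so the first result that runs counts as "working" whether or not it does what the user meant.

```python
def generate_until_working(prompt, revise, max_attempts=5):
    """Keep asking the model for code until a run doesn't crash.

    `revise(prompt, feedback)` is a hypothetical stand-in for the LLM call.
    """
    code = revise(prompt, None)
    for _ in range(max_attempts):
        try:
            exec(code, {})                    # run in a throwaway namespace
            return code                       # first non-crashing run wins;
                                              # whether the output matches the
                                              # user's intent is never checked
        except Exception as err:
            code = revise(prompt, str(err))   # feed the error back to the model
    return code
```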
Of course I'm sure further issues will come from incestuous code... i.e. AIs train on all publicly listed GitHub code.
Vibe coders start a lot of "projects" that they upload to GitHub, so new AIs can pick up all the mistakes of their predecessors on top of making new ones of their own.