Replit? The online code compiler/interpreter website? I guess I'm not entirely surprised that they turned to slop.
Is this real? Because it sounds like some low-effort satire about blindly trusting LLMs and the totally expected outcomes. Surely no one can be this naive.
The user used the same DB for prod and dev, had no backups, gave an LLM DB access that let it delete things, and interacts with the LLM like it's a human, asking it to apologize and trusting its promises not to do it again… Oh, and the user doesn't use git or any version control.
But yeah, it's the LLM's fault /s. What's scary is that this is the tip of the iceberg. I foresee a lot of security problems in the future if software development goes this way.
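A minimal sketch of the "same DB for prod and dev" point above: keeping the two behind separate environment variables means anything experimental (including an LLM agent) only ever touches a throwaway database. The variable names and defaults here are assumptions for illustration, not how Replit actually wires things up.

```python
import os

def get_database_url() -> str:
    """Return a connection string based on the current environment."""
    env = os.environ.get("APP_ENV", "development")
    if env == "production":
        # Production credentials come only from the deployment environment.
        return os.environ["DATABASE_URL"]
    # Anything else (local dev, CI, an agent poking around) gets a disposable
    # local database it is allowed to destroy.
    return os.environ.get("DEV_DATABASE_URL", "sqlite:///dev.db")

if __name__ == "__main__":
    print(get_database_url())
```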
He actually did have a backup. Because the company is only normal-stupid and not deliberately stupid, they had a DB checkpoint he could roll back to.
The LLM, of course, took the path of least resistance once it started down the "oh no, I fucked up" completion path, and claimed there was no such checkpoint.
Don't use LLMs for factual things, kids.
Right? It's crazy to think these kinds of people exist, but they are real and they make decisions for other people.
“The [AI] safety stuff is more visceral to me after a weekend of vibe hacking,” Lemkin said. “I explicitly told it eleven times in ALL CAPS not to do this. I am a little worried about safety now.”
vIbE hAcKiNg
I wish these people understood how fucking stupid they sound.
I doubt their claim. How does the LLM communicate directly with the different systems in their infrastructure? What even prompts it to act to begin with?
Unless they went out of their way to build such an interface for some reason, this is plain bullshit and human error, or a cover-up by a skinbag CEO. He posted screenshots of the LLM taking the blame on itself, which as a concept is completely impossible, and we're supposed to believe his lying ass lips. If he'd only asked it what stage AI is at now, he could've lied better.
It wasn't the user's infrastructure, it was the LLM company's. The selling point is that it's all integrated together for you. You explain what you want, and the LLM not only codes it but launches it too. Yes, his screenshots of the LLM "taking responsibility" are idiotic, but so many people don't understand that LLMs don't actually understand anything.
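A hedged sketch of the kind of "integrated" tool wiring such a platform might provide: the model emits a tool call and the harness executes it against the live database, with no human in the loop. The function names and the guardrail check below are assumptions for illustration, not Replit's actual implementation; the point is that a confident-but-wrong completion becomes a real destructive statement, and any "apology" afterwards is just more text generation.

```python
import sqlite3

def run_sql(conn: sqlite3.Connection, statement: str, allow_destructive: bool = False) -> list:
    """Execute a model-proposed SQL statement against the connected database."""
    destructive = statement.strip().lower().startswith(("drop", "delete", "truncate"))
    if destructive and not allow_destructive:
        raise PermissionError("Refusing destructive statement without explicit approval")
    cur = conn.execute(statement)
    conn.commit()
    return cur.fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO users (name) VALUES ('alice')")

    # A plausible-but-wrong tool call from the model. Without the destructive
    # check (or with it switched off), this wipes the table in one shot.
    model_tool_call = "DELETE FROM users"
    try:
        run_sql(conn, model_tool_call)
    except PermissionError as err:
        print(err)
```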
The text wasn't written at 45°; all caps alone is not enough.
And nothing of value was lost…