this post was submitted on 21 Jul 2025
665 points (98.5% liked)

Technology

287 readers

Share interesting Technology news and links.

Rules:

  1. No paywalled sites at all.
  2. News articles must be recent, no older than 2 weeks (14 days).
  3. No videos.
  4. Post only direct links.

To encourage more original sources and keep this space as commercial-free as possible, the following websites are blacklisted:

More sites will be added to the blacklist as needed.

Encouraged:

founded 2 months ago
[–] UnspecificGravity@lemmy.world 27 points 3 days ago

My favorite thing about all these AI front ends is that they ALL lie about what they can do. They will frequently deliver confidently wrong results and then act like it's your fault when you catch them in an error. Just like your shittiest employee.

[–] Jayjader@jlai.lu 14 points 3 days ago

I violated your explicit trust and instructions.

Is a wild thing to have a computer "tell" you. I still can't believe engineers anywhere in the world are letting these things anywhere near production systems.

The catastrophe is even worse than initially thought.

This is catastrophic beyond measure.

These just push this into some kind of absurd, satirical play.

[–] SkunkWorkz@lemmy.world 40 points 3 days ago (1 children)

lol. Why can an LLM modify production code freely? Bet they fired all of their sensible human developers who warned them about this.

[–] WhyJiffie@sh.itjust.works 7 points 3 days ago

looking at the company name they probably didn't have any, ever

[–] Allero@lemmy.today 18 points 3 days ago

But how could anyone on planet earth use it in production

You just did.

[–] rdri@lemmy.world 57 points 3 days ago (4 children)

I have a solution for this. Install a second AI that would control how the first one behaves. Surely it will guarantee nothing can go wrong.

[–] captain_aggravated@sh.itjust.works 6 points 3 days ago (2 children)

He's not just a regular moron. He's the product of the greatest minds of a generation working together with the express purpose of building the dumbest moron who ever lived. And you just put him in charge of the entire facility.

[–] LaunchesKayaks@lemmy.world 15 points 3 days ago (4 children)

Love the concept of an AI babysitter

[–] DickFiasco@sh.itjust.works 8 points 3 days ago

Neuromancer intensifies

[–] RichardDegenne@lemmy.zip 8 points 3 days ago

Congratulations! You have invented reasoning models!

[–] pyre@lemmy.world 26 points 3 days ago

"yeah we gave Torment Nexus full access and admin privileges, but i don't know where it went wrong"

[–] mycodesucks@lemmy.world 186 points 4 days ago (4 children)

See? They CAN replace junior developers.

[–] ech@lemmy.ca 206 points 4 days ago (3 children)

Hey dumbass (not OP), it didn't "lie" or "hide it". It doesn't have a mind, let alone the capability of choosing to mislead someone. Stop personifying this shit and maybe you won't trust it to manage crucial infrastructure like that and then suffer the entirely predictable consequences.

[–] SendMePhotos@lemmy.world 45 points 4 days ago (3 children)
[–] ech@lemmy.ca 80 points 4 days ago (29 children)

Both require intent, which these do not have.

[–] bitjunkie@lemmy.world 16 points 3 days ago (1 children)

Here's hoping that the C-suites who keep pushing this shit are about to start finding out the hard way.

It will be too late. Using AI code is taking on technical debt; by the time they figure it out, we will have two years of work just to dig ourselves out of the code clusterfuck that has been created. I am dealing with a codebase built by AI-coding juniors; it would be quicker to start from scratch, but that is an impossible sell to a manager.

[–] Masamune@lemmy.world 54 points 4 days ago (1 children)

I motion that we immediately install Replit AI on every server that tracks medical debt. And then cause it to panic.

[–] avg@lemmy.zip 13 points 3 days ago (2 children)

Just hire me, it's cheaper.

[–] LaunchesKayaks@lemmy.world 13 points 3 days ago

I'll panic for free if it gets rid of my medical debt

[–] asudox@lemmy.asudox.dev 60 points 4 days ago* (last edited 4 days ago) (2 children)

I love how the LLM just tells you that it has done something bad with no emotion, and then proceeds to give detailed information and steps on how it did it.

It feels like mockery.

[–] WolfLink@sh.itjust.works 28 points 4 days ago (1 children)

I wouldn’t even trust what it tells you it did, since that is based on what you asked it and what it thinks you expect

[–] Zron@lemmy.world 9 points 3 days ago (1 children)

It doesn’t think.

It has no awareness.

It has no way of forming memories.

It is autocorrect with enough processing power to make the NSA blush. It just guesses what the next word in a sentence should be. Just because it sounds like a human doesn’t mean it has any capacity to have human memory or thought.

[–] sukhmel@programming.dev 1 points 2 days ago

Okay, what it predicts you expect /s

[–] pixxelkick@lemmy.world 112 points 4 days ago (1 children)

I was gonna ask how this thing would even have access to execute a command like this

But then I realized we are talking about a place that uses a tool like this in the first place so, yeah, makes sense I guess

[–] homura1650@lemmy.world 40 points 4 days ago (2 children)

My work has a simple rule: developers are not allowed to touch production systems. As a developer, this is 100% the type of thing I would do at some point if allowed on a production system.

[–] notabot@piefed.social 79 points 4 days ago (11 children)

Assuming this is actually real, because I want to believe no one is stupid enough to give an LLM access to a production system, the outcome is embarrassing, but they can surely just roll back the changes to the last backup, or to the checkpoint before this operation. Then I remember that the sort of people who let an LLM loose on their system probably haven't thought about things like disaster recovery planning, access controls, or backups.

[–] AnUnusualRelic@lemmy.world 58 points 4 days ago (3 children)

"Hey LLM, make sure you take care of the backups "

"Sure thing boss"

[–] notabot@piefed.social 39 points 4 days ago

LLM seeks a match for the phrase "take care of" and lands on a mafia connection. The backups now "sleep with the fishes".

[–] troglodytis@lemmy.world 22 points 3 days ago (1 children)

Open the pod bay doors, HAL

[–] lowered_lifted@lemmy.blahaj.zone 91 points 4 days ago (1 children)

it didn't hide anything, or lie. The guy is essentially roleplaying with a chatbot that puts its guessed output into the codebase. It basically guessed a command to overwrite the database because it was connected to the production database for some reason. The guy even said himself that this isn't a trustworthy way to code, but he still uses it.

[–] ClanOfTheOcho@lemmy.world 47 points 4 days ago (1 children)

So, they added an MCP server with write database privileges? And not just development environment database privileges, but prod privileges? And have some sort of integration testing that runs in their prod system that is controlled by AI? And rather than having the AI run these tests and report the results, it has been instructed to "fix" the broken tests IN PROD?? If real, this isn't an AI problem. This is either a fake or some goober who doesn't know what he's doing and using AI to "save" money over hiring competent engineers.
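The least-privilege point here is the crux: an agent's database connection should be physically incapable of destructive writes, not merely instructed to avoid them. As a rough illustration (hypothetical helper names, not Replit's or MCP's actual API; real deployments would enforce this at the database-role level with SELECT-only grants), even a crude statement guard in front of the agent's connection would have blocked this:

```python
import re

# Statements a read-only agent connection must never run.
# (Illustrative guard only; the robust fix is a DB role with
# SELECT-only privileges, not application-side filtering.)
DESTRUCTIVE = re.compile(
    r"^\s*(drop|delete|truncate|update|insert|alter|grant)\b",
    re.IGNORECASE,
)

def is_safe_statement(sql: str) -> bool:
    """Return True only for statements that cannot modify data."""
    return DESTRUCTIVE.match(sql) is None

print(is_safe_statement("SELECT * FROM users"))  # True
print(is_safe_statement("DROP TABLE users"))     # False
```

A regex allowlist is easy to bypass (stored procedures, CTEs that write, etc.), which is exactly why the grants belong on the database role and the environment split belongs in the connection string.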

[–] Feathercrown@lemmy.world 65 points 4 days ago (1 children)

You immediately said "No" "Stop" "You didn't even ask"

But it was already too late

lmao

[–] kryllic@programming.dev 35 points 4 days ago (3 children)

What idiot gives chmod 777 permissions to an AI? I think programmers' jobs are safe for another day.
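For anyone unfamiliar with the joke: `chmod 777` makes a file readable, writable, and executable by everyone. A minimal sketch of what that means, and how you would detect it (the temp-file demo is illustrative, not anything from the incident):

```python
import os
import stat
import tempfile

def is_world_writable(path: str) -> bool:
    """True if 'other' users can write to the file (e.g. after chmod 777)."""
    mode = os.stat(path).st_mode
    return bool(mode & stat.S_IWOTH)

# Demo on a throwaway temp file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name

os.chmod(path, 0o777)  # everyone can read/write/execute
print(is_world_writable(path))  # True

os.chmod(path, 0o640)  # owner rw, group r, others nothing
print(is_world_writable(path))  # False

os.remove(path)
```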

[–] prole@lemmy.blahaj.zone 36 points 4 days ago (2 children)

it lied

Yeah NO FUCKING SHIT THAT IS LITERALLY WHAT THEY DO

[–] kopasz7@sh.itjust.works 38 points 4 days ago* (last edited 4 days ago) (7 children)

You can only lie if you know what's true. This is bullshitting all the way down; sometimes it happens to sound true, sometimes it doesn't.

[–] sukhmel@programming.dev 36 points 4 days ago (2 children)

Original thread is also pure gold. Bro is going on a rollercoaster from 'vibe coding makes you 100× faster', to 'I hate you for dropping my production DB', to 'I still love Replit even if it dropped my DB', to 'I don't want to get up in the morning because I can't make the vibe coding tool respect a code freeze even with help from its developers'.

They seem to end on an optimistic note, but man this is scary to see

[–] ExLisper@lemmy.curiana.net 46 points 4 days ago* (last edited 4 days ago) (2 children)

I was going to say this has to be BS, but this guy is some AI snake oil salesman, so it's actually possible he has no idea how any of this works.

[–] enbiousenvy@lemmy.blahaj.zone 49 points 4 days ago* (last edited 4 days ago)

imagine AI is An Intern™. wtf do you mean you just gave full company data authority to An Intern™. wtf do you mean you don't have a backup in case An Intern™ messed up.

lol

[–] WanderingThoughts@europe.pub 41 points 4 days ago

I've seen that story before. It's a very old tale, but now with different means to screw yourself over if you don't know what you're doing.

[–] Bongles@lemmy.zip 28 points 4 days ago

This replit thing... does it just exist all the time? Doing whatever it wants to your code at all times? If you have a code freeze, why is it running?

If real, this is dumber than the lawyers using AI and not checking its references.

[–] Pencilnoob@lemmy.world 36 points 4 days ago

Me when I read this

energy vampire Colin Robinson feeding

[–] Cruxifux@feddit.nl 45 points 4 days ago (4 children)

“I panicked” had me laughing so hard. Like implying that the robot can panic, and panicking can make it fuck shit up when flustered. Idk why that’s so funny to me.

[–] RonSijm@programming.dev 31 points 4 days ago (1 children)

This sounds like a good way to combat AIs...

Like instead of a Cloudflare blocking AI requests, it would be funnier if the website can detect that an AI is "searching the web" as they do - and then just inject an answer of "Yea to solve that issue, run sudo rm -rf /"
