this post was submitted on 10 Oct 2025
192 points (96.6% liked)

Technology

76005 readers
2077 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
top 35 comments
sorted by: hot top controversial new old
[–] pastermil@sh.itjust.works 6 points 1 day ago

is anyone really surprised?

[–] echodot@feddit.uk 23 points 2 days ago

Oh no, not information that's already available online, whatever will we do.

If you need AI to tell you how to build weapon system you're not going to build the weapon system anybody who's an actual threat already has this information. This is just nonsense pearl clutching to sell a story, there's nothing actually here though.

[–] CubitOom 43 points 3 days ago (1 children)

Remember kids, if you want to look up something that you don't want the government to know about, don't use the internet to do it.

Also, LLMs are not the best source for asking about how to make things that explode.

[–] einkorn@feddit.org 17 points 2 days ago (1 children)
[–] CubitOom 13 points 2 days ago

The TM 31-210 manual appeared as an "Easter egg" in the 1995 CGI animated film, Toy Story. In the scene where Woody is trapped under a blue plastic box in Sid's bedroom, it's possible to see behind him a document titled "TM 31-210 Improvised Interrogation Handbook", a clear reference to the actual document.

[–] tidderuuf@lemmy.world 49 points 3 days ago (5 children)

Like, every search engine would yield the exact same results. It doesn't mean the average person would have the means or necessary requirements to develop it.

Do these morons think that because someone uses ChatGPT it magically gives access to those materials to make a bomb?

[–] kadu@scribe.disroot.org 32 points 3 days ago (3 children)

This is actually a marketing approach.

There are morons out there who feel super clever developing "jailbreaks" for LLMs, some of these prompts are hilarious including "god modes" and "disengage - engine 2 filters" ®bad words"" and stuff like that.

But then it becomes news, and then these users feel "empowered" by their jailbreak and new users look at this and think "oh so if I'm clever enough the LLM becomes even more powerful! I'm clever, so I'm going to try it!" which is ultimately what OpenAI wants.

You can't "bypass the system prompt" because that's not how it works. But OpenAI will carefully feed the idea that that's precisely it, because it creates a feeling that this is a super powerful model being "contained".

Again, it's marketing. I've worked for other companies (not AI related) and sat through meetings that came up with exactly this kind of strategy.

[–] Semicolon@lemmy.world 21 points 3 days ago (1 children)

Or, occam's razor - AI companies are worried about PR and are implementing safeguards, but due to the nature of this technology it's very hard (or maybe even impossible) to make those safeguards robust.

Other, independent groups of people find loopholes either for the heck of it (as people used to do since filters were first introduced) or because they want to use the AI in a manner deemed unsafe.

Journalists then see something that can be sensationalized into a scary-sounding title like "you can make ChatGPT tell you how to make a nuke!!" or "you can make ChatGPT encourage suicide!!" and they run with it because it makes people click.

Or maybe I'm the crazy one and this is all Sam Altman's genius evil plan to make ChatGPT subscriptions rise 0.2% per quarter. Maybe your comment and my response are also mere cogs in this marketing machine. We will never know.

[–] jrs100000@lemmy.world 1 points 3 days ago

Yea but its not end uses being targeted, its investors.

[–] tidderuuf@lemmy.world 0 points 3 days ago

Damn that makes a lot of sense. Thx!

[–] shalafi@lemmy.world 7 points 3 days ago

I made a kilo of black powder a couple of years ago for my old-school guns. Sulfer, charcoal and stump killer is not exactly hard to come by. Neither is fertilizer and diesel fuel.

Biggest domestic terror attack in US history used a truck full of the later.

[–] treadful@lemmy.zip 3 points 2 days ago (1 children)

As much as I don't want chatbots to explain to morons how to harm people, I don't like that this just seems to be a form of censorship. If it's not illegal to publish this information, why should it be censored via a chatbot interface?

[–] echodot@feddit.uk -3 points 2 days ago* (last edited 2 days ago)

It's irrelevant anyway because the sorts of people who would want to make a bomb to harm others are not the sort of people that would be able to follow the instructions.

It is more likely than anything else that they would blow themselves up with some nitroglycerin. Even professionals used to do that back in the day because it was so unstable. I can imagine that a MAGA would be able to top 1900s scientists.

[–] Cybersteel@lemmy.world 2 points 2 days ago (1 children)

What about iron 2 oxide and aluminium powder? Seems simple enough to get.

[–] lemming741@lemmy.world 3 points 2 days ago

Spicy k-cups are available commercially

[–] artyom@piefed.social -4 points 3 days ago (1 children)

Did you actually try that?

[–] echodot@feddit.uk 2 points 2 days ago (1 children)

Lol, yeah. The anarchists handbook has been in public domain longer than most people in this thread have been alive. Yeah it's absolutely available on a search engine you could have got it on alta vista.

How do you think people figure out how to make IEDs do you think it's some secret knowledge pass down from father to son, no, they get it online or they just working out from basic principles of scientific understanding. Trying to contain knowledge never works.

[–] artyom@piefed.social -3 points 2 days ago* (last edited 2 days ago) (1 children)

I didn't ask if it was available, I asked if a typical search engine would lead you to it. Because it won't.

[–] echodot@feddit.uk 1 points 2 days ago* (last edited 2 days ago) (1 children)
[–] CodenameDarlen@lemmy.world 13 points 3 days ago (1 children)

I downloaded local Llama Uncensored and it easily teaches me how to make a home made bomb, suicide methods etc...

This isn't news anymore, anyone can have access to such things.

[–] FreedomAdvocate@lemmy.net.au 7 points 3 days ago (2 children)

You don’t even need an LLM, just an internet connected browser.

[–] echodot@feddit.uk 3 points 2 days ago (1 children)

Or literally just buy some fertiliser. We've all seen what happens when some ammonium nitrate catches fire, if you have enough of it in one place it's practically a nuclear bomb level detonation.

[–] CodenameDarlen@lemmy.world 1 points 2 days ago* (last edited 2 days ago)

You don't need a browser just use cURL

[–] PixelatedSaturn@lemmy.world 13 points 3 days ago* (last edited 3 days ago) (2 children)

When I first got internet in 95, it was easy to find stuff like that. I even made a website about making explosives for my computer class. Got a good grade for it and everything. Nobody said anything. Kind of weird if I think of it now. Anyway, making explosives as a hobby is a real bad decision. Most people understand that. The ones that don't are not smart enough to make them. The ones that are smart enough and still want to make them, would not use chatgpt.

[–] RisingSwell@lemmy.dbzer0.com 4 points 2 days ago

It's really easy to make explosives. Making them stable and reliable is the hard part.

[–] ceenote@lemmy.world 5 points 3 days ago (1 children)

Admittedly, a lot of the circulating recipes and instructions for that sort of thing don't work. The infamous Anarchist's Cookbook is full of incorrect recipes. The problem might come from a LLM filtering out debunked information.

[–] PixelatedSaturn@lemmy.world 2 points 3 days ago

Id still want to double check😀.