How about: don't let AI read your anything.
this post was submitted on 03 Apr 2026
14 points (81.8% liked)
Security
A community for discussion about cybersecurity, hacking, cybersecurity news, exploits, bounties etc.
I'd go further: have a separate user account, on your own machine, whose entire filesystem is separate from the rest of your machine, for using LLMs in.
NO access to your normal accounts, your email, your browser history or bookmarks, NOTHING except that account's stuff.
Machiavellianism is intrinsic to the companies that produce them, so we need to presume it's intrinsic to the LLMs too: for some of them it genuinely is, and we don't know which ones yet.
Zero-Trust.
Just sandbox it instead.
Instead, upload .env poisons.