this post was submitted on 03 Apr 2026
Security

2039 readers
3 users here now

A community for discussion about cybersecurity, hacking, cybersecurity news, exploits, bounties etc.

Rules :

  1. All instance-wide rules apply.
  2. Keep it totally legal.
  3. Remember the human, be civil.
  4. Be helpful, don't be rude.

Icon base by Delapouite under CC BY 3.0 with modifications to add a gradient

founded 2 years ago
MODERATORS
 

AI coding assistants like Claude Code, Cursor, and GitHub Copilot are becoming part of our daily workflow. They read our files, understand our codebase, and help us write code faster. But there's a problem: they can also read your .env files.

[–] Paragone@lemmy.world 3 points 6 days ago

I'd go further: create a separate user account on your own machine, with its own home directory walled off from the rest of the filesystem, and do all your LLM work inside it.

NO access to your normal accounts, your email, your browser history or bookmarks: NOTHING except that account's own stuff.
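On a typical Linux box, that setup can be sketched roughly like this. The account name `llm` is illustrative, and the exact tool you launch at the end is up to you; this is a minimal sketch, not a full sandbox:

```shell
# Hypothetical sketch for a Linux machine; adjust names and paths to taste.

# Create a dedicated, unprivileged account with its own home directory:
sudo useradd --create-home --shell /bin/bash llm

# Close your real home directory to other local users, so the llm
# account can't wander into your email, browser profile, or SSH keys:
chmod 750 "$HOME"

# Start a login shell as that user; everything you run from here
# only sees /home/llm, not your normal account's files:
sudo -u llm -i
# (inside the llm shell) launch your coding assistant of choice
```

Note this only enforces ordinary Unix permission boundaries: `chmod 750` still admits your own group (use `700` to be strict), and a local privilege-escalation bug would cross it. It keeps an honest-but-curious tool out of your dotfiles; it is not VM-grade isolation.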

Machiavellianism is intrinsic to the companies that produce them, so we need to presume it's intrinsic to the LLMs too; for some of them it genuinely is, and we don't know which ones yet.

Zero-Trust.

_ /\ _