Privacy


This is a list of phone manufacturers that lock their bootloaders to prevent people from installing custom operating systems (LineageOS etc) to remove bloatware and spyware/tracking.


SAN JOSE, Calif. – San Jose and its police department routinely violate the California Constitution by conducting warrantless searches of the stored records of millions of drivers’ private habits, movements, and associations, the Electronic Frontier Foundation (EFF) and American Civil Liberties Union of Northern California (ACLU-NC) argue in a lawsuit filed Tuesday.


Hello everyone!

Journiv is a self-hosted private journaling application that puts you in complete control of your personal reflections. Built with privacy and simplicity at its core, Journiv offers comprehensive journaling capabilities including mood tracking, prompt-based journaling, media uploads, analytics, and advanced search, all while keeping your data on your own infrastructure.

Journiv v0.1.0-beta.9 is out with:

  • Markdown support
  • Inline media (images and video) with a viewer
  • Many bug fixes and improvements

The Journey Ahead

Journiv is in active development, with a fully functional backend, a web frontend, and mobile apps launching soon. It is self-hosted and designed to be your companion for decades.

Journiv is being built because our memories deserve to be ours, forever.

Learn More

Developer @rockstar1215@lemmy.world


It seems many if not all of today's AR glasses use Bluetooth, which means that if you're close enough, it should be possible to detect people who are wearing them (and/or anything else that's using Bluetooth).


Looks like Session is adding back PFS (among other things), although it'll take a while for the protocol changes to be finalized and implemented in the apps.


The Dutch police have secretly used controversial AI intelligence software by the American company Palantir since 2012, the Volkskrant reported, based on documents obtained through the Open Government Act after years of trying. 99 percent of the documents have been blacked out, but it is clear that caretaker Prime Minister Dick Schoof was involved in purchasing the software in 2011 as Director General of the Dutch police.


We know we said The Counterbalance was taking a two-week break, but there is too much to discuss this week, so we are back with a (bonus) edition of The Counterbalance.


Swedish police are using the controversial AI tool Palantir, Dagens ETC reports.
Swedes' sensitive personal data is being fed into a platform custom-designed for the Swedish police.


The Indian telecommunications authority, the Department of Telecommunications (DoT), has instructed eight messenger services to implement a permanent binding to inserted SIM cards. Affected are WhatsApp, Telegram, Signal, Snapchat, ShareChat, as well as the Indian services Arattai, JioChat, and Josh. According to the directive, the companies must ensure within 90 days that their services can only be used with a physically inserted SIM card.


Happy winter and merry festivities!

Last year I made a post outlining many gift ideas for privacy enthusiasts. I'm back this year with an updated list. Privacy enthusiasts, by nature, are sometimes difficult to buy gifts for. This list is here to make it easier for you to come up with ideas, even if you don't directly gift what's on the list. I've decided to make a rule this year: only physical items. You can't put a subscription under the tree.

3D printers

3D printers can turn plastic into any shape you want. While a lot of 3D printers include proprietary privacy-invasive software, there are open-source options such as RepRap. The privacy benefit of these comes in the form of homemade firearms. Traditional firearms include many elements to trace the ammunition back to the firearm, but homemade firearms (such as ones made using a 3D printer) exclude these. The reliability of the firearm depends on the quality of the 3D printer, but the designs are getting easier and easier to make.

Accessories

Especially for phones, there are a few privacy accessories that are simple but effective.

Anonymous dress

Anonymous dress is clothing that conceals your identity in public. Obtaining these items of clothing is a chore, so it's always easiest when it is gifted by somebody else. Black, unthemed clothing does the best job of protecting privacy. The holy grail of anonymous dress is:

  • A balaclava to hide your face.
  • A baseball cap to further hide your face, although a sun hat does a better job.
  • Elevator shoes to conceal your height.
  • A hooded down jacket to hide body shape and skin color. Some down jackets extend below the knees and can somewhat conceal your gait too. Last year I included jackets that spoof AI recognition or blind infrared cameras, but those are very difficult to find and can be very identifying.
  • Sunglasses to hide your eyes. Reflectacles do the best job of this.
  • Touchscreen gloves to prevent fingerprints and still be able to use touchscreens. Normal gloves work when paired with a capacitive stylus.
  • An umbrella to hide your clothing from surveillance cameras.

Ciphers

Not all encryption is digital. Traditionally, complex codes and ciphers were created to conceal messages. Hardware devices like the Enigma machine were used to further aid the process. Modern versions of those devices, as well as related items such as invisible ink, are still around and can be a fun project.
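As an illustration of the kind of pen-and-paper scheme these gifts play with, the classic Vigenère cipher can be sketched in a few lines of Python (the message and key below are made-up examples):

```python
# Minimal Vigenère cipher: each letter of the key shifts the
# corresponding letter of the message through the alphabet.
def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    key_shifts = [ord(k) - ord('a') for k in key.lower()]
    result = []
    i = 0  # position within the key, advanced only on letters
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            shift = key_shifts[i % len(key_shifts)]
            if decrypt:
                shift = -shift
            result.append(chr((ord(ch) - base + shift) % 26 + base))
            i += 1
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return ''.join(result)

secret = vigenere("attack at dawn", "lemon")
print(secret)                                    # → lxfopv ef rnhr
print(vigenere(secret, "lemon", decrypt=True))   # → attack at dawn
```

Trying to break a short cipher like this by hand (frequency analysis, guessed key lengths) is exactly the sort of puzzle that pairs well with an invisible-ink pen or a replica cipher wheel.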

Computers

Laptops, desktops, and servers are all useful devices for accessing digital services privately. While there is no single best choice, some lists can help shed light on which hardware is considered secure:

Concealment devices

Concealment devices are things that look like ordinary objects, but in some way or another, have a hidden compartment used for storage. These are excellent ways to hide sensitive items such as cash, backup security tokens, and more. These are excellent gifts if you're giving one-on-one rather than at a party.

Cryptocurrency wallets

Cryptocurrency wallets are devices used to securely store the keys for cryptocurrency such as the privacy-focused Monero. The two best options are:

Dumb tech

Dumb tech is the opposite of smart tech. It doesn't connect to every device in your house. It doesn't broadcast that data to a corporation. It doesn't get exposed in a data breach. It doesn't get hacked. It doesn't go down when the internet goes offline. Things like dumb TVs or dumb cars are becoming harder to find but more and more valuable for privacy.

Mail

Mail is almost always sensitive. For that reason, it's useful to protect the contents by using security envelopes. For delivering packages privately, it's also useful to have a label printer capable of printing shipping labels.

Money

Banks and payment service providers are almost always incredibly privacy invasive and offer poor security. While some of these issues can be mitigated with services like Privacy, it doesn't fix the underlying issue. Anonymous payments not only protect your privacy, but protect your money too, and having the ability to make payments like these is what allows privacy to further grow. Anonymous payment methods include:

  • Cash
  • Gift cards (when purchased with cash and adequate anonymous dress)
  • Monero (which is physical when paired with a cryptocurrency wallet)
  • Stored-value card (when purchased with cash and adequate anonymous dress)

Optical discs

Optical discs are a physical way to store movies, shows, music, games, and more. The idea is that, instead of paying a subscription and streaming content, you can pay a one-time fee and get the full quality media offline. This is also excellent for ripping to create a digital archive to stream from your own servers for free.

Paper

Your most sensitive information is put at risk the moment it becomes digitized, so pen and paper isn't so bad for some uses:

  • Earlier this year, Amazon removed the option to download and transfer ebooks. It's becoming increasingly harder to "own" an ebook, especially without using privacy-invasive software. For that reason, books are much better for privacy.
  • Calendar apps are convenient for reminders, but they often sync to cloud services or include telemetry. Physical calendars are a good way to have peace of mind knowing that your personal events are away from prying eyes and can be erased without a trace.
  • Notebooks are also useful for the same reasons as books. There are also numerous benefits to writing things down instead of typing them.

Paper shredders

Paper shredders destroy sensitive documents to prevent others from obtaining sensitive information by digging through landfills. However, shredded documents can be recovered using automated software. The paper shredder industry hasn't discovered fire yet, it seems.

Printers

Printers suck. So much so that not even Framework wanted to make one. Nevertheless, a new printer called Open Printer is in the works. Until it's finished, the best option is to gift a printer that allows printing over a wired connection.

Promotional merchandise

There is no shortage of promotional merchandise for privacy. Some of my favorites include:

I also recently found products like this that serve a functional benefit of telling people you don't want to be recorded without explicitly talking to them.

Rayhunter

Rayhunter is a device created by the Electronic Frontier Foundation to detect Stingray attacks. It can be installed on supported devices, which make great gifts for people with high threat models.

Safes

Safes are a secure box to store sensitive items. I shouldn't need to explain why this is a good idea.

Security seals

Security seals are a special type of sticker that makes it very clear if the seal has ever been broken. This is useful to place on the case of computers or other containers that shouldn't be opened often.

Security tokens

Security tokens are hardware devices used to authenticate accounts at a hardware level. When set up correctly, they are one of the most secure ways to log in. The most popular open-source options are:

Smartphones

GrapheneOS is the most private and secure operating system available. They recently announced that they are partnering with an OEM to manufacture devices designed for GrapheneOS. However, until that device is made available, Google Pixels are still the only devices GrapheneOS can be installed on.

USB flash drives

USB flash drives are the unsung heroes of so many areas of privacy. Whether it be installing operating systems such as Qubes OS and Tails, or creating offline Seedvault backups for GrapheneOS, USB flash drives have a multitude of uses. Just remember: it's better to have many smaller USB flash drives than one large one.

Wi-Fi hotspots

Wi-Fi hotspots are (for privacy use cases) hardware devices that connect your devices to the cellular network in a much more private way. The best option, which also supports an excellent privacy organization, is the Calyx Internet Membership.

Wired headphones

Wired headphones not only provide higher quality audio output, but they also avoid the history of security issues with Bluetooth and the surveillance capitalism that comes with Bluetooth Low Energy beacons. Which type of wired headphones you gift depends on a lot of factors, but one that pairs nicely with Google Pixels are the Pixel USB-C earbuds sold by Google themselves.

Wireless routers

Wireless routers often leak everything sent through them. For that reason, custom software such as OpenWrt was designed to replace the privacy invasive software preinstalled on routers. OpenWrt also created their own router called the OpenWrt One. Earlier this year, they announced that they would be creating a new router called the OpenWrt Two. It hasn't come out yet, but maybe it will be on the list next year.

Conclusion

There is no shortage of privacy tech. The same technology that empowers privacy is the thin veil slowing down the world from its dystopian target. Giving the gift of privacy means giving the gift of a better future for those of us fighting on the front lines.

Lack-of-AI notice

I’ve been burned before, so I always try to mention that none of my content is AI generated. It isn’t even AI assisted. Just because something is comprehensive and well-structured does not make it AI generated. Every word I write is my own. Thank you for your understanding.

OC by @Charger8232@lemmy.ml


Alright so yesterday I wanted to try to find a federated alternative to Signal to use with my girlfriend following the ChatControl announcements.

In the risk analysis, providers must check whether their services can be misused for the dissemination of abuse material or for contacting children. There are to be three categories for this: high, medium, and low risk. Providers in the highest category could be obliged to participate in the development of risk mitigation technologies.

We can anticipate Signal and co. will be part of the high risk category and assume the risk mitigation tech will be aimed towards breaking encryption. Especially since this new agency will be interfacing with Europol. I don’t see this as a win or even a draw.

https://feddit.org/comment/10169945

The first one I tried was DeltaChat. To be honest, it's quite a good app:

  • Onboarding is really fast.
  • The UI is close to WhatsApp / Signal.
  • Encryption is hidden from the user but still enabled by default.

The one small issue I had was with the calls not appearing on my Android device when the phone is locked, but the call feature is still in beta so I guess down the line they'll improve it.

In the meantime, I wanted to give other apps a try. I started with SimpleX:

  • The chat rooms always have a lot going on. Every time someone connects to a room, it shows up in the room summary; it's quite distracting.
  • According to people in one of the public SimpleX chat rooms, there is a battery drain issue due to the P2P model, which I indeed noticed. That's kind of a deal breaker, as I want to use the app with my gf, whose phone already has a bad battery.
  • People also mentioned that there are issues with how groups work.

I then went to XMPP

  • I was using https://movim.eu/ on my browser, then Dino on an app, and https://f-droid.org/packages/de.monocles.chat/ on my phone, Monal on the other phone
  • There were a few shenanigans with OMEMO keys being created in my browser but not accessible from my phone, preventing me from reading some messages on the phone while I could see them in the browser. That would definitely be an issue down the line, as I wouldn't want my relatives to face those issues by themselves.
  • One issue I noted is that creating private groups is basically creating a chatroom on the server. Monocles will hide that and just create the room with a random name, but movim.eu was asking for a name for the room, that needed to be unique for the server. Seems like there was an initiative to simplify that, but the last update was in 2020: https://wiki.xmpp.org/web/Easy_Group_Chats
  • Also, even though calls would show up on the Android device when locked, accepted calls would not go through or connect, making this another non-working option.

Then I tried Nextcloud Talk, as I have a Nextcloud space on a non-profit FOSS services provider that grants one to members for 10€ per year. It turned out okay, but it's more of a Slack/Teams replacement than a Signal/WhatsApp one: you can't call people directly, you can only create calls in discussions (the use case is similar to Slack huddles). Once again, those calls would not show up on the locked Android device.

Finally, I tried Matrix, which I'm quite familiar with, but I discovered that

  • FluffyChat, while Matrix.org says that it supports 1:1 calls, actually does not at the moment ( https://matrix.org/ecosystem/clients/fluffychat/ - https://github.com/krille-chan/fluffychat/issues/2329)
  • Element X does not allow signing in with SSO, so I couldn't use https://tchncs.de/matrix/
  • I registered another account on another server found on https://servers.joinmatrix.org/ , but then got an error that the server does not allow calls to be made (I think it was something like this issue, maybe with a different error message, I don't remember exactly: https://github.com/element-hq/element-x-android/issues/4528)
  • Then I decided to go back to Element Classic on the iPhone to be able to use my tchncs.de accounts. I then tried to call from the iPhone to the Android, but it turns out you can't make calls across different Element versions; it needs to be either Element X to Element X or Element Classic to Element Classic.
  • Element Classic gave me back the old verification process, where you have to verify emojis at some point or it will keep showing you warnings; I had forgotten how confusing that can be. I even hit a case where the other session was in another client (think FluffyChat / Element Classic), but I literally could not verify, as Element Classic would not show the emoji page when I accepted on FluffyChat.
  • I then finally installed Element Classic on my Android phone, it works fine, we can make calls both ways, they show up on locked screen and actually work.

It's a bit frustrating to see that the transition from Element Classic to Element X still creates all those issues. I will keep using DeltaChat for the text chat, and just use Element for the call features.

Quite a ride, hopefully once DeltaChat is done with their beta call feature it will fit my use case.


Whonix 18.0 is now available as a major release upgrade. However, if you haven’t heard of it, let me start with a brief introduction.

It’s a Debian-based distro targeting privacy-focused individuals who require maximum anonymity protections. Its security architecture separates all activities into two virtual machines: a Gateway VM that routes all traffic exclusively through the Tor network, and a Workstation VM that has no direct internet access and can only communicate through the Gateway.

This design isolates applications from the networking stack, preventing IP leaks even if software inside the Workstation is compromised. Now, back to the novelties in the new release.


Digital Omnibus: How Big Tech Lobbying Is Gutting the GDPR

Last week we at EFRI wrote about the Digital Omnibus leak and warned that the European Commission was preparing a stealth attack on the GDPR.

Since then, two things have happened:

The Commission has now officially published its Digital Omnibus proposal.

noyb (Max Schrems’ organisation) has released a detailed legal analysis and new campaigning material that confirms our worst fears: this is not harmless “simplification”, it is a deregulation package that cuts into the core of the GDPR and ePrivacy.

What noyb has now put on the table

On 19 November 2025, noyb published a new piece with the blunt headline: “Digital Omnibus: EU Commission wants to wreck core GDPR principles”.

Here’s a focused summary of the four core points from noyb’s announcement, in plain language:

New GDPR loophole via “pseudonyms” and IDs

The Commission wants to narrow the definition of “personal data” so that much data under pseudonyms or random IDs (ad-tech, data brokers, etc.) might no longer fall under the GDPR.

This would mean a shift from an objective test (“can a person be identified, directly or indirectly?”) to a subjective test (“does this company currently want or claim to be able to identify someone?”).

Therefore, whether the GDPR applies would depend on what a company says about its own capabilities and intentions.

Different companies handling the same dataset could fall inside or outside the GDPR.

For users and authorities, it becomes almost impossible to know ex ante whether the GDPR applies – endless arguments over a company’s “true intentions”.

Schrems’ analogy: it’s like a gun law that only applies if the gun owner admits he can handle the gun and intends to shoot – obviously absurd as a regulatory concept.


Weakening ePrivacy protection for data on your device

Today, Article 5(3) ePrivacy protects against remote access to data on your devices (PCs, smartphones, etc.) – based on the Charter right to the confidentiality of communications.

The Commission now wants to add broad “white-listed” exceptions for access to terminal equipment, including “aggregated statistics” and “security purposes”.

Max Schrems finds the wording of the new rule extremely permissive: it could effectively allow extensive remote scanning or “searches” of user devices as long as they are framed as minimal “security” or “statistics” operations, undermining the current strong protection against device-level snooping.

Opening the door for AI training on EU personal data (Meta, Google, etc.)

Despite clear public resistance (only a tiny minority wants Meta to use their data for AI), the Commission wants to allow Big Tech to train AI on highly personal data, e.g. 15+ years of social-media history.

Schrems’ core argument:

People were told their data is for “connecting” or advertising – now it is fed into opaque AI models, enabling those systems to infer intimate details and manipulate users.

The main beneficiaries are US Big Tech firms building base models from Europeans’ personal data.

The Commission relies on an opt-out approach, but in practice:

Companies often don’t know which specific users’ data are in a training dataset.

Users don’t know which companies are training on their data.

Realistically, people would need to send thousands of opt-outs per year – impossible.

Schrems calls this opt-out a “fig leaf” to cover fundamentally unlawful processing.

On top of training, the proposal would also privilege the “operation” of AI systems as a legal basis – effectively a wildcard: processing that would be illegal under normal GDPR rules becomes legal if it’s done “for AI”. The result is an inversion of normal logic: riskier technology (AI) gets lower, not higher, legal standards.

Cutting user rights back to almost zero – driven by German demands

The starting point for this attack on user rights is a debate in Germany about people using GDPR access rights in employment disputes, for example to prove unpaid overtime. The German government chose to label such use as “abuse” and pushed in Brussels for sharp limits on these rights. The Commission has now taken over this line of argument and proposes to restrict the GDPR access right to situations where it is exercised for “data protection purposes” only.

In practice, this would mean that employees could be refused access to their own working-time records in labour disputes. Journalists and researchers could be blocked from using access rights to obtain internal documents and data that are crucial for investigative work. Consumers who want to challenge and correct wrong credit scores in order to obtain better loan conditions could be told that their request is “not a data-protection purpose” and therefore can be rejected.

This approach directly contradicts both CJEU case law and Article 8(2) of the Charter of Fundamental Rights. The Court has repeatedly confirmed that data-subject rights may be exercised for any purpose, including litigation and gathering evidence against a company. As Max Schrems points out, there is no evidence of widespread abuse of GDPR rights by citizens; what we actually see in practice is widespread non-compliance by companies. Cutting back user rights in this situation shifts the balance even further in favour of controllers and demonstrates how detached the Commission has become from the day-to-day reality of users trying to defend themselves.

EFRI’s take: when Big Tech lobbying becomes lawmaking

For EFRI, the message is clear: the Commission has decided that instead of forcing Big Tech and financial intermediaries to finally comply with the GDPR, it is easier to move the goalposts and rewrite the rules in their favour. The result is a quiet but very real redistribution of power – away from citizens, victims, workers and journalists, and towards those who already control the data and the infrastructure. If this package goes through in anything like its current form, it will confirm that well-organised corporate lobbying can systematically erode even the EU’s flagship fundamental-rights legislation. That makes it all the more important for consumer organisations, victim groups and digital-rights advocates to push back – loudly, publicly and with concrete case stories – before the interests of Big Tech are permanently written into EU law.


Hello!

I've been following the discourse about the recent ChatControl update that passed a few days ago, and I have been wondering if it changes anything for the majority of people who were okay with the first version from 2021.

First a disclaimer - I'm vehemently against it, because it does affect me since I do use the alternative services affected, and I'm not trying to downplay the impact. I know that it's an issue for people already invested in privacy, but this question focuses on general population and services that reportedly already do the scanning anyway.

At least based on information on this website, most of the commonly popular services have been doing ChatControl since 2021:

Currently a regulation (that passed in 2021) is in place allowing providers to scan communications voluntarily (so-called “Chat Control 1.0”). So far only some unencrypted US communications services such as GMail, Facebook/Instagram Messenger, Skype, Snapchat, iCloud email and X-Box apply chat control voluntarily (more details here). As a result of the mandatory Chat Control 2.0 proposal, the Commission expected a 3.5-fold increase in scanning reports (by 354%).

My first question is - is this correct? I have not seen it mentioned anywhere else, not even a single comment in any discussion about the new resolution, and I don't want to spread false information. It sounds like an important fact that more people should be aware of, but everyone seemed to conveniently forget about it right after the first ChatControl passed in 2021 and the first attempt to pass the second one (in 2023 or whenever) failed. If anyone has more information about the current state, I'd love to hear it.

Assuming that's correct, then my question/rant is - what changes for people who are already using these services exclusively? People like that have had the last 5 years to do something about a serious privacy violation like this - stop using the services that do the scanning. Most of them did not, forcing people like me to choose between privacy and being able to contact my friends, because "they don't want to install a new chatting app, and everyone is on Messenger anyway". And I'm pretty sure that they wouldn't stop even if the new resolution did not pass.

I realize it sounds more like a rant than a question, because it kind of is; it has been frustrating screaming about ChatControl to deaf ears for the past few years. But I'm also honestly asking what actually changes. Even though I am frustrated, I still want to have actual arguments, so that when I'm convincing people to stop using those services, I'm not lying when I say "nothing changes for you if you don't switch" (assuming the current resolution does not get finalized and implemented). Plus, since people are now actually listening about ChatControl, telling them that it's already happening does have a greater impact.


Hollowing out some of the world’s strongest digital safeguards will harm us all. But there is still time to change course
