
The Future of Privacy: Enforcing Privacy Standards

Published: Jul 4, 2024

Recent anti-privacy legislation and proposals in Europe, the US and Australia threaten to infringe our fundamental right to privacy and create grave risks to the safety of children and vulnerable people. It's time we shift the focus: privacy should be a non-negotiable duty of technology providers, not just a right users must constantly fight to protect, and not something users can be asked to consent away as a condition of access to a service.

Tech giants are trying to normalize surveillance capitalism, often with little to no consequences anywhere in the world. These companies are contributing to a growing ecosystem where opting out of invasive data hoarding practices is becoming increasingly difficult, if not outright impossible. We are being gaslit by technology executives who try to justify profiteering from AI theft, from Microsoft claiming all our content is fair game for its exploitation to unethical startups like Perplexity turning the word “privacy” into a marketable farce.

The AI Hype’s Impact on Privacy

The exaggeration of AI’s actual capabilities and the continuous promotion of its “intelligence” is creating a rat race where tech companies and well-funded startups are evading accountability, as they eagerly collect and exploit more data than ever.

They're prioritizing AI development over user privacy and rights, setting a dangerous precedent for current and future online engagements. They've already normalized the use of AI to scan and analyze supposedly private communications - from emails to instant messages - repackaging this intrusion as “productivity tools”. Meanwhile, most consumers actually want more data privacy, not less, and are increasingly concerned by the lack of it.

The legal push towards “client-side scanning”, attacks on end-to-end encryption, and support for pro-surveillance legislation give credibility to these highly intrusive practices that literally endanger lives. And we know that moral obligations mean nothing to corporations benefiting from these exploitative models, so we have to ensure that our demands for privacy are legally enforceable and non-negotiable.

We are encouraged to see more legal pressure on companies that exploit user data on a daily basis. Examples include the European Center for Digital Rights’ (Noyb) complaints against Meta’s abuse of personal data to train its AI, and the Norwegian Consumer Council’s demand that the data protection authority ensure applicable laws are enforced against Meta, given that there is “no way to remove personal data from AI models once Meta has begun the training”.

Noyb is taking a strong stance against other companies with similar exploitative models, including providers of facial recognition surveillance tools that are often misused by law enforcement agencies. Consider supporting their ongoing efforts — we strongly believe legal action is one of the most effective means to hold these companies accountable for their persistent abuses, which are otherwise shielded by heavily funded self-serving lobby groups.

We must shift from a defensive stance to a proactive one by proposing privacy legislation that puts users in direct control of their private data.

This legislation should:

  1. Establish non-negotiable provider duties for protecting user privacy, with hefty fines and consequences for service operators who do not comply.
  2. Prevent providers from circumventing these duties through user consent clauses — it should be legally prohibited to ask for consent to share user data or to use it for anything other than providing the service.
  3. Prevent providers from asking for any more personal information from users than is technically necessary and legally required. For example, asking for a phone number as a condition of access to a service should be made illegal in most cases — it does not provide sufficient security, exposes users' private information, and allows easy aggregation of users' data across multiple services.
  4. Create a strong legal framework that cannot be resisted or modified.

By codifying these principles into law, we can establish a strong technological framework that is built to create more value for end users, while protecting their privacy against data exploitation, criminal use and identity theft. We will continue the fight against illogical legislative proposals designed to normalize mass surveillance, but our efforts should equally be geared towards creating and supporting new models and technological foundations that bring us far closer to the reality we urgently need.

Collective Action

There is great work being done by advocacy organizations, and service providers need to contribute to this fight as well by shifting the narrative and reclaiming the term “privacy” from the tech giants who co-opted and corrupted it. We must play a bigger role in supporting users in setting stronger boundaries, making demands, and refusing anything less than genuine privacy and data ownership, while getting comfortable with holding providers accountable for any violations.

Privacy should be seen as a fundamental obligation of technology providers, and legislators must actively enforce this expectation. The more consumers make this demand, the more pressure we put on anti-privacy lobbyists with rogue motives, the easier it will be to hold abusers accountable, and the more likely we can collectively ensure that a privacy-first web becomes a reality.

You can support privacy today by signing the petition prepared by the Global Encryption Coalition in support of communication privacy. You can also write to your elected representatives, explaining to them how data privacy and encrypted communications protect children's safety and reduce crime.
