America desperately needs new privacy laws

This is Hindsight, a weekly newsletter featuring an essential story from the world of technology. To learn more about the dire state of technology regulation, follow Adi Robertson. Hindsight arrives in subscribers’ inboxes at 8 a.m. ET. Sign up for Hindsight here.

In 1973, well before the modern digital age, the U.S. Department of Health, Education, and Welfare (HEW) released a report titled “Records, Computers, and the Rights of Citizens.” Networked computers seemed “destined to become the primary means of creating, storing, and using information about people,” the report’s foreword begins. These systems could be a “powerful management tool.” But with few legal safeguards, they could erode the fundamental human right to privacy – in particular, “an individual’s control over the use of information about him or her.”

Those concerns weren’t just talk in Washington. In 1974, Congress passed the Privacy Act, which set some of the first rules for computerized records systems – limiting when government agencies can share information and defining what access individuals should have to records about them. Over the rest of the 20th century, Congress supplemented the Privacy Act with more privacy rules in areas such as health care, children’s websites, electronic communications, and even video cassette rentals. But over the past two decades, amid an explosion of digital surveillance by governments and private companies, Congress has failed to keep pace.

Lawmakers have considered numerous plans to protect Americans’ privacy, but time and again they have failed. Attempts to curb government spying — such as proposed updates to the Electronic Communications Privacy Act of 1986 — have been hampered by fears that they would compromise law enforcement and counterterrorism operations. Despite multiple concerted attempts by members of both parties, Congress has failed to pass a bill governing how private companies collect data and what rights people have over their own information. Even narrowly targeted proposals like the Fourth Amendment Is Not For Sale Act — which would bar police from circumventing existing privacy laws by buying data from data brokers — have failed to become law.

Meanwhile, new technologies, from augmented reality glasses to generative artificial intelligence, create new risks every day, making it easier than ever to surreptitiously monitor people or to coax them into sharing intimate information with technology platforms.

Immigration officers harass citizens they have identified using data analysis and facial recognition tools. Data breaches at big tech companies are common, and security regulations meant to prevent them are being rolled back. Amazon just ran a Super Bowl ad touting how your doorbell camera can be part of a distributed surveillance network for finding lost dogs.

Privacy failures don’t just risk revealing something intimate about you to the world; they shift the balance of power toward whoever holds the most data. Take algorithmic pricing, where companies use shoppers’ personal information to set individualized prices they believe those people will pay — leading companies like Instacart to charge users different prices for the same item. (The company said this was an experiment that has since ended.)

National and international regulations have addressed some privacy risks. Businesses operating in Europe have been governed by the General Data Protection Regulation (GDPR) since 2018, although a rollback was proposed late last year. Several US states have adopted some form of general privacy framework, as well as more specific rules: Illinois’ biometric privacy law has made it easier to sue Meta and others, for example, and New York mandated disclosure of algorithmic pricing a few months ago. However, privacy advocates warn that many of these rules are inadequate. When the Electronic Privacy Information Center (EPIC) and the US PIRG Education Fund graded states’ consumer privacy laws in 2025, only two states, California and Maryland, scored above a C.

EPIC Deputy Director Caitriona Fitzgerald tells The Verge that Congress has recently passed at least one significant reform: the Protecting Americans’ Data from Foreign Adversaries Act of 2024, which Fitzgerald calls “the strongest privacy law passed at the federal level in recent years.” PADFAA prohibits data brokers from making Americans’ sensitive personal information accessible to hostile countries, and EPIC has used it to file a complaint against Google’s real-time ad bidding system, which it claims distributes sensitive data indiscriminately.

But overall, the situation is not great.

As 2026 begins, a learned helplessness about privacy has taken hold in many quarters. Companies like Meta insist that if existing technology already poses privacy problems, it’s unreasonable to complain that new technology makes things even worse. According to internal documents, Meta also appears to believe that the Trump administration’s very public disregard for civil liberties (or what Meta euphemistically calls a “dynamic political environment”) will keep activists distracted, leaving it free to introduce invasive features like facial recognition into its products.

But the administration’s actions make the dangers of these systems increasingly difficult to ignore. It’s one thing to know the government could look up personal information about you. It’s another to be confronted by ICE agents who already know your name.

Not all of today’s privacy nightmares have simple regulatory solutions. But privacy groups have argued for years that there are obvious places to start. A long-standing wish list from a coalition including EPIC, PIRG, and others calls for a new independent federal data protection agency, as well as a private right of action that would let individuals sue over violations of privacy laws. One of the newest proposals is the Data Justice Act, model legislation introduced last month by a group of academics at NYU Law. It seeks to limit the state’s collection and use of our deep digital footprints, redefining personal data “not as information that the state can freely access, but as something that inherently belongs to us.”

It’s probably not possible to turn back the clock on many digital technologies – and in many cases, people wouldn’t want to. But it’s high time more lawmakers took the risks these technologies create seriously and decided that fighting back is worth it.

  • In many ways, governments around the world are moving backward on privacy, thanks to the rise of online age restrictions. In the United States, the Supreme Court has already authorized age verification for sites with a significant volume of adult content. Now several states have passed laws requiring it for virtually every app on your phone, a policy the Supreme Court appears likely to consider this year.
  • Virtually all issues in technology regulation are intertwined, so tech monopolies also exacerbate privacy concerns by reducing competition and concentrating information in a few places where it can be exploited. (This is another issue that Congress has taken up but failed to act on.) And laws don’t work if the government doesn’t enforce them fairly, so the Trump administration’s era of gangster tech regulation must end.
  • One of the simplest rallying cries for privacy in recent years has been “ban facial recognition” – typically aimed at government and law enforcement use, though there is also a push to limit its private deployment on smart glasses.