America desperately needs new privacy laws
The Privacy Crisis in Tech: Why Congress Has Failed to Act and What’s at Stake
In 1973, long before the digital age transformed our lives, the U.S. Department of Health, Education, and Welfare (HEW) issued a groundbreaking report titled “Records, Computers, and the Rights of Citizens.” The report warned that networked computers, then a nascent technology, were “destined to become the principal medium for making, storing, and using records about people.” While these systems could be a “powerful management tool,” the report cautioned that without legal safeguards, they could erode the fundamental human right to privacy—specifically, “control by an individual over the uses made of information about him.”
This foresight was not merely theoretical. In 1974, Congress passed the Privacy Act, one of the first laws aimed at regulating computerized records systems. It limited when government agencies could share information and outlined the rights individuals had to access their own data. Over the decades, additional privacy laws emerged, covering healthcare (HIPAA), children’s online privacy (COPPA), electronic communications (ECPA), and even video rentals (VPPA). Yet, as digital surveillance by governments and private companies exploded in the 21st century, Congress has repeatedly failed to keep pace.
Lawmakers have introduced numerous bills to protect Americans’ privacy, but these efforts have consistently fizzled. Attempts to update the Electronic Communications Privacy Act of 1986 to rein in government spying were thwarted by concerns about compromising police and anti-terrorism operations. Despite bipartisan efforts, Congress has not passed a comprehensive bill governing how private companies collect data or what rights individuals have over their own information. Even targeted proposals, like the Fourth Amendment Is Not for Sale Act, which sought to stop police from bypassing privacy laws by buying information from data brokers, have failed to become law.
Meanwhile, new technologies—from augmented reality glasses to generative artificial intelligence—are creating fresh risks daily. These innovations make it easier than ever to surreptitiously surveil people or encourage the sharing of intimate information with tech platforms. Immigration agents are now harassing citizens identified through data analytics tools and facial recognition. Data breaches at major tech companies are common, and security regulations meant to prevent them are being rolled back. Amazon’s recent Super Bowl ad, which boasted about how your doorbell can become part of a distributed surveillance dragnet, exemplifies how invasive technologies are being normalized.
The stakes are high. Invasions of privacy don’t just risk exposing intimate details about individuals; they shift the balance of power toward those who hold the most data. Algorithmic pricing, for example, allows companies to set individualized prices based on personal information, leading to situations where users are charged different prices for the same item. While some companies, like Instacart, have claimed to end such experiments, the practice remains a stark reminder of how data can be weaponized.
State-level and international regulations have made some progress. The General Data Protection Regulation (GDPR) has governed companies in Europe since 2018, though recent proposals to roll it back have raised concerns. Several U.S. states have passed general privacy frameworks, as well as more specific rules. For instance, Illinois’ biometric privacy law has facilitated lawsuits against Meta, and New York recently mandated algorithmic pricing disclosure. However, privacy advocates warn that many of these rules are inadequate. A 2025 report by the Electronic Privacy Information Center (EPIC) and US PIRG Education Fund graded state consumer privacy bills, with only California and Maryland earning higher than a C.
Despite these challenges, there have been some victories. In 2024, Congress passed the Protecting Americans’ Data from Foreign Adversaries Act (PADFAA), which EPIC’s deputy director Caitriona Fitzgerald calls “the strongest privacy law to be passed at the federal level in recent years.” PADFAA bars data brokers from allowing hostile nations to access sensitive personal information of Americans, and EPIC has used it to file a complaint against Google’s real-time bidding ads system, alleging it broadcast sensitive data indiscriminately.
Yet the overall picture remains bleak. A sense of learned helplessness has taken hold, with companies like Meta pushing the narrative that because an existing technology already erodes privacy, it is unreasonable to object when a new one erodes it even further. Internal documents suggest Meta believes the Trump administration’s public flouting of civil liberties will keep activists distracted, allowing it to push invasive features like facial recognition into its products.
The Trump administration’s actions have made the dangers of these systems harder to ignore. It’s one thing to know the government could look up personal information about you; it’s another to have ICE agents intimidate you by dropping your name. While not all privacy nightmares have easy regulatory solutions, privacy groups have long advocated for clear steps forward. A coalition including EPIC, PIRG, and others has called for creating a new independent federal Data Protection Agency, as well as a private right of action that would let individuals sue over privacy violations. The Data Justice Act, proposed by scholars at NYU Law, aims to limit state collection and use of our deep digital footprints, redefining personal data “not as information the state may freely access, but as something inherently ours.”
There’s likely no turning back the clock on many digital technologies, nor would people want to. But it’s past time for more lawmakers to take the risks these technologies create seriously and decide it’s worth fighting back.