Meta Lied About Its Smart Glasses Protecting User Privacy, New Class Action Lawsuit Claims

Meta Faces Class-Action Lawsuit Over “Pervert Glasses” Privacy Scandal

In a revelation that has sent shockwaves through the tech world, Meta is now facing a massive class-action lawsuit over claims that its popular Ray-Ban smart glasses may have been secretly streaming intimate footage from users’ homes to human contractors thousands of miles away in Kenya.

The controversy erupted following an explosive investigation by Swedish newspapers Svenska Dagbladet and Göteborgs-Posten, which uncovered that Meta’s subcontracted data annotators in Nairobi were reportedly watching users through their glasses’ cameras during deeply private moments—including while people used the bathroom or engaged in sexual activity.

Seven Million Glasses, One Massive Privacy Nightmare

Meta’s smart glasses have been flying off shelves, with an estimated seven million units sold in 2025 alone. The company had positioned these devices as the future of wearable tech, emphasizing privacy as a core selling point. Marketing materials boldly claimed the glasses were “designed for privacy, controlled by you” and “built for your privacy.”

But according to the lawsuit filed Thursday in San Francisco federal court by the Clarkson Law Firm, these privacy promises were nothing more than a carefully crafted illusion.

“No reasonable consumer would understand ‘designed for privacy, controlled by you’ and similar promises like ‘built for privacy’ to mean that deeply personal footage from inside their homes would be viewed and catalogued by human workers overseas,” the lawsuit states.

The Hidden Reality Behind AI Training

What Meta failed to prominently disclose was that using the glasses’ core AI features requires authorizing human contractors to review the resulting footage. This isn’t optional—it’s built into the system’s architecture. The company’s spokesperson told Engadget that “unless users choose to share media they’ve captured with Meta or others, that media stays on the user’s device,” but this statement carefully sidesteps the fundamental issue: you cannot use the AI features without human review.

The lawsuit argues that this undisclosed human review pipeline “renders the Meta AI Glasses’ privacy features materially misleading, transforms the product from a personal device into a surveillance conduit, and exposes consumers to unreasonable risks of dignitary harm, emotional distress, stalking, extortion, identity theft, and reputational injury.”

A System Working “Exactly as Designed”

“This is not a technicality or an oversight — that is a system working exactly as designed, and it cannot be allowed to continue,” said Ryan Clarkson, managing partner at Clarkson Law Firm. “While the multi-trillion dollar tech titan attempted to reassure and placate consumers about these smart glasses through ads about privacy and control, workers thousands of miles away have been watching footage from inside people’s bedrooms all along.”

The lawsuit seeks to hold Meta accountable for “affirmatively false advertising and failure to disclose the true nature of surveillance and its connection to the company’s AI data collection pipeline.”

The Internet Strikes Back

In the court of public opinion, the damage has already been done. Social media users have begun referring to Meta’s smart glasses as “pervert glasses,” a nickname that perfectly captures the public’s outrage and disgust at the revelations.

The case highlights a broader issue within the AI industry: the heavy reliance on overseas labor for data labeling and annotation. This hidden workforce—often located in developing countries—performs the tedious but crucial work of training AI models by reviewing and categorizing vast amounts of data, including sensitive personal content.

What This Means for the Future of Wearable Tech

This scandal raises serious questions about the future of wearable technology and AI-powered devices. If one of the world’s largest tech companies can so dramatically mislead consumers about privacy, what does this mean for smaller companies and emerging technologies?

The lawsuit could potentially force Meta to fundamentally redesign its smart glasses or face significant financial penalties. More importantly, it may serve as a wake-up call for the entire tech industry about the importance of transparency in AI development and the ethical treatment of data annotators.

As this case moves through the courts, the era of tech companies making bold privacy claims without accountability may be coming to an end. Consumers are increasingly aware of the value of their personal data, and they’re no longer willing to accept vague promises about privacy that crumble under scrutiny.

The “pervert glasses” scandal may well become a defining moment in the ongoing debate about privacy, AI, and corporate responsibility in the digital age.

Tags: Meta lawsuit, smart glasses privacy, Ray-Ban Meta glasses, AI data labeling, Kenya contractors, wearable tech scandal, privacy deception, class action lawsuit, Meta AI controversy, surveillance technology, data annotators, tech ethics, consumer privacy, AI training data
