Instagram’s CEO Denies Social Media Addiction in Landmark Trial
In a courtroom showdown that’s gripping Silicon Valley, Instagram CEO Adam Mosseri took the stand Wednesday in Los Angeles to testify in a groundbreaking trial that could reshape how social media platforms operate forever.
Facing intense questioning about whether Instagram knowingly created addictive products that harm young users, Mosseri made a crucial distinction that’s sparking heated debate across the tech industry: “I think it’s important to differentiate between clinical addiction and problematic use.”
The trial centers on a 20-year-old plaintiff identified only as KGM, who alleges Instagram’s addictive design features—including the infamous endless scroll—exacerbated her depression and suicidal thoughts. But this case represents something far bigger than one individual’s story.
The Battle Lines Are Drawn
Hundreds of families and school districts have joined forces to sue Meta, Snap, TikTok, and YouTube, alleging these tech giants knowingly created products designed to hook young users at the expense of their mental health. This isn’t just another class-action lawsuit—it’s a fundamental challenge to how social media companies do business.
Mark Lanier, the plaintiffs’ attorney, didn’t mince words during opening arguments. He called these platforms “digital casinos,” pointing to internal documents that suggest companies targeted children as young as four years old. The comparison to gambling isn’t accidental—it’s the core of the plaintiffs’ argument that these platforms use the same psychological tricks as slot machines.
The Science Behind the Claims
While psychologists don’t officially classify social media addiction as a diagnosis, researchers have documented the harmful consequences of compulsive use among young people. The data is sobering: studies show correlations between heavy social media use and increased rates of anxiety, depression, and suicidal thoughts in teenagers.
What makes this trial particularly explosive are the internal communications that have emerged. In documents presented to the court, Meta researchers described Instagram as “a drug” and referred to themselves as “pushers.” One employee noted that CEO Adam Mosseri “freaked out” when dopamine was mentioned in discussions about teen engagement, yet acknowledged that the biological and psychological effects were “undeniable.”
A Father’s Heartbreaking Story
Among those watching Mosseri’s testimony was John DeMay, whose 17-year-old son Jordan died by suicide in 2022 after being targeted in an online sextortion scam. The perpetrators, using a hacked Instagram account, blackmailed Jordan after he sent nude photographs, demanding $1,000 and threatening to share the images with his friends and family.
DeMay’s presence in the courtroom underscores the human cost behind the legal arguments. “It’s absolutely a win for us already because the testimony is public, the internal documents are public,” he said before Mosseri took the stand. “Now Mr. Mosseri is going to have to go on the stand and try to justify why his company was doing the things they were doing to build products that are so addictive, and continuing to do it even though kids are dying over them.”
The Defense Strategy
Meta’s lawyers are fighting back hard, disputing the science behind social media addiction and arguing that KGM’s mental health issues stemmed from familial abuse rather than Instagram use. They’re also leaning heavily on Section 230 of the Communications Decency Act, which typically shields platforms from liability for user-generated content.
But the plaintiffs have found a clever legal workaround. Instead of focusing on harmful content posted by users, they’re targeting the platforms’ design choices—arguing that the companies’ algorithms and features are themselves defective products that cause harm.
Safety Features or Window Dressing?
Instagram has added safety features aimed at young users in recent years, but a 2025 review by Fairplay, a nonprofit advocating for children’s digital rights, found that “less than one in five are fully functional and two-thirds (64%) are either substantially ineffective or no longer exist.”
This raises uncomfortable questions about whether these features are genuine attempts to protect users or merely PR moves designed to deflect criticism while the core addictive mechanics remain unchanged.
The Stakes Couldn’t Be Higher
If the plaintiffs succeed, the financial implications could be staggering. Hundreds of millions of dollars in potential damages could force these companies to fundamentally redesign their platforms. More importantly, a verdict against the tech giants could open the floodgates for similar lawsuits nationwide.
The timing is particularly sensitive for Meta. Just months ago, Mosseri faced congressional testimony about child safety on Instagram, where he defended the platform’s efforts to protect young users. Now he’s under oath in a trial where internal documents suggest a very different story.
A Novel Legal Strategy
What makes this trial truly groundbreaking is its novel approach to holding tech companies accountable. By focusing on product design rather than content moderation, the plaintiffs have found a way around the legal protections that have shielded social media platforms for years.
The strategy is simple but potentially devastating: if social media platforms are designed to be addictive, and that addiction causes harm, then they’re defective products under consumer protection laws. It’s the same logic that has been used to hold tobacco companies accountable for designing cigarettes to be addictive.
The Industry Watches Closely
Other social media executives are undoubtedly watching this trial with bated breath. With Mosseri's testimony and the internal documents now part of the public record, the case could force a reckoning across the entire industry.
The trial also comes amid growing political pressure. Lawmakers around the world are increasingly worried about social media’s addictive potential, with some calling for regulations similar to those imposed on tobacco and alcohol.
What Happens Next
The trial is expected to last several weeks, with testimony from additional witnesses and experts. The outcome could hinge on whether the jury believes that Instagram’s design features constitute a defective product or whether they view social media use as a matter of personal responsibility.
For parents like John DeMay, the trial represents a last hope for accountability. “Every time we try to get something legislatively done it’s a grind,” he said. “I’ve lost a lot of hope and I know other parents have, too.”
But he sees financial pressure as the key to change. “These companies—when they start getting sued for hundreds of millions of dollars by all these victims for the harms that they’ve been perpetrating on their users for so long—they’re going to be forced to make changes or else they’re going to go broke.”
The courtroom drama unfolding in Los Angeles may well determine whether Silicon Valley's most powerful companies will be forced to fundamentally rethink how they build and operate their platforms. For millions of young users and their families, the outcome will matter far beyond the courtroom.