On this week’s episode of Based Behavior, I broke down the massive social media addiction trial happening right now in Los Angeles. Meta and YouTube are on trial for engineering addiction in children. Here’s why every parent needs to pay attention.
There’s a trial happening right now in a Los Angeles courtroom that should be front-page news. Meta and YouTube are being sued for deliberately designing their platforms to addict children. And the evidence coming out? It’s damning.
This isn’t one angry parent filing a lawsuit. We’re talking about over 2,200 cases brought by families, school districts, and 42 state attorneys general. TikTok and Snapchat were originally defendants too, but they quietly settled last month before the trial started. When companies settle that quickly, it usually means they don’t want a jury to see the evidence.
The first case being heard involves a 20-year-old woman identified as K.G.M., who started using Instagram and YouTube at age 10. According to the lawsuit, the platforms’ design features – infinite scroll, autoplay videos, “like” buttons, push notifications, beauty filters – created an addiction that contributed to severe anxiety, depression, body dysmorphia, self-harm, and suicidal ideation.
Her lawyers aren’t arguing this was an unfortunate side effect. They’re arguing Meta and YouTube knew exactly what they were doing. That these features were engineered specifically to create addictive behavior. On purpose.
The Testimony Has Been Revealing
Instagram CEO Adam Mosseri took the stand this past Wednesday, and things got uncomfortable fast.
In a 2020 podcast interview, Mosseri said, “There’s such a thing as being addicted to a social media platform.” Pretty clear statement, right?
But under oath, in front of a jury? Suddenly he “misspoke.” He wasn’t “careful with his words.” He insisted there’s a meaningful difference between “clinical addiction” and “problematic use.”
So which is it? Either people can get addicted to Instagram, or they can’t. You can’t admit it in casual conversation and then dismiss it as imprecise language once you’re under oath facing a lawsuit.
The Internal Documents Are Worse
The real story is in the emails that are coming out during discovery.
In 2019 and 2020, Meta’s Vice President of Product Design, Margaret Gould Stewart, pushed internally to ban beauty filters – the ones that let users digitally alter their faces. She wanted them removed because the company’s own research showed they were causing body dysmorphia in young users, particularly teenage girls.
Her concern was overridden by another executive, John Hegeman, who argued that removing the filters would “limit our ability to be competitive in Asian markets.”
Read that again. They had evidence the feature was psychologically harming children. And they kept it anyway because removing it might hurt their market position.
When the beauty filter issue reached Mark Zuckerberg, his response was that he was “concerned about whether we have good enough data that this represents real harm.”
They had data showing harm. Zuckerberg wanted more data before acting. How many kids needed to suffer before the evidence would be “good enough”?
“We’re Basically Pushers”
Perhaps the most damning piece of evidence is an internal email from an Instagram employee describing their own product: “People are binging on IG so much they can’t feel the reward anymore. We’re basically pushers.”
Pushers. As in drug dealers.
And it’s not hyperbole. The lawsuit documents how Meta and YouTube deliberately borrowed design techniques from slot machines and studied neuroscience research on addiction. They hired experts specifically to figure out how to maximize dopamine responses. They engineered these platforms to be as addictive as possible, then deployed them on children’s developing brains.
Meta’s Defense Misses the Point
Meta’s lawyers argue that K.G.M. had pre-existing issues. She came from a difficult home situation and was seeing therapists before she ever created an Instagram account.
Maybe that’s true. But it’s irrelevant.
If you hand a struggling child a platform specifically designed to keep them scrolling for hours, engineered to make them compare themselves to algorithmically curated, impossible beauty standards, programmed to feed them an endless stream of anxiety-inducing content – you’re not neutral. You’re making everything worse.
That’s exactly what the plaintiffs are arguing. And it’s why Meta is so desperate to shift blame to anything other than their product design.
We’ve Seen This Before
In 1994, the CEOs of America’s seven largest tobacco companies testified before Congress. One by one, they were asked under oath whether they believed nicotine was addictive.
One by one, they all said the same thing: “I believe nicotine is not addictive.”
It became one of the most infamous moments in corporate history. Because internal documents later proved they absolutely knew cigarettes were addictive. A 1963 Brown & Williamson memo explicitly stated they were “selling nicotine, an addictive drug.” They had research on addiction. They manipulated nicotine levels to increase dependency. And then they lied about it for decades.
The result? A $250 billion settlement. Complete destruction of the industry’s credibility. Permanent changes to how tobacco is regulated.
The Parallels Are Undeniable
Big Tech is following the exact same playbook Big Tobacco used:
Engineer addiction. Tobacco companies manipulated nicotine levels. Tech companies hire neuroscientists to maximize dopamine hits.
Study the harm. Tobacco companies had internal research on cancer and addiction. Tech companies have internal research on mental health impacts and compulsive behavior.
Hide the evidence. Tobacco companies buried their research. Tech companies keep their research internal and fight transparency.
Deny under oath. Tobacco CEOs lied to Congress. Tech executives are now parsing language in courtrooms about the difference between “addiction” and “problematic use.”
Prioritize profits over safety. Tobacco companies chose sales over health. Tech companies choose engagement metrics and market share over children’s wellbeing.
History is repeating itself. The only question is whether we’ll respond the same way.
This Isn’t Anti-Technology
I’m not arguing we should ban social media or confiscate every teenager’s phone. Technology isn’t inherently evil, and social platforms can have legitimate benefits.
But when companies deliberately engineer addiction, when internal emails show employees comparing themselves to drug pushers, when executives override safety concerns because removing harmful features might hurt their competitive position in certain markets – that crosses a line.
These companies don’t deserve the benefit of the doubt. They deserve accountability.
What’s at Stake
This trial is being watched closely because it’s a bellwether case. The outcome will influence how thousands of similar lawsuits proceed.
If the jury finds Meta and YouTube liable, it could force fundamental changes to how these platforms operate. It could lead to massive financial settlements on the scale of Big Tobacco’s. It could finally give parents and regulators the tools to protect children from predatory design.
If the companies win? It sends a message that engineering addiction in children is legally acceptable as long as you’re a tech company. That internal research showing harm can be ignored as long as it’s profitable to do so.
As a parent, I know which outcome I’m hoping for. And I’ll be watching this trial closely.
The question isn’t whether these platforms can be addictive. The internal documents already prove the companies knew they were. The question is whether we’re going to hold them accountable for what they’ve done with that knowledge.
What do you think? Should social media companies be held liable for engineering addiction in children? Drop your thoughts in the comments.
Listen to this week’s full episode of Based Behavior on Spotify or Apple Podcasts.
